Sample records for computer simulation reveals

  1. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
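
    The sample-then-regress loop that a ROM builder like REVEAL automates can be sketched in miniature. Everything below is illustrative: `expensive_simulation` is a hypothetical stand-in for an HPC simulation, and the quadratic surrogate is just one of many possible regression choices, not the toolset's actual method.

```python
import math
import random

# Hypothetical stand-in for an expensive, high-fidelity simulation.
def expensive_simulation(x):
    return math.exp(-x) * math.sin(3.0 * x)

def fit_quadratic(xs, ys):
    """Least-squares fit of y ~ w0 + w1*x + w2*x^2 via the normal equations."""
    s = [sum(x ** k for x in xs) for k in range(5)]                  # moment sums
    t = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[s[0], s[1], s[2]],
         [s[1], s[2], s[3]],
         [s[2], s[3], s[4]]]
    b = list(t)
    for i in range(3):                       # forward elimination
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            b[j] -= f * b[i]
    w = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                      # back substitution
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, 3))) / A[i][i]
    return w

random.seed(0)
xs = [random.uniform(0.0, 1.0) for _ in range(20)]   # sampling design
ys = [expensive_simulation(x) for x in xs]           # expensive runs
w0, w1, w2 = fit_quadratic(xs, ys)                   # regression step
rom = lambda x: w0 + w1 * x + w2 * x * x             # the reduced-order model
# quantify ROM accuracy at a few held-out points
err = max(abs(rom(x) - expensive_simulation(x)) for x in (0.1, 0.5, 0.9))
print(f"max held-out error: {err:.3f}")
```

    In an iterative workflow, a large held-out error would trigger another round of sampling and refitting, which is the improvement loop the abstract describes.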

  2. Cloud-based simulations on Google Exacycle reveal ligand modulation of GPCR activation pathways

    NASA Astrophysics Data System (ADS)

    Kohlhoff, Kai J.; Shukla, Diwakar; Lawrenz, Morgan; Bowman, Gregory R.; Konerding, David E.; Belov, Dan; Altman, Russ B.; Pande, Vijay S.

    2014-01-01

    Simulations can provide tremendous insight into the atomistic details of biological mechanisms, but micro- to millisecond timescales are historically only accessible on dedicated supercomputers. We demonstrate that cloud computing is a viable alternative that brings long-timescale processes within reach of a broader community. We used Google's Exacycle cloud-computing platform to simulate two milliseconds of dynamics of a major drug target, the G-protein-coupled receptor β2AR. Markov state models aggregate independent simulations into a single statistical model that is validated by previous computational and experimental results. Moreover, our models provide an atomistic description of the activation of a G-protein-coupled receptor and reveal multiple activation pathways. Agonists and inverse agonists interact differentially with these pathways, with profound implications for drug design.
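
    The aggregation step described above, pooling many independent short simulations into one statistical model, can be illustrated with a toy Markov state model. The three states and transition probabilities below are invented for illustration and are not taken from the β2AR study.

```python
import random

random.seed(1)
STATES = 3   # e.g. inactive / intermediate / active (purely illustrative)

# Hypothetical ground-truth jump probabilities, used only to fake trajectories.
TRUE_P = [[0.90, 0.10, 0.00],
          [0.05, 0.90, 0.05],
          [0.00, 0.10, 0.90]]

def short_trajectory(start, steps=200):
    """One independent 'simulation': a short discrete-state trajectory."""
    s, traj = start, [start]
    for _ in range(steps):
        r, acc = random.random(), 0.0
        for nxt, p in enumerate(TRUE_P[s]):
            acc += p
            if r < acc:
                s = nxt
                break
        traj.append(s)
    return traj

# Aggregate many independent short trajectories into one count matrix ...
counts = [[0] * STATES for _ in range(STATES)]
for start in range(STATES):
    for _ in range(50):
        t = short_trajectory(start)
        for a, b in zip(t, t[1:]):
            counts[a][b] += 1

# ... row-normalize into an estimated transition matrix ...
P = [[c / sum(row) for c in row] for row in counts]

# ... and extract the stationary (equilibrium) distribution by power iteration.
pi = [1.0 / STATES] * STATES
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(STATES)) for j in range(STATES)]
print("stationary distribution:", [round(p, 2) for p in pi])
```

    No single trajectory here is long enough to explore all states thoroughly, yet the pooled model recovers the equilibrium populations, which is the key property that lets cloud-distributed short runs substitute for one long supercomputer run.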

  3. Software for Brain Network Simulations: A Comparative Study

    PubMed Central

    Tikidji-Hamburyan, Ruben A.; Narayana, Vikram; Bozkus, Zeki; El-Ghazawi, Tarek A.

    2017-01-01

    Numerical simulations of brain networks are a critical part of our efforts in understanding brain functions under pathological and normal conditions. For several decades, the community has developed many software packages and simulators to accelerate research in computational neuroscience. In this article, we select the three most popular simulators, as determined by the number of models in the ModelDB database, namely NEURON, GENESIS, and BRIAN, and perform an independent evaluation of these simulators. In addition, we study NEST, one of the leading simulators of the Human Brain Project. First, we study them based on one of the most important characteristics, the range of supported models. Our investigation reveals that brain network simulators may be biased toward supporting a specific set of models. However, all simulators tend to expand the supported range of models by providing a universal environment for the computational study of individual neurons and brain networks. Next, our investigations on the characteristics of computational architecture and efficiency indicate that all simulators compile the most computationally intensive procedures into binary code, with the aim of maximizing their computational performance. However, not all simulators provide the simplest method for module development and/or guarantee efficient binary code. Third, a study of their amenability for high-performance computing reveals that NEST can almost transparently map an existing model on a cluster or multicore computer, while NEURON requires code modification if the model developed for a single computer has to be mapped on a computational cluster. Interestingly, parallelization is the weakest characteristic of BRIAN, which provides no support for cluster computations and limited support for multicore computers. Fourth, we identify the level of user support and frequency of usage for all simulators.
Finally, we carry out an evaluation using two case studies: a large network with simplified neural and synaptic models and a small network with detailed models. These two case studies allow us to avoid any bias toward a particular software package. The results indicate that BRIAN provides the most concise language for both cases considered. Furthermore, as expected, NEST mostly favors large network models, while NEURON is better suited for detailed models. Overall, the case studies reinforce our general observation that simulators have a bias in the computational performance toward specific types of the brain network models. PMID:28775687

  4. Computer simulation for integrated pest management of spruce budworms

    Treesearch

    Carroll B. Williams; Patrick J. Shea

    1982-01-01

    Some field studies of the effects of various insecticides on the spruce budworm (Choristoneura sp.) and their parasites have shown severe suppression of host (budworm) populations and increased parasitism after treatment. Computer simulation using hypothetical models of spruce budworm-parasite systems based on these field data revealed that (1)...

  5. In vitro protease cleavage and computer simulations reveal the HIV-1 capsid maturation pathway

    NASA Astrophysics Data System (ADS)

    Ning, Jiying; Erdemci-Tandogan, Gonca; Yufenyuy, Ernest L.; Wagner, Jef; Himes, Benjamin A.; Zhao, Gongpu; Aiken, Christopher; Zandi, Roya; Zhang, Peijun

    2016-12-01

    HIV-1 virions assemble as immature particles containing Gag polyproteins that are processed by the viral protease into individual components, resulting in the formation of mature infectious particles. There are two competing models for the process of forming the mature HIV-1 core: the disassembly and de novo reassembly model and the non-diffusional displacive model. To study the maturation pathway, we simulate HIV-1 maturation in vitro by digesting immature particles and assembled virus-like particles with recombinant HIV-1 protease and monitor the process with biochemical assays and cryoEM structural analysis in parallel. Processing of Gag in vitro is accurate and efficient and results in both soluble capsid protein and conical or tubular capsid assemblies, seemingly converted from immature Gag particles. Computer simulations further reveal probable assembly pathways of HIV-1 capsid formation. Combining the experimental data and computer simulations, our results suggest a sequential combination of both displacive and disassembly/reassembly processes for HIV-1 maturation.

  6. The Exponential Expansion of Simulation: How Simulation has Grown as a Research Tool

    DTIC Science & Technology

    2012-09-01

    exponential growth of computing power. Although other analytic approaches also benefit from this trend, keyword searches of several scholarly search engines reveal that the reliance on simulation is increasing more rapidly. A descriptive analysis paints a compelling picture: simulation is frequently

  7. Predictive simulation of gait at low gravity reveals skipping as the preferred locomotion strategy

    PubMed Central

    Ackermann, Marko; van den Bogert, Antonie J.

    2012-01-01

    The investigation of gait strategies at low gravity environments gained momentum recently as manned missions to the Moon and to Mars are reconsidered. Although reports by astronauts of the Apollo missions indicate alternative gait strategies might be favored on the Moon, computational simulations and experimental investigations have been almost exclusively limited to the study of either walking or running, the locomotion modes preferred under Earth's gravity. In order to investigate the gait strategies likely to be favored at low gravity a series of predictive, computational simulations of gait are performed using a physiological model of the musculoskeletal system, without assuming any particular type of gait. A computationally efficient optimization strategy is utilized allowing for multiple simulations. The results reveal skipping as more efficient and less fatiguing than walking or running and suggest the existence of a walk-skip rather than a walk-run transition at low gravity. The results are expected to serve as a background to the design of experimental investigations of gait under simulated low gravity. PMID:22365845

  8. Predictive simulation of gait at low gravity reveals skipping as the preferred locomotion strategy.

    PubMed

    Ackermann, Marko; van den Bogert, Antonie J

    2012-04-30

    The investigation of gait strategies at low gravity environments gained momentum recently as manned missions to the Moon and to Mars are reconsidered. Although reports by astronauts of the Apollo missions indicate alternative gait strategies might be favored on the Moon, computational simulations and experimental investigations have been almost exclusively limited to the study of either walking or running, the locomotion modes preferred under Earth's gravity. In order to investigate the gait strategies likely to be favored at low gravity a series of predictive, computational simulations of gait are performed using a physiological model of the musculoskeletal system, without assuming any particular type of gait. A computationally efficient optimization strategy is utilized allowing for multiple simulations. The results reveal skipping as more efficient and less fatiguing than walking or running and suggest the existence of a walk-skip rather than a walk-run transition at low gravity. The results are expected to serve as a background to the design of experimental investigations of gait under simulated low gravity. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Understanding Islamist political violence through computational social simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
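
    Of the two techniques the paper compares, system dynamics is the simpler to sketch: a stock integrated forward in time under inflows and outflows. The single-stock model below is purely illustrative; the variable names, parameter values, and causal structure are hypothetical and make no claim about the authors' actual model.

```python
# A single "support" stock with a grievance-driven inflow and a disengagement
# outflow, integrated by forward-Euler steps. All names and numbers here are
# hypothetical illustrations of the system-dynamics technique.
support = 0.10        # initial fraction of population supporting the group
GRIEVANCE = 0.5       # exogenous driver, held constant for simplicity
RECRUIT, DECAY, DT = 0.08, 0.05, 1.0
history = []
for step in range(200):
    inflow = RECRUIT * GRIEVANCE * (1.0 - support)  # recruitment from the rest
    outflow = DECAY * support                       # disengagement
    support += DT * (inflow - outflow)
    history.append(support)
print(f"equilibrium support ~ {history[-1]:.2f}")
```

    An agent-based version of the same question would instead simulate individuals with local interaction rules, trading the aggregate transparency of stock-and-flow equations for heterogeneity and emergent behavior.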

  10. The Exponential Expansion of Simulation in Research

    DTIC Science & Technology

    2012-12-01

    exponential growth of computing power. Although other analytic approaches also benefit from this trend, keyword searches of several scholarly search engines reveal that the reliance on simulation is increasing more rapidly. A descriptive analysis paints a compelling picture: simulation is frequently

  11. Using and Evaluating Resampling Simulations in SPSS and Excel.

    ERIC Educational Resources Information Center

    Smith, Brad

    2003-01-01

    Describes and evaluates three computer-assisted simulations used with Statistical Package for the Social Sciences (SPSS) and Microsoft Excel. Designed the simulations to reinforce and enhance student understanding of sampling distributions, confidence intervals, and significance tests. Reports evaluations revealed improved student comprehension of…

  12. FACE computer simulation. [Flexible Arm Controls Experiment

    NASA Technical Reports Server (NTRS)

    Sadeh, Willy Z.; Szmyd, Jeffrey A.

    1990-01-01

    A computer simulation of the FACE (Flexible Arm Controls Experiment) was conducted to assess its design for use in the Space Shuttle. The FACE is supposed to be a 14-ft long articulated structure with 4 degrees of freedom, consisting of shoulder pitch and yaw, elbow pitch, and wrist pitch. Kinematics of the FACE was simulated to obtain data on arm operation, function, workspace and interaction. Payload capture ability was modeled. The simulation indicates the capability for detailed kinematic simulation and payload capture ability analysis, and the feasibility of real-time simulation was determined. In addition, the potential for interactive real-time training through integration of the simulation with various interface controllers was revealed. At this stage, the flexibility of the arm was not yet considered.

  13. Computational Modeling and Treatment Identification in the Myelodysplastic Syndromes.

    PubMed

    Drusbosky, Leylah M; Cogle, Christopher R

    2017-10-01

    This review discusses the need for computational modeling in myelodysplastic syndromes (MDS) and early test results. As our evolving understanding of MDS reveals a molecularly complicated disease, the need for sophisticated computer analytics is required to keep track of the number and complex interplay among the molecular abnormalities. Computational modeling and digital drug simulations using whole exome sequencing data input have produced early results showing high accuracy in predicting treatment response to standard of care drugs. Furthermore, the computational MDS models serve as clinically relevant MDS cell lines for pre-clinical assays of investigational agents. MDS is an ideal disease for computational modeling and digital drug simulations. Current research is focused on establishing the prediction value of computational modeling. Future research will test the clinical advantage of computer-informed therapy in MDS.

  14. Study of extracerebral contamination for three cerebral oximeters by Monte Carlo simulation using CT data

    NASA Astrophysics Data System (ADS)

    Tarasov, A. P.; Egorov, A. I.; Rogatkin, D. A.

    2017-07-01

    Using multidetector computed tomography, thicknesses of bone squame and soft tissues of human head were assessed. MC simulation revealed impropriety of source-detector separation distances for 3 oximeters, which can cause extracerebral contamination.

  15. A computational modeling of semantic knowledge in reading comprehension: Integrating the landscape model with latent semantic analysis.

    PubMed

    Yeari, Menahem; van den Broek, Paul

    2016-09-01

    It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.

  16. Atmospheric simulation using a liquid crystal wavefront-controlling device

    NASA Astrophysics Data System (ADS)

    Brooks, Matthew R.; Goda, Matthew E.

    2004-10-01

    Test and evaluation of laser warning devices is important due to the increased use of laser devices in aerial applications. This research consists of an atmospheric aberrating system to enable in-lab testing of various detectors and sensors. This system employs laser light at 632.8nm from a Helium-Neon source and a spatial light modulator (SLM) to cause phase changes using a birefringent liquid crystal material. Measuring outgoing radiation from the SLM using a CCD targetboard and Shack-Hartmann wavefront sensor reveals an acceptable resemblance of system output to expected atmospheric theory. Over three turbulence scenarios, an error analysis reveals that turbulence data matches theory. A wave optics computer simulation is created analogous to the lab-bench design. Phase data, intensity data, and a computer simulation affirm lab-bench results so that the aberrating SLM system can be operated confidently.

  17. Development of a computer-simulation model for a plant-nematode system.

    PubMed

    Ferris, H

    1976-07-01

    A computer-simulation model (MELSIM) of a Meloidogyne-grapevine system is developed. The objective is to attempt a holistic approach to the study of nematode population dynamics by using experimental data from controlled environmental conditions. A simulator with predictive ability would be useful in considering pest management alternatives and in teaching. Rates of flow and interaction between the components of the system are governed by environmental conditions. Equations for these rates are determined by fitting curves to data from controlled environment studies. Development of the model and trial simulations have revealed deficiencies in understanding of the system and identified areas where further research is necessary.

  18. An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process

    NASA Technical Reports Server (NTRS)

    Carter, M. C.; Madison, M. W.

    1973-01-01

    The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary processes involved in this analysis are computer simulation and statistical estimation. Computer simulation is used to simulate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given an autocorrelation function and the mean and variance for the number of overshoots, a frequency distribution for overshoots can be estimated.
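
    The simulate-and-count procedure the abstract describes can be sketched with a stationary Gaussian AR(1) process, whose autocorrelation function is φ^k. The parameters below are illustrative, not those used in the report.

```python
import math
import random

def ar1_gaussian(n, phi, seed):
    """Stationary Gaussian AR(1) process; autocorrelation at lag k is phi**k."""
    rng = random.Random(seed)
    x = [rng.gauss(0.0, 1.0)]
    sigma_e = math.sqrt(1.0 - phi * phi)   # keeps the marginal variance at 1
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, sigma_e))
    return x

def count_overshoots(x, level):
    """Count upcrossings: each begins one excursion (overshoot) above the level."""
    return sum(1 for a, b in zip(x, x[1:]) if a <= level < b)

x = ar1_gaussian(100_000, phi=0.8, seed=42)
counts = {level: count_overshoots(x, level) for level in (0.0, 1.0, 2.0)}
for level in (0.0, 1.0, 2.0):
    print(f"level {level}: {counts[level]} overshoots")
```

    Tabulating such counts (and excursion durations) across crossing levels and autocorrelation parameters is what yields the empirical frequency distribution studied in the report.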

  19. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.

  20. Relationship of the interplanetary electric field to the high-latitude ionospheric electric field and currents: Observations and model simulation

    NASA Technical Reports Server (NTRS)

    Clauer, C. R.; Banks, P. M.

    1986-01-01

    The electrical coupling between the solar wind, magnetosphere, and ionosphere is studied. The coupling is analyzed using observations of high-latitude ion convection measured by the Sondre Stromfjord radar in Greenland and a computer simulation. The computer simulation calculates the ionospheric electric potential distribution for a given configuration of field-aligned currents and conductivity distribution. The technique for measuring F-region ion velocities at high time resolution over a large range of latitudes is described. The effects of variations in the currents on ionospheric plasma convection are examined using a model of field-aligned currents linking the solar wind with the dayside, high-latitude ionosphere. The data reveal that high-latitude ionospheric convection patterns, electric fields, and field-aligned currents are dependent on IMF orientation; it is observed that the electric field, which drives the F-region plasma convection, responds within about 14 minutes to IMF variations at the magnetopause. Comparisons of the simulated plasma convection with the ion velocity measurements reveal good correlation.

  1. An empirical analysis of the distribution of the duration of overshoots in a stationary Gaussian stochastic process

    NASA Technical Reports Server (NTRS)

    Parrish, R. S.; Carter, M. C.

    1974-01-01

    This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions are computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and crossing level. Using predicted values for the mean and standard deviation, the distribution parameters were estimated by the method of moments. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time was calculated.

  2. CFD: A Castle in the Sand?

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Wood, William A.

    2004-01-01

    The computational simulation community is not routinely publishing independently verifiable tests to accompany new models or algorithms. A survey reveals that only 22% of new models published are accompanied by tests suitable for independently verifying the new model. As the community develops larger codes with increased functionality, and hence increased complexity in terms of the number of building block components and their interactions, it becomes prohibitively expensive for each development group to derive the appropriate tests for each component. Therefore, the computational simulation community is building its collective castle on a very shaky foundation of components with unpublished and unrepeatable verification tests. The computational simulation community needs to begin publishing component level verification tests before the tide of complexity undermines its foundation.

  3. Fractional charge revealed in computer simulations of resonant tunneling in the fractional quantum Hall regime.

    PubMed

    Tsiper, E V

    2006-08-18

    The concept of fractional charge is central to the theory of the fractional quantum Hall effect. Here I use exact diagonalization as well as configuration space renormalization to study finite clusters which are large enough to contain two independent edges. I analyze the conditions of resonant tunneling between the two edges. The "computer experiment" reveals a periodic sequence of resonant tunneling events consistent with the experimentally observed fractional quantization of electric charge in units of e/3 and e/5.

  4. Interval sampling methods and measurement error: a computer simulation.

    PubMed

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
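
    The three interval sampling methods compared in this study are simple to state, and a toy version of the simulation makes their inherent biases visible. The behavior stream below is fabricated for illustration; the inequalities PIR ≥ true proportion ≥ WIR, however, hold by construction for any stream.

```python
import random

random.seed(7)
PERIOD = 600      # observation period, in seconds
INTERVAL = 10     # interval duration, in seconds

# Fabricate a behavior stream: alternating off/on bouts of random duration.
stream, state, t = [], False, 0
while t < PERIOD:
    dur = random.randint(3, 30)
    stream.extend([state] * dur)
    state, t = not state, t + dur
stream = stream[:PERIOD]
true_prop = sum(stream) / PERIOD          # true proportion of time occupied

chunks = [stream[i:i + INTERVAL] for i in range(0, PERIOD, INTERVAL)]
mts = sum(c[-1] for c in chunks) / len(chunks)    # momentary time sampling
pir = sum(any(c) for c in chunks) / len(chunks)   # partial-interval recording
wir = sum(all(c) for c in chunks) / len(chunks)   # whole-interval recording
print(f"true={true_prop:.2f}  MTS={mts:.2f}  PIR={pir:.2f}  WIR={wir:.2f}")
```

    Repeating this over many random streams and varying the interval, event, and period durations yields error tables of the kind the study reports: partial-interval recording systematically overestimates, whole-interval recording underestimates, and momentary time sampling is unbiased but noisy.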

  5. Computational Investigation of Fluidic Counterflow Thrust Vectoring

    NASA Technical Reports Server (NTRS)

    Hunter, Craig A.; Deere, Karen A.

    1999-01-01

    A computational study of fluidic counterflow thrust vectoring has been conducted. Two-dimensional numerical simulations were run using the computational fluid dynamics code PAB3D with two-equation turbulence closure and linear Reynolds stress modeling. For validation, computational results were compared to experimental data obtained at the NASA Langley Jet Exit Test Facility. In general, computational results were in good agreement with experimental performance data, indicating that efficient thrust vectoring can be obtained with low secondary flow requirements (less than 1% of the primary flow). An examination of the computational flowfield has revealed new details about the generation of a countercurrent shear layer, its relation to secondary suction, and its role in thrust vectoring. In addition to providing new information about the physics of counterflow thrust vectoring, this work appears to be the first documented attempt to simulate the counterflow thrust vectoring problem using computational fluid dynamics.

  6. Application of CT-PSF-based computer-simulated lung nodules for evaluating the accuracy of computer-aided volumetry.

    PubMed

    Funaki, Ayumu; Ohkubo, Masaki; Wada, Shinichi; Murao, Kohei; Matsumoto, Toru; Niizuma, Shinji

    2012-07-01

    With the wide dissemination of computed tomography (CT) screening for lung cancer, measuring the nodule volume accurately with computer-aided volumetry software is increasingly important. Many studies for determining the accuracy of volumetry software have been performed using a phantom with artificial nodules. These phantom studies are limited, however, in their ability to reproduce the nodules both accurately and in the variety of sizes and densities required. Therefore, we propose a new approach of using computer-simulated nodules based on the point spread function measured in a CT system. The validity of the proposed method was confirmed by the excellent agreement obtained between computer-simulated nodules and phantom nodules regarding the volume measurements. A practical clinical evaluation of the accuracy of volumetry software was achieved by adding simulated nodules onto clinical lung images, including noise and artifacts. The tested volumetry software was revealed to be accurate to within an error of 20% for nodules >5 mm with a nodule-to-background (lung) CT-value difference of 400-600 HU. Such a detailed analysis can provide clinically useful information on the use of volumetry software in CT screening for lung cancer. We concluded that the proposed method is effective for evaluating the performance of computer-aided volumetry software.

  7. Examining Reactions to Employer Information Using a Simulated Web-Based Job Fair

    ERIC Educational Resources Information Center

    Highhouse, Scott; Stanton, Jeffrey M.; Reeve, Charlie L.

    2004-01-01

    The approach taken in the present investigation was to examine reactions to positive and negative employer information by eliciting online (i.e., moment-to-moment) reactions in a simulated computer-based job fair. Reactions to positive and negative information commonly reveal a negatively biased asymmetry. Positively biased asymmetries have been…

  8. Teaching Markov Chain Monte Carlo: Revealing the Basic Ideas behind the Algorithm

    ERIC Educational Resources Information Center

    Stewart, Wayne; Stewart, Sepideh

    2014-01-01

    For many scientists, researchers and students Markov chain Monte Carlo (MCMC) simulation is an important and necessary tool to perform Bayesian analyses. The simulation is often presented as a mathematical algorithm and then translated into an appropriate computer program. However, this can result in overlooking the fundamental and deeper…

  9. Heterogeneity in homogeneous nucleation from billion-atom molecular dynamics simulation of solidification of pure metal.

    PubMed

    Shibuta, Yasushi; Sakane, Shinji; Miyoshi, Eisuke; Okita, Shin; Takaki, Tomohiro; Ohno, Munekazu

    2017-04-05

    Can completely homogeneous nucleation occur? Large scale molecular dynamics simulations performed on a graphics-processing-unit rich supercomputer can shed light on this long-standing issue. Here, a billion-atom molecular dynamics simulation of homogeneous nucleation from an undercooled iron melt reveals that some satellite-like small grains surrounding previously formed large grains exist in the middle of the nucleation process, which are not distributed uniformly. At the same time, grains with a twin boundary are formed by heterogeneous nucleation from the surface of the previously formed grains. The local heterogeneity in the distribution of grains is caused by the local accumulation of the icosahedral structure in the undercooled melt near the previously formed grains. This insight is mainly attributable to the multi-graphics processing unit parallel computation combined with the rapid progress in high-performance computational environments. Nucleation is a fundamental physical process; however, it is a long-standing issue whether completely homogeneous nucleation can occur. Here the authors reveal, via a billion-atom molecular dynamics simulation, that local heterogeneity exists during homogeneous nucleation in an undercooled iron melt.

  10. Molecular dynamics simulations of membrane proteins and their interactions: from nanoscale to mesoscale.

    PubMed

    Chavent, Matthieu; Duncan, Anna L; Sansom, Mark Sp

    2016-10-01

    Molecular dynamics simulations provide a computational tool to probe membrane proteins and systems at length scales ranging from nanometers to close to a micrometer, and on microsecond timescales. All atom and coarse-grained simulations may be used to explore in detail the interactions of membrane proteins and specific lipids, yielding predictions of lipid binding sites in good agreement with available structural data. Building on the success of protein-lipid interaction simulations, larger scale simulations reveal crowding and clustering of proteins, resulting in slow and anomalous diffusional dynamics, within realistic models of cell membranes. Current methods allow near atomic resolution simulations of small membrane organelles, and of enveloped viruses to be performed, revealing key aspects of their structure and functionally important dynamics. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  11. Self-assembly of micelles in organic solutions of lecithin and bile salt: Mesoscale computer simulation

    NASA Astrophysics Data System (ADS)

    Markina, A.; Ivanov, V.; Komarov, P.; Khokhlov, A.; Tung, S.-H.

    2016-11-01

    We propose a coarse-grained model for studying the effects of adding bile salt to lecithin organosols by means of computer simulation. This model allows us to reveal the mechanism behind the experimentally observed increase in viscosity with increasing bile salt concentration. We show that increasing the bile salt to lecithin molar ratio induces the growth of elongated micelles of ellipsoidal and cylindrical shape due to the incorporation of disklike bile salt molecules. These wormlike micelles can entangle into a transient network displaying perceptible viscoelastic properties.

  12. Simulation of biochemical reactions with time-dependent rates by the rejection-based algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu; Priami, Corrado, E-mail: priami@cosbi.eu; Department of Mathematics, University of Trento, Trento

    We address the problem of simulating biochemical reaction networks with time-dependent rates and propose a new algorithm based on our rejection-based stochastic simulation algorithm (RSSA) [Thanh et al., J. Chem. Phys. 141(13), 134116 (2014)]. The selection of next reaction firings by our time-dependent RSSA (tRSSA) is computationally efficient, and the generated trajectory is exact by virtue of the rejection-based mechanism. We benchmark tRSSA on different biological systems with varying forms of reaction rates to demonstrate its applicability and efficiency. We reveal that, for nontrivial cases, the selection of reaction firings in existing algorithms introduces approximations, because the integration of reaction rates is computationally demanding and simplifying assumptions are introduced. The selection of the next reaction firing by our approach is easier while preserving exactness.
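The exactness of the rejection-based selection rests on the same thinning idea used for non-homogeneous Poisson processes: propose candidate firing times from an upper-bounding constant rate and accept each with probability a(t)/a_max. A minimal single-reaction sketch in Python (the oscillating propensity and all parameter values are illustrative, not those of the paper's benchmarks):

```python
import math
import random

def thinning_firing_time(a, a_max, t0, rng):
    """Sample the next firing time of a reaction whose time-dependent
    propensity a(t) is bounded above by a_max, via rejection (thinning):
    propose candidates from a Poisson process of rate a_max and accept
    each with probability a(t)/a_max."""
    t = t0
    while True:
        t += rng.expovariate(a_max)          # candidate firing time
        if rng.random() * a_max <= a(t):     # accept with prob a(t)/a_max
            return t

# Toy example: decay X -> 0 with oscillating rate k(t) = k0*(1 + 0.5*sin(t)).
rng = random.Random(1)
k0, x, t = 1.0, 50, 0.0
while x > 0:
    a = lambda s, x=x: k0 * (1.0 + 0.5 * math.sin(s)) * x
    a_max = 1.5 * k0 * x                     # exact upper bound for this rate
    t = thinning_firing_time(a, a_max, t, rng)
    x -= 1                                   # one decay event fires
print(round(t, 2), x)
```

Because the bound holds on the whole inter-event interval (the population is constant between firings), no integration of the rate is needed, which is the point the abstract makes about avoiding costly integrals.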

  13. Large-eddy simulations of compressible convection on massively parallel computers. [stellar physics

    NASA Technical Reports Server (NTRS)

    Xie, Xin; Toomre, Juri

    1993-01-01

    We report preliminary implementation of the large-eddy simulation (LES) technique in 2D simulations of compressible convection carried out on the CM-2 massively parallel computer. The convective flow fields in our simulations possess structures similar to those found in a number of direct simulations, with roll-like flows coherent across the entire depth of the layer that spans several density scale heights. Our detailed assessment of the effects of various subgrid scale (SGS) terms reveals that they may affect the gross character of convection. Yet, somewhat surprisingly, we find that our LES solutions, and another in which the SGS terms are turned off, only show modest differences. The resulting 2D flows realized here are rather laminar in character, and achieving substantial turbulence may require stronger forcing and less dissipation.

  14. Contextuality and Wigner-function negativity in qubit quantum computation

    NASA Astrophysics Data System (ADS)

    Raussendorf, Robert; Browne, Dan E.; Delfosse, Nicolas; Okay, Cihan; Bermejo-Vega, Juan

    2017-05-01

    We describe schemes of quantum computation with magic states on qubits for which contextuality and negativity of the Wigner function are necessary resources possessed by the magic states. These schemes satisfy a constraint. Namely, the non-negativity of Wigner functions must be preserved under all available measurement operations. Furthermore, we identify stringent consistency conditions on such computational schemes, revealing the general structure by which negativity of Wigner functions, hardness of classical simulation of the computation, and contextuality are connected.

  15. Effects on Training Using Illumination in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Novak, M. S. Jennifer; Mueller, Kristian

    1999-01-01

    Camera-based tasks are commonly performed during orbital operations, and orbital lighting conditions, such as high-contrast shadowing and glare, are a factor in performance. Computer-based training using virtual environments is a common tool used to make and keep crew members proficient. If computer-based training included some of these harsh lighting conditions, would the crew increase their proficiency? The project goal was to determine whether computer-based training increases proficiency if one trains for a camera-based task using computer-generated virtual environments with enhanced lighting conditions, such as shadows and glare, rather than the color-shaded computer images normally used in simulators. Previous experiments were conducted using a two-degree-of-freedom docking system: test subjects had to align a boresight camera using a hand controller with one axis of translation and one axis of rotation. Two sets of subjects were trained on two computer simulations using computer-generated virtual environments, one with simulated lighting and one without. Results revealed that when subjects were constrained by time and accuracy, those who trained with simulated lighting conditions performed significantly better than those who did not. To reinforce these results for speed and accuracy, the task complexity was increased.

  16. Computational study of noise in a large signal transduction network.

    PubMed

    Intosalmi, Jukka; Manninen, Tiina; Ruohonen, Keijo; Linne, Marja-Leena

    2011-06-21

    Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies. © 2011 Intosalmi et al; licensee BioMed Central Ltd.
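The volume dependence reported here can be reproduced at toy scale: an exact Gillespie simulation of a simple birth-death process in two system volumes shows the relative noise shrinking as the volume grows (roughly as 1/sqrt(V)). A minimal sketch, with an illustrative two-reaction system rather than the paper's signal transduction network:

```python
import random

def gillespie_birth_death(k, g, V, t_end, rng):
    """Exact Gillespie SSA for production (propensity k*V) and
    degradation (propensity g*n); returns copy-number samples."""
    t, n, samples = 0.0, int(k * V / g), []
    while t < t_end:
        a1, a2 = k * V, g * n
        a0 = a1 + a2
        t += rng.expovariate(a0)             # exponential waiting time
        n += 1 if rng.random() * a0 < a1 else -1
        samples.append(n)
    return samples

def cv(xs):
    """Coefficient of variation (std/mean) as a relative-noise measure."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    return var ** 0.5 / m

rng = random.Random(0)
small = gillespie_birth_death(k=10.0, g=1.0, V=1.0, t_end=50.0, rng=rng)
large = gillespie_birth_death(k=10.0, g=1.0, V=100.0, t_end=50.0, rng=rng)
print(cv(small), cv(large))   # relative fluctuations shrink with volume
```

Running the same reaction scheme across many volumes, as the study does in 300 cellular volumes, is then just a loop over V, and the resulting trajectories can be fed to standard frequency-domain analysis.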

  17. Investigation of different modeling approaches for computational fluid dynamics simulation of high-pressure rocket combustors

    NASA Astrophysics Data System (ADS)

    Ivancic, B.; Riedmann, H.; Frey, M.; Knab, O.; Karl, S.; Hannemann, K.

    2016-07-01

    The paper summarizes technical results and first highlights of the cooperation between DLR and Airbus Defence and Space (DS) within the work package "CFD Modeling of Combustion Chamber Processes" conducted in the frame of the Propulsion 2020 Project. Within the addressed work package, DLR Göttingen and Airbus DS Ottobrunn have identified several test cases for which adequate test data are available and which can be used for proper validation of computational fluid dynamics (CFD) tools. In this paper, the first test case, the Penn State chamber (RCM1), is discussed. Simulation results from three different tools show that the test case can be computed properly with steady-state Reynolds-averaged Navier-Stokes (RANS) approaches. The simulations reproduce the measured wall heat flux, an important validation parameter, very well, but also reveal some inconsistencies in the test data, which are addressed in this paper.

  18. Computational Simulation of Thermal and Spattering Phenomena and Microstructure in Selective Laser Melting of Inconel 625

    NASA Astrophysics Data System (ADS)

    Özel, Tuğrul; Arısoy, Yiğit M.; Criales, Luis E.

    Computational modelling of Laser Powder Bed Fusion (L-PBF) processes such as Selective Laser Melting (SLM) can reveal information that is hard to obtain, or unobtainable, by in-situ experimental measurements. A 3D thermal field that is not visible to a thermal camera can be obtained by solving the 3D heat transfer problem. Furthermore, microstructural modelling can be used to predict the quality and mechanical properties of the product. In this paper, a nonlinear 3D Finite Element Method based computational code is developed to simulate the SLM process with different process parameters such as laser power and scan velocity. The code is further improved by utilizing an in-situ thermal camera recording to predict spattering, which is in turn included as a stochastic heat loss. Then, thermal gradients extracted from the simulations are applied to predict growth directions in the resulting microstructure.

  19. The effectiveness of using computer simulated experiments on junior high students' understanding of the volume displacement concept

    NASA Astrophysics Data System (ADS)

    Choi, Byung-Soon; Gennaro, Eugene

    Several researchers have suggested that the computer holds much promise as a tool for science teachers for use in their classrooms (Bork, 1979; Lunetta & Hofstein, 1981). It also has been said that there needs to be more research in determining the effectiveness of computer software (Tinker, 1983). This study compared the effectiveness of microcomputer simulated experiences with that of parallel instruction involving hands-on laboratory experiences for teaching the concept of volume displacement to junior high school students. This study also assessed the differential effect on students' understanding of the volume displacement concept using sex of the students as another independent variable. In addition, it compared the degree of retention, after 45 days, of both treatment groups. It was found that computer simulated experiences were as effective as hands-on laboratory experiences, and that males, having had hands-on laboratory experiences, performed better on the posttest than females having had the hands-on laboratory experiences. There were no significant differences in performance when comparing males with females using the computer simulation in the learning of the displacement concept. This study also showed that there were no significant differences in the retention levels when the retention scores of the computer simulation groups were compared to those that had the hands-on laboratory experiences. However, an ANOVA of the retention test scores revealed that males in both treatment conditions retained knowledge of volume displacement better than females.

  20. Macrosegregation Resulting from Directional Solidification Through an Abrupt Change in Cross-Sections

    NASA Technical Reports Server (NTRS)

    Lauer, M.; Poirier, D. R.; Ghods, M.; Tewari, S. N.; Grugel, R. N.

    2017-01-01

    Simulations of the directional solidification of two hypoeutectic alloys (Al-7Si and Al-19Cu) and the resulting macrosegregation patterns are presented. The casting geometries include abrupt changes in cross-section, from a larger width of 9.5 mm to a narrower 3.2 mm width, and then through an expansion back to a width of 9.5 mm. The alloys were chosen as model alloys because they have similar solidification shrinkages, but the effect of Cu on changing the density of the liquid alloy is about an order of magnitude greater than that of Si. The simulations compare well with experimental castings that were directionally solidified in a graphite mold in a Bridgman furnace. In addition to the simulations of directional solidification in graphite molds, some simulations were performed for solidification in an alumina mold. This study showed that the mold must be included in numerical simulations of directional solidification because of its effect on the temperature field and solidification. For the model alloys used in the study, the simulations clearly show the interaction of the convection field with the solidifying alloys to produce a macrosegregation pattern known as "steepling" in sections with a uniform width. Details of the complex convection and segregation patterns at both the contraction and the expansion of the cross-sectional area are revealed by the computer simulations. The convection and solidification through the expansions suggest a possible mechanism for the formation of stray grains. The computer simulations and the experimental castings have been part of ongoing ground-based research with the goal of providing the necessary background for eventual experiments aboard the ISS. For casting practitioners, the results demonstrate that computer simulations should be applied to reveal interactions between alloy solidification properties, solidification conditions, and mold geometries on macrosegregation. The simulations also present the possibility of engineering the mold material to avoid, or mitigate, the effects of thermosolutal convection and macrosegregation by selecting a mold material with suitable thermal properties, especially its thermal conductivity.

  1. The Structure and Properties of Silica Glass Nanostructures using Novel Computational Systems

    NASA Astrophysics Data System (ADS)

    Doblack, Benjamin N.

    The structure and properties of silica glass nanostructures are examined using computational methods in this work. Standard synthesis methods of silica and its associated material properties are first discussed in brief. A review of prior experiments on this amorphous material is also presented. Background and methodology for the simulation of mechanical tests on amorphous bulk silica and nanostructures are later presented. A new computational system for the accurate and fast simulation of silica glass is also presented, using an appropriate interatomic potential for this material within the open-source molecular dynamics program LAMMPS. This alternative computational method uses modern graphics processors, Nvidia CUDA technology, and specialized scientific codes to overcome processing-speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model select materials, this enhancement allows the addition of accelerated molecular dynamics simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling, and analysis. The research goal of this project is to investigate the structure and size-dependent mechanical properties of silica glass nanohelical structures under tensile molecular dynamics conditions using the innovative computational system. Specifically, silica nanoribbons and nanosprings are evaluated, revealing unique size-dependent elastic moduli when compared to the bulk material. For the nanoribbons, the tensile behavior differed widely between the models simulated, with distinct characteristic extended elastic regions. In the case of the nanosprings, clearer trends are observed: larger nanospring wire cross-sectional radii (r) lead to larger Young's moduli, while larger helical diameters (2R) result in smaller Young's moduli. Structural transformations and theoretical models are also analyzed to identify possible factors which might affect the mechanical response of silica nanostructures under tension. The work presented outlines an innovative simulation methodology and discusses how results can be validated against prior experimental and simulation findings. The ultimate goal is to develop new computational methods for the study of nanostructures which will make the field of materials science more accessible, cost effective, and efficient.

  2. Intrinsic map dynamics exploration for uncharted effective free-energy landscapes

    PubMed Central

    Covino, Roberto; Coifman, Ronald R.; Gear, C. William; Georgiou, Anastasia S.; Kevrekidis, Ioannis G.

    2017-01-01

    We describe and implement a computer-assisted approach for accelerating the exploration of uncharted effective free-energy surfaces (FESs). More generally, the aim is the extraction of coarse-grained, macroscopic information from stochastic or atomistic simulations, such as molecular dynamics (MD). The approach functionally links the MD simulator with nonlinear manifold learning techniques. The added value comes from biasing the simulator toward unexplored phase-space regions by exploiting the smoothness of the gradually revealed intrinsic low-dimensional geometry of the FES. PMID:28634293

  3. Growth Dynamics of Information Search Services

    ERIC Educational Resources Information Center

    Lindquist, Mats G.

    1978-01-01

    An analysis of computer-based information search services (ISSs) from a systems viewpoint, using a continuous simulation model to reveal the growth and stagnation of a typical system, is presented, together with an analysis of decision making for an ISS. (Author/MBR)

  4. Scaling a Convection-Resolving RCM to Near-Global Scales

    NASA Astrophysics Data System (ADS)

    Leutwyler, D.; Fuhrer, O.; Chadha, T.; Kwasniewski, G.; Hoefler, T.; Lapillonne, X.; Lüthi, D.; Osuna, C.; Schar, C.; Schulthess, T. C.; Vogt, H.

    2017-12-01

    In recent years, the first decade-long kilometer-scale resolution RCM simulations have been performed on continental-scale computational domains. However, the planet Earth is still an order of magnitude larger, and thus the computational implications of performing global climate simulations at this resolution are challenging. We explore the gap between currently established RCM simulations and global simulations by scaling the GPU-accelerated version of the COSMO model to a near-global computational domain. To this end, the evolution of an idealized moist baroclinic wave has been simulated over the course of 10 days with a grid spacing of down to 930 m. The computational mesh employs 36'000 x 16'001 x 60 grid points and covers 98.4% of the planet's surface. The code shows perfect weak scaling up to 4'888 nodes of the Piz Daint supercomputer and yields 0.043 simulated years per day (SYPD), which is approximately one seventh of the 0.2-0.3 SYPD required to conduct AMIP-type simulations. However, at half the resolution (1.9 km) we observed 0.23 SYPD. Besides the formation of frontal precipitating systems containing embedded explicitly resolved convective motions, the simulations reveal a secondary instability that leads to cut-off warm-core cyclonic vortices in the cyclone's core once the grid spacing is refined to the kilometer scale. The explicit representation of embedded moist convection and the representation of previously unresolved instabilities exhibit physically different behavior in comparison to coarser-resolution simulations. The study demonstrates that global climate simulations using kilometer-scale resolution are imminent and serves as a baseline benchmark for global climate model applications and future exascale supercomputing systems.

  5. Simulation of turbulent separated flows using a novel, evolution-based, eddy-viscosity formulation

    NASA Astrophysics Data System (ADS)

    Castellucci, Paul

    Currently, there exists a lack of confidence in the computational simulation of turbulent separated flows at large Reynolds numbers. The most accurate methods available are too computationally costly to use in engineering applications. Thus, inexpensive models, developed using the Reynolds-averaged Navier-Stokes (RANS) equations, are often extended beyond their applicability. Although these methods will often reproduce integrated quantities within engineering tolerances, such metrics are often insensitive to details within a separated wake, and therefore, poor indicators of simulation fidelity. Using concepts borrowed from large-eddy simulation (LES), a two-equation RANS model is modified to simulate the turbulent wake behind a circular cylinder. This modification involves the computation of one additional scalar field, adding very little to the overall computational cost. When properly inserted into the baseline RANS model, this modification mimics LES in the separated wake, yet reverts to the unmodified form at the cylinder surface. In this manner, superior predictive capability may be achieved without the additional cost of fine spatial resolution associated with LES near solid boundaries. Simulations using modified and baseline RANS models are benchmarked against both LES and experimental data for a circular cylinder wake at Reynolds number 3900. In addition, the computational tool used in this investigation is subject to verification via the Method of Manufactured Solutions. Post-processing of the resultant flow fields includes both mean value and triple-decomposition analysis. These results reveal substantial improvements using the modified system and appear to drive the baseline wake solution toward that of LES, as intended.

  6. Computational Aerodynamic Simulations of a 1215 ft/sec Tip Speed Transonic Fan System Model for Acoustic Methods Assessment and Development

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2014-01-01

    Computational Aerodynamic simulations of a 1215 ft/sec tip speed transonic fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone extensive experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating points simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, which for this model did not include a split flow path with core and bypass ducts. As a result, it was only necessary to adjust fan rotational speed in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measurement values. Computed blade row flow fields at all fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the flow fields at all operating conditions reveals no excessive boundary layer separations or related secondary-flow problems.

  7. Quantitative computational infrared imaging of buoyant diffusion flames

    NASA Astrophysics Data System (ADS)

    Newale, Ashish S.

    Studies of infrared radiation from turbulent buoyant diffusion flames impinging on structural elements have applications to the development of fire models. A numerical and experimental study of radiation from buoyant diffusion flames, with and without impingement on a flat plate, is reported. Quantitative images of the radiation intensity from the flames are acquired using a high-speed infrared camera. Large eddy simulations are performed using the Fire Dynamics Simulator (FDS, version 6). The species concentrations and temperatures from the simulations are used in conjunction with a narrow-band radiation model (RADCAL) to solve the radiative transfer equation. The computed infrared radiation intensities are rendered in the form of images and compared with the measurements. The measured and computed radiation intensities reveal necking and bulging with a characteristic frequency of 7.1 Hz, which is in agreement with previous empirical correlations. The results demonstrate the effects of the stagnation-point boundary layer on the upstream buoyant shear layer. The coupling between these two shear layers presents a model problem for the sub-grid scale modeling necessary for future large eddy simulations.

  8. Unsteady 3D flow simulations in cranial arterial tree

    NASA Astrophysics Data System (ADS)

    Grinberg, Leopold; Anor, Tomer; Madsen, Joseph; Karniadakis, George

    2008-11-01

    High resolution unsteady 3D flow simulations in major cranial arteries have been performed. Two cases were considered: 1) a healthy volunteer with a complete Circle of Willis (CoW); and 2) a patient with hydrocephalus and an incomplete CoW. Computation was performed on 3344 processors of the new half-petaflop supercomputer at TACC. Two new numerical approaches were developed and implemented: 1) a new two-level domain decomposition method, which couples continuous and discontinuous Galerkin discretizations of the computational domain; and 2) a new type of outflow boundary condition, which imposes, in an accurate and computationally efficient manner, clinically measured flow rates. In the first simulation, a geometric model of 65 cranial arteries was reconstructed. Our simulation reveals a high degree of asymmetry in the flow in the left and right parts of the CoW and the presence of swirling flow in most of the CoW arteries. In the second simulation, one of the main findings was a high pressure drop at the right posterior communicating artery (PCA). Due to the incompleteness of the CoW and the pressure drop at the PCA, the right internal carotid artery supplies blood to most regions of the brain.

  9. Analysis of vibrational-translational energy transfer using the direct simulation Monte Carlo method

    NASA Technical Reports Server (NTRS)

    Boyd, Iain D.

    1991-01-01

    A new model is proposed for energy transfer between the vibrational and translational modes for use in the direct simulation Monte Carlo method (DSMC). The model modifies the Landau-Teller theory for a harmonic oscillator, and the transition rate is related to an experimental correlation for the vibrational relaxation time. Assessment of the model is made with respect to three different computations: relaxation in a heat bath, a one-dimensional shock wave, and hypersonic flow over a two-dimensional wedge. These studies verify that the model achieves detailed balance, and excellent agreement with experimental data is obtained in the shock wave calculation. The wedge flow computation reveals that the usual phenomenological method for simulating vibrational nonequilibrium in the DSMC technique predicts much higher vibrational temperatures in the wake region.
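For intuition, the Landau-Teller picture that the model modifies describes relaxation of the vibrational temperature toward the translational (bath) value at a rate set by the relaxation time tau. A minimal continuum sketch of heat-bath relaxation (not the particle-based DSMC procedure itself, and with arbitrary illustrative values for the temperatures and tau):

```python
# Landau-Teller relaxation of vibrational temperature toward the
# translational bath temperature: dT_v/dt = (T - T_v) / tau.
def relax(T_bath, T_v0, tau, dt, steps):
    """Explicit-Euler integration of the Landau-Teller rate equation."""
    T_v = T_v0
    history = []
    for _ in range(steps):
        T_v += dt * (T_bath - T_v) / tau   # relax toward T_bath
        history.append(T_v)
    return history

# Integrate for 20 relaxation times: T_v should equilibrate with the bath,
# the continuum analogue of the detailed-balance check in the abstract.
hist = relax(T_bath=5000.0, T_v0=1000.0, tau=1e-5, dt=1e-7, steps=2000)
print(round(hist[-1], 1))
```

In DSMC, the same relaxation emerges statistically from per-collision vibrational exchange probabilities rather than from integrating this ODE directly.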

  10. Advances in Visualization of 3D Time-Dependent CFD Solutions

    NASA Technical Reports Server (NTRS)

    Lane, David A.; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Numerical simulations of complex 3D time-dependent (unsteady) flows are becoming increasingly feasible because of the progress in computing systems. Unfortunately, many existing flow visualization systems were developed for time-independent (steady) solutions and do not adequately depict solutions from unsteady flow simulations. Furthermore, most systems only handle one time step of the solutions individually and do not consider the time-dependent nature of the solutions. For example, instantaneous streamlines are computed by tracking the particles using one time step of the solution. However, for streaklines and timelines, particles need to be tracked through all time steps. Streaklines can reveal quite different information about the flow than those revealed by instantaneous streamlines. Comparisons of instantaneous streamlines with dynamic streaklines are shown. For a complex 3D flow simulation, it is common to generate a grid system with several millions of grid points and to have tens of thousands of time steps. The disk requirement for storing the flow data can easily be tens of gigabytes. Visualizing solutions of this magnitude is a challenging problem with today's computer hardware technology. Even interactive visualization of one time step of the flow data can be a problem for some existing flow visualization systems because of the size of the grid. Current approaches for visualizing complex 3D time-dependent CFD solutions are described. The flow visualization system developed at NASA Ames Research Center to compute time-dependent particle traces from unsteady CFD solutions is described. The system computes particle traces (streaklines) by integrating through the time steps. This system has been used by several NASA scientists to visualize their CFD time-dependent solutions. The flow visualization capabilities of this system are described, and visualization results are shown.
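The distinction drawn here between instantaneous streamlines and streaklines can be illustrated with a toy unsteady field: a streamline integrates through a single frozen time step, while a streakline advects particles released from a fixed seed through all time steps. A minimal sketch with a hypothetical analytic velocity field (real CFD tracing would interpolate gridded solution data instead):

```python
import math

def velocity(x, y, t):
    """Toy unsteady 2D velocity field: constant x-velocity, oscillating
    y-velocity. Stands in for interpolated CFD solution data."""
    return 1.0, math.sin(t)

def streamline(seed, t_frozen, dt, steps):
    """Instantaneous streamline: integrate through ONE frozen time step."""
    x, y = seed
    pts = [(x, y)]
    for _ in range(steps):
        u, v = velocity(x, y, t_frozen)   # field frozen at t_frozen
        x, y = x + u * dt, y + v * dt
        pts.append((x, y))
    return pts

def streakline(seed, t_end, dt):
    """Streakline: release a particle from the seed each step and advect
    all released particles through the time-varying field."""
    particles, t = [], 0.0
    while t < t_end:
        particles.append(list(seed))
        for p in particles:
            u, v = velocity(p[0], p[1], t)
            p[0] += u * dt
            p[1] += v * dt
        t += dt
    return [tuple(p) for p in particles]

sl = streamline((0.0, 0.0), t_frozen=0.0, dt=0.01, steps=200)
sk = streakline((0.0, 0.0), t_end=2.0, dt=0.01)
print(sl[-1], sk[0])   # frozen-field endpoint vs oldest streakline particle
```

With the field frozen at t=0 the streamline never sees the oscillating y-velocity, while the oldest streakline particle accumulates it, so the two traces reveal quite different flow information, as the abstract notes.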

  11. Improving Simulated Annealing by Replacing Its Variables with Game-Theoretic Utility Maximizers

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theory field of COllective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved as a side-effect. Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting significantly improves simulated annealing for a model of an economic process run over an underlying small-worlds topology. Furthermore, these experiments reveal novel small-worlds phenomena, and highlight the shortcomings of conventional mechanism design in bounded rationality domains.
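The baseline that the COIN recasting modifies is standard simulated annealing: always accept downhill moves, accept uphill moves with probability exp(-delta/T), and cool T over time. A minimal sketch on a toy multimodal objective (the objective, schedule, and parameters are illustrative; the paper's game-theoretic modification is not shown):

```python
import math
import random

def simulated_annealing(f, x0, T0, alpha, steps, rng):
    """Standard simulated annealing on a 1D objective: accept uphill
    moves with probability exp(-delta/T), cooling T geometrically."""
    x, fx, T = x0, f(x0), T0
    best_x, best_f = x, fx
    for _ in range(steps):
        cand = x + rng.uniform(-1.0, 1.0)       # local proposal
        fc = f(cand)
        if fc <= fx or rng.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        T *= alpha                              # geometric cooling
    return best_x, best_f

rng = random.Random(42)
# Toy multimodal objective, bounded below by 0 since f(x) >= x^2.
f = lambda x: x * x + 2.0 * math.sin(5.0 * x) + 2.0
best_x, best_f = simulated_annealing(f, x0=4.0, T0=5.0,
                                     alpha=0.999, steps=5000, rng=rng)
print(round(best_x, 2), round(best_f, 3))
```

The COIN modification replaces each variable's blind proposal step with a self-interested player, which is where the claimed improvement in exploration comes from.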

  12. Simulation of Nonlinear Instabilities in an Attachment-Line Boundary Layer

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.

    1996-01-01

    The linear and the nonlinear stability of disturbances that propagate along the attachment line of a three-dimensional boundary layer is considered. The spatially evolving disturbances in the boundary layer are computed by direct numerical simulation (DNS) of the unsteady, incompressible Navier-Stokes equations. Disturbances are introduced either by forcing at the inflow or by applying suction and blowing at the wall. Quasi-parallel linear stability theory and a nonparallel theory yield notably different stability characteristics for disturbances near the critical Reynolds number; the DNS results confirm the latter theory. Previously, a weakly nonlinear theory and computations revealed a high wave-number region of subcritical disturbance growth. More recent computations have failed to achieve this subcritical growth. The present computational results indicate the presence of subcritically growing disturbances; the results support the weakly nonlinear theory. Furthermore, an explanation is provided for the previous theoretical and computational discrepancy. In addition, the present results demonstrate that steady suction can be used to stabilize disturbances that otherwise grow subcritically along the attachment line.

  13. (Extreme) Core-collapse Supernova Simulations

    NASA Astrophysics Data System (ADS)

    Mösta, Philipp

    2017-01-01

    In this talk I will present recent progress on modeling core-collapse supernovae with massively parallel simulations on the largest supercomputers available. I will discuss the unique challenges in both input physics and computational modeling that come with a problem involving all four fundamental forces and relativistic effects and will highlight recent breakthroughs overcoming these challenges in full 3D simulations. I will pay particular attention to how these simulations can be used to reveal the engines driving some of the most extreme explosions and conclude by discussing what remains to be done in simulation work to maximize what we can learn from current and future time-domain astronomy transient surveys.

  14. Convergence of sampling in protein simulations

    NASA Astrophysics Data System (ADS)

    Hess, Berk

    2002-03-01

    With molecular dynamics, protein dynamics can be simulated in atomic detail. Current computers are not fast enough to probe all available conformations, but fluctuations around one conformation can be sampled to a reasonable extent. The motions with the largest fluctuations can be filtered out of a simulation using covariance or principal component analysis. A problem with this analysis is that random diffusion can appear as correlated motion. An analysis is presented of how long a simulation should be to obtain relevant results for global motions. The analysis reveals that the cosine content of the principal components is a good indicator of bad sampling.
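
    The cosine-content indicator mentioned above has a simple closed form (Hess, 2002): for the projection p_i(t) of the trajectory onto principal component i over a window of length T, c_i = (2/T) (∫ cos(iπt/T) p_i(t) dt)² / ∫ p_i(t)² dt, which equals 1 when the component is exactly the cosine that pure random diffusion produces. A minimal discrete version (our sketch, not the author's code):

```python
import numpy as np

def cosine_content(p, i=1):
    """Cosine content c_i of a principal-component projection p(t),
    sampled at T uniform times.  Values near 1 indicate the component
    is indistinguishable from the cosine shape of random diffusion,
    i.e. the simulation is too short to resolve that motion."""
    T = len(p)
    cos_t = np.cos(i * np.pi * np.arange(T) / T)
    return (2.0 / T) * np.dot(cos_t, p) ** 2 / np.dot(p, p)
```

    A value close to 1 for the first few components is the bad-sampling signature the abstract refers to.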

  15. Modelling NOX concentrations through CFD-RANS in an urban hot-spot using high resolution traffic emissions and meteorology from a mesoscale model

    NASA Astrophysics Data System (ADS)

    Sanchez, Beatriz; Santiago, Jose Luis; Martilli, Alberto; Martin, Fernando; Borge, Rafael; Quaassdorff, Christina; de la Paz, David

    2017-08-01

    Air quality management requires more detailed studies of air pollution at urban and local scales over long periods of time. This work focuses on obtaining the spatial distribution of NOx concentration averaged over several days in a heavily trafficked urban area of Madrid (Spain) using a computational fluid dynamics (CFD) model. A methodology based on a weighted average of CFD simulations is applied, computing the time evolution of NOx dispersion as a sequence of steady-state scenarios that takes the actual atmospheric conditions into account. Emission inputs are estimated with a traffic emission model, and the meteorological information is derived from a mesoscale model. The computed concentration map correlates well with measurements from 72 passive samplers deployed in the research area. This work reveals the potential of combining urban mesoscale simulations with detailed traffic emissions to provide accurate maps of pollutant concentration at the microscale using CFD simulations.

  16. Mechanical unfolding reveals stable 3-helix intermediates in talin and α-catenin

    PubMed Central

    2018-01-01

    Mechanical stability is a key feature in the regulation of structural scaffolding proteins and their functions. Despite the abundance of α-helical structures among the human proteome and their undisputed importance in health and disease, the fundamental principles of their behavior under mechanical load are poorly understood. Talin and α-catenin are two key molecules in focal adhesions and adherens junctions, respectively. In this study, we used a combination of atomistic steered molecular dynamics (SMD) simulations, polyprotein engineering, and single-molecule atomic force microscopy (smAFM) to investigate unfolding of these proteins. SMD simulations revealed that talin rod α-helix bundles as well as α-catenin α-helix domains unfold through stable 3-helix intermediates. While the 5-helix bundles were found to be mechanically stable, a second stable conformation corresponding to the 3-helix state was revealed. Mechanically weaker 4-helix bundles easily unfolded into a stable 3-helix conformation. The results of smAFM experiments were in agreement with the findings of the computational simulations. The disulfide clamp mutants, designed to protect the stable state, support the 3-helix intermediate model in both experimental and computational setups. As a result, multiple discrete unfolding intermediate states in the talin and α-catenin unfolding pathway were discovered. Better understanding of the mechanical unfolding mechanism of α-helix proteins is a key step towards comprehensive models describing the mechanoregulation of proteins. PMID:29698481

  17. Spatial smoothing coherence factor for ultrasound computed tomography

    NASA Astrophysics Data System (ADS)

    Lou, Cuijuan; Xu, Mengling; Ding, Mingyue; Yuchi, Ming

    2016-04-01

    In recent years, many research studies have been carried out on ultrasound computed tomography (USCT) for its application prospect in early diagnosis of breast cancer. This paper applies four coherence-factor-like beamforming methods to improve the image quality of the synthetic aperture focusing method for USCT: the coherence factor (CF), the phase coherence factor (PCF), the sign coherence factor (SCF) and the spatial smoothing coherence factor (SSCF, proposed in our previous work). The performance of these methods was tested with simulated raw data generated by the ultrasound simulation software PZFlex 2014. The simulated phantom was water of 4 cm diameter with three nylon objects of different diameters inside. The ring-type transducer had 72 elements with a center frequency of 1 MHz. The results show that all the methods can reveal the biggest nylon circle, with a radius of 2.5 mm. SSCF achieves the highest SNR among the proposed methods and provides a more homogeneous background. None of the methods can reveal the two smaller nylon circles, with radii of 0.75 mm and 0.25 mm. This may be due to the small number of elements.
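
    Of the weightings compared, the plain coherence factor has a standard definition: for N delay-aligned channel samples s_n, CF = |Σ_n s_n|² / (N Σ_n |s_n|²), ranging from 1 for perfectly coherent echoes to near 0 for incoherent ones. A minimal sketch of that baseline follows (PCF, SCF and SSCF add refinements not shown here; this is our illustration, not the authors' implementation):

```python
import numpy as np

def coherence_factor(channels):
    """Coherence factor per time sample for delay-aligned RF data.
    channels: (N, T) array, one row per transducer element.
    Returns CF(t) in [0, 1]; multiplying the beamsum by CF suppresses
    incoherent (off-focus) contributions."""
    n = channels.shape[0]
    coherent = np.abs(channels.sum(axis=0)) ** 2
    incoherent = n * (np.abs(channels) ** 2).sum(axis=0)
    cf = np.zeros_like(coherent)
    np.divide(coherent, incoherent, out=cf, where=incoherent > 0)
    return cf
```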

  18. Computational Assay of H7N9 Influenza Neuraminidase Reveals R292K Mutation Reduces Drug Binding Affinity

    NASA Astrophysics Data System (ADS)

    Woods, Christopher J.; Malaisree, Maturos; Long, Ben; McIntosh-Smith, Simon; Mulholland, Adrian J.

    2013-12-01

    The emergence of a novel H7N9 avian influenza that infects humans is a serious cause for concern. Of the genome sequences of H7N9 neuraminidase available, one contains a substitution of arginine to lysine at position 292, suggesting a potential for reduced drug binding efficacy. We have performed molecular dynamics simulations of oseltamivir, zanamivir and peramivir bound to H7N9, H7N9-R292K, and a structurally related H11N9 neuraminidase. They show that H7N9 neuraminidase is structurally homologous to H11N9, binding the drugs in identical modes. The simulations reveal that the R292K mutation disrupts drug binding in H7N9 in a comparable manner to that observed experimentally for H11N9-R292K. Absolute binding free energy calculations with the WaterSwap method confirm a reduction in binding affinity. This indicates that the efficacy of antiviral drugs against H7N9-R292K will be reduced. Simulations can assist in predicting disruption of binding caused by mutations in neuraminidase, thereby providing a computational `assay.'

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aliaga, José I., E-mail: aliaga@uji.es; Alonso, Pedro; Badía, José M.

    We introduce a new iterative Krylov subspace-based eigensolver for the simulation of macromolecular motions on desktop multithreaded platforms equipped with multicore processors and, possibly, a graphics accelerator (GPU). The method consists of two stages, with the original problem first reduced into a simpler band-structured form by means of a high-performance compute-intensive procedure. This is followed by a memory-intensive but low-cost Krylov iteration, which is off-loaded to the GPU by means of an efficient data-parallel kernel. The experimental results reveal the performance of the new eigensolver. Concretely, when it is applied to the simulation of macromolecules with a few thousand degrees of freedom and the number of eigenpairs to be computed is small to moderate, the new solver outperforms other methods implemented as part of high-performance numerical linear algebra packages for multithreaded architectures.

  20. An experimental phylogeny to benchmark ancestral sequence reconstruction

    PubMed Central

    Randall, Ryan N.; Radford, Caelan E.; Roof, Kelsey A.; Natarajan, Divya K.; Gaucher, Eric A.

    2016-01-01

    Ancestral sequence reconstruction (ASR) is a still-burgeoning method that has revealed many key mechanisms of molecular evolution. One criticism of the approach is an inability to validate its algorithms within a biological context as opposed to a computer simulation. Here we build an experimental phylogeny using the gene of a single red fluorescent protein to address this criticism. The evolved phylogeny consists of 19 operational taxonomic units (leaves) and 17 ancestral bifurcations (nodes) that display a wide variety of fluorescent phenotypes. The 19 leaves then serve as ‘modern' sequences that we subject to ASR analyses using various algorithms and benchmark against the known ancestral genotypes and phenotypes. We confirm computer simulations that show all algorithms infer ancient sequences with high accuracy, yet we also reveal wide variation in the phenotypes encoded by incorrectly inferred sequences. Specifically, Bayesian methods incorporating rate variation significantly outperform the maximum parsimony criterion in phenotypic accuracy. Subsampling of extant sequences had a minor effect on the inference of ancestral sequences. PMID:27628687

  1. Mushroom (Agaricus bisporus) polyphenoloxidase inhibited by apigenin: Multi-spectroscopic analyses and computational docking simulation.

    PubMed

    Xiong, Zhiqiang; Liu, Wei; Zhou, Lei; Zou, Liqiang; Chen, Jun

    2016-07-15

    It has been revealed that some polyphenols can prevent the enzymatic browning caused by polyphenoloxidase (PPO). Apigenin, widely distributed in many fruits and vegetables, is an important bioactive flavonoid compound. In this study, apigenin exhibited strong inhibitory activity against PPO, and some reagents had a synergistic effect with apigenin in inhibiting PPO. Apigenin inhibited PPO activity reversibly in a mixed-type manner. The fact that the inactivation rate constant (k) of PPO increased while the activation energy (Ea) and thermodynamic parameters (ΔG, ΔH and ΔS) decreased indicated that the thermosensitivity and stability of PPO decreased. The conformational changes of PPO were revealed by fluorescence emission spectra and circular dichroism. Atomic force microscopy observation suggested that PPO molecules were larger in dimension after interacting with apigenin. Moreover, computational docking simulation indicated that apigenin bound to PPO, inserting into its hydrophobic cavity and interacting with several amino acid residues. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Hydrodynamic Simulations of Protoplanetary Disks with GIZMO

    NASA Astrophysics Data System (ADS)

    Rice, Malena; Laughlin, Greg

    2018-01-01

    Over the past several decades, the field of computational fluid dynamics has rapidly advanced as the range of available numerical algorithms and computationally feasible physical problems has expanded. The development of modern numerical solvers has provided a compelling opportunity to reconsider previously obtained results in search for yet undiscovered effects that may be revealed through longer integration times and more precise numerical approaches. In this study, we compare the results of past hydrodynamic disk simulations with those obtained from modern analytical resources. We focus our study on the GIZMO code (Hopkins 2015), which uses meshless methods to solve the homogeneous Euler equations of hydrodynamics while eliminating problems arising as a result of advection between grid cells. By comparing modern simulations with prior results, we hope to provide an improved understanding of the impact of fluid mechanics upon the evolution of protoplanetary disks.

  3. Low-Order Modeling of Internal Heat Transfer in Biomass Particle Pyrolysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiggins, Gavin M.; Ciesielski, Peter N.; Daw, C. Stuart

    2016-06-16

    We present a computationally efficient, one-dimensional simulation methodology for biomass particle heating under conditions typical of fast pyrolysis. Our methodology is based on identifying the rate-limiting geometric and structural factors for conductive heat transport in biomass particle models with realistic morphology to develop low-order approximations that behave appropriately. Comparisons of transient temperature trends predicted by our one-dimensional method with three-dimensional simulations of woody biomass particles reveal good agreement, if the appropriate equivalent spherical diameter and bulk thermal properties are used. We conclude that, for particle sizes and heating regimes typical of fast pyrolysis, it is possible to simulate biomass particle heating with reasonable accuracy and minimal computational overhead, even when variable size, aspherical shape, anisotropic conductivity, and complex, species-specific internal pore geometry are incorporated.
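
    The low-order model described is essentially transient conduction in an equivalent sphere, ∂T/∂t = α (1/r²) ∂/∂r (r² ∂T/∂r). An explicit finite-difference sketch of that equation (our illustration with arbitrary parameters, not the authors' code):

```python
import numpy as np

def sphere_heating(alpha, R, T0, Tsurf, t_end, nr=50):
    """Explicit finite differences for dT/dt = alpha*(1/r^2) d/dr(r^2 dT/dr)
    on a sphere of radius R: uniform initial temperature T0, surface held
    at Tsurf (Dirichlet), symmetry at the centre.  Returns T(r) at t_end."""
    dr = R / (nr - 1)
    r = np.linspace(0.0, R, nr)
    dt = 0.1 * dr * dr / alpha          # well inside the explicit stability limit
    T = np.full(nr, T0, dtype=float)
    for _ in range(int(t_end / dt)):
        Tn = T.copy()
        # interior nodes: central differences, with the spherical 2/r term
        Tn[1:-1] = T[1:-1] + alpha * dt * (
            (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dr**2
            + (2.0 / r[1:-1]) * (T[2:] - T[:-2]) / (2.0 * dr)
        )
        Tn[0] = T[0] + alpha * dt * 6.0 * (T[1] - T[0]) / dr**2  # r = 0 symmetry
        Tn[-1] = Tsurf
        T = Tn
    return T
```

    At a Fourier number αt/R² of order 1 the interior has essentially equilibrated with the surface, which is the regime in which a single equivalent spherical diameter can capture the heating history.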

  4. Computational prediction of hemolysis in a centrifugal ventricular assist device.

    PubMed

    Pinotti, M; Rosa, E S

    1995-03-01

    This paper describes the use of computational fluid dynamics (CFD) to predict numerically the hemolysis in centrifugal pumps. A numerical hydrodynamic model, based on the full Navier-Stokes equations, was used to obtain the flow in a vaneless centrifugal pump (of the corotating-disk type). After proper postprocessing, critical zones in the channel were identified by means of two-dimensional color-coded maps of %Hb release. Simulation of different conditions revealed that flow behavior at the entrance region of the channel is the main cause of blood trauma in such devices. A useful feature resulting from the CFD simulation is the visualization of critical flow zones that are impossible to determine experimentally with in vitro hemolysis tests.

  5. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education categorized by three different learning frameworks and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of reviewed literature, I proposed effective approaches to integrate computer simulations in physics education. These approaches are consistent with well established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research based approaches to integrated computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research based computer simulations developed by the physics education research group at University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. 
After receiving model reasoning online, students were asked to submit their revised answers electronically. Students in the TRAD group were not granted access to the CLCS material and followed their normal classroom routine. At the end of the study, both the CLCS and TRAD students took a post-test. Questions on the post-test were divided into "what" questions, "how" questions, and an open response question. Analysis of students' post-test performance showed mixed results. While the TRAD students scored higher on the "what" questions, the CLCS students scored higher on the "how" questions and the open response question. This result suggested that more TRAD students knew what kinds of conditions may or may not cause electromagnetic induction without understanding how electromagnetic induction works. Analysis of the CLCS students' learning also suggested that frequent disruption and technical trouble might pose threats to the effectiveness of the CLCS learning framework. Despite the mixed post-test results, the study revealed some limitations of the CLCS learning framework in promoting conceptual understanding in physics. Improvement can be made by providing students with the background knowledge necessary to understand model reasoning and by combining the CLCS learning framework with other learning frameworks to promote the integration of various physics concepts. In addition, the reflective questions in the CLCS learning framework may be refined to better address students' difficulties. Limitations of the study, as well as suggestions for future research, are also presented.

  6. Deformation of Soft Tissue and Force Feedback Using the Smoothed Particle Hydrodynamics

    PubMed Central

    Liu, Xuemei; Wang, Ruiyi; Li, Yunhua; Song, Dongdong

    2015-01-01

    We study the deformation and haptic feedback of soft tissue in virtual surgery based on a liver model, using the PHANTOM OMNI force feedback device developed by SensAble Company in the USA. Although a significant amount of research effort has been dedicated to simulating the behavior of soft tissue and implementing force feedback, it remains a challenging problem. This paper introduces a meshfree method for deformation simulation of soft tissue and force computation based on a viscoelastic mechanical model and smoothed particle hydrodynamics (SPH). Firstly, the viscoelastic model can represent the mechanical characteristics of soft tissue, which greatly promotes realism. Secondly, SPH is a meshless, self-adapting technique, which supplies higher precision than mesh-based methods for force feedback computation. Finally, an SPH method based on a dynamic interaction area is proposed to improve the real-time performance of the simulation. The results reveal that the SPH methodology is suitable for simulating soft tissue deformation and calculating force feedback, and that SPH based on a dynamic local interaction area has significantly higher computational efficiency than standard SPH. Our algorithm has a bright prospect in the area of virtual surgery. PMID:26417380
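
    The SPH interpolation the paper relies on approximates any field quantity by kernel-weighted sums over neighbouring particles; the canonical example is the summation density ρ_i = Σ_j m_j W(x_i − x_j, h). A minimal 1-D sketch with the standard cubic-spline kernel (illustrative only, not the authors' tissue model):

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 1-D cubic-spline SPH smoothing kernel W(r, h), with
    compact support of radius 2h (1-D normalisation sigma = 2/(3h))."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)
    return sigma * np.where(
        q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0),
    )

def sph_density(x, m, h):
    """Summation density rho_i = sum_j m_j W(x_i - x_j, h) for all
    1-D particle positions x and masses m at once."""
    dx = x[:, None] - x[None, :]
    return (m[None, :] * cubic_spline_kernel(dx, h)).sum(axis=1)
```

    Because the kernel has compact support, particles beyond 2h contribute exactly zero and need never be visited; restricting the sum to a neighbourhood is in the spirit of the dynamic local interaction area the abstract describes.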

  7. Spatial interpretation of NASA's Marshall Space Flight Center Payload Operations Control Center using virtual reality technology

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1993-01-01

    In its search for higher-level computer interfaces and more realistic electronic simulations for measurement and spatial analysis in human factors design, NASA at MSFC is evaluating the functionality of virtual reality (VR) technology. Virtual reality simulation generates a three-dimensional environment in which the participant appears to be enveloped. It is a type of interactive simulation in which humans are not only involved, but included. Virtual reality technology is still in the experimental phase, but it appears to be the next logical step after computer-aided three-dimensional animation in transferring the viewer from a passive to an active role in experiencing and evaluating an environment. There is great potential for using this new technology when designing environments for more successful interaction, both with the environment and with another participant in a remote location. At the University of North Carolina, a VR simulation of the planned Sitterson Hall revealed a flaw in the building's design that had not been observed during examination of the more traditional building-plan simulations on paper and on a computer-aided design (CAD) workstation. The virtual environment enables multiple participants in remote locations to come together and interact with one another and with the environment. Each participant is capable of seeing herself and the other participants and of interacting with them within the simulated environment.

  8. Computer Simulations of Small Molecules in Membranes: Insights from Computer Simulations into the Interactions of Small Molecules with Lipid Bilayers

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew; New, Michael H.; Schweighofer, Karl; Wilson, Michael A.; DeVincenzi, Donald L. (Technical Monitor)

    2000-01-01

    Two of Ernest Overton's lasting contributions to biology are the Meyer-Overton relationship between the potency of an anesthetic and its solubility in oil, and the Overton rule which relates the permeability of a membrane to the oil-water partition coefficient of the permeating molecule. A growing body of experimental evidence, however, cannot be reconciled with these theories. In particular, the molecular nature of membranes, unknown to Overton, needs to be included in any description of these phenomena. Computer simulations are ideally suited for providing atomic-level information about the behavior of small molecules in membranes. The authors discuss simulation studies relevant to Overton's ideas. Through simulations it was found that anesthetics tend to concentrate at interfaces and their anesthetic potency correlates better with solubility at the water-membrane interface than with solubility in oil. Simulation studies of membrane permeation revealed the anisotropic nature of the membranes, as evidenced, for example, by the highly nonuniform distribution of free volume in the bilayer. This, in turn, influences the diffusion rates of solutes, which increase with the depth in the membrane. Small solutes tend to move by hopping between voids in the bilayer, and this hopping motion may be responsible for the deviation from the Overton rule of the permeation rates of these molecules.

  9. Modelling and Simulation of Search Engine

    NASA Astrophysics Data System (ADS)

    Nasution, Mahyuddin K. M.

    2017-01-01

    The best tool currently available for accessing information is the search engine. The information space, meanwhile, has its own behaviour; describing it mathematically makes it easier to identify its characteristics. This paper reveals some characteristics of search engines based on a model of a document collection, and then estimates their impact on the reliability of the information returned. We derive characteristics of search engines from lemmas and theorems about singletons and doubletons, then compute these characteristics statistically to simulate the use of two search engines, Google and Yahoo. The two search engines behave differently in practice, although in theory both rest on the same concept of a document collection.

  10. Computational Aerodynamic Simulations of an 840 ft/sec Tip Speed Advanced Ducted Propulsor Fan System Model for Acoustic Methods Assessment and Development

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2014-01-01

    Computational aerodynamic simulations of an 840 ft/sec tip speed, Advanced Ducted Propulsor fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone extensive experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center, resulting in high-quality, detailed aerodynamic and acoustic measurement data. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating conditions simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, excluding a long core duct section downstream of the core inlet guide vane. As a result, only the fan rotational speed and the system bypass ratio, set by specifying the static pressure downstream of the core inlet guide vane row, were adjusted in order to set the fan operating point, leading to operating points that lie on a fan operating line and making the mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measured values. The computed blade row flow fields for all five fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the computed flow fields reveals no excessive boundary layer separations or related secondary-flow problems.
A few spanwise comparisons between computational and measurement data in the bypass duct show that they are in good agreement, thus providing a partial validation of the computational results.

  11. A computational model for epidural electrical stimulation of spinal sensorimotor circuits.

    PubMed

    Capogrosso, Marco; Wenger, Nikolaus; Raspopovic, Stanisa; Musienko, Pavel; Beauparlant, Janine; Bassi Luciani, Lorenzo; Courtine, Grégoire; Micera, Silvestro

    2013-12-04

    Epidural electrical stimulation (EES) of lumbosacral segments can restore a range of movements after spinal cord injury. However, the mechanisms and neural structures through which EES facilitates movement execution remain unclear. Here, we designed a computational model and performed in vivo experiments to investigate the type of fibers, neurons, and circuits recruited in response to EES. We first developed a realistic finite element computer model of rat lumbosacral segments to identify the currents generated by EES. To evaluate the impact of these currents on sensorimotor circuits, we coupled this model with an anatomically realistic axon-cable model of motoneurons, interneurons, and myelinated afferent fibers for antagonistic ankle muscles. Comparisons between computer simulations and experiments revealed the ability of the model to predict EES-evoked motor responses over multiple intensities and locations. Analysis of the recruited neural structures revealed the lack of direct influence of EES on motoneurons and interneurons. Simulations and pharmacological experiments demonstrated that EES engages spinal circuits trans-synaptically through the recruitment of myelinated afferent fibers. The model also predicted the capacity of spatially distinct EES to modulate side-specific limb movements and, to a lesser extent, extension versus flexion. These predictions were confirmed during standing and walking enabled by EES in spinal rats. These combined results provide a mechanistic framework for the design of spinal neuroprosthetic systems to improve standing and walking after neurological disorders.

  12. Radiotherapy and chemotherapy change vessel tree geometry and metastatic spread in a small cell lung cancer xenograft mouse tumor model

    PubMed Central

    Bethge, Anja; Schumacher, Udo

    2017-01-01

    Background Tumor vasculature is critical for tumor growth, the formation of distant metastases and the efficiency of radio- and chemotherapy treatments. However, how the vasculature itself is affected during cancer treatment, with regard to metastatic behavior, has not been thoroughly investigated. Therefore, the aim of this study was to analyze the influence of hypofractionated radiotherapy and cisplatin chemotherapy on vessel tree geometry and metastasis formation in a small cell lung cancer xenograft mouse tumor model, in order to investigate the spread of malignant cells under different treatment modalities. Methods The biological data gained during these experiments were fed into our previously developed computer model “Cancer and Treatment Simulation Tool” (CaTSiT) to model the growth of the primary tumor, its metastatic deposits and the influence of the different therapies. Furthermore, we performed quantitative histological analyses to verify our predictions in the xenograft mouse tumor model. Results According to the computer simulation, the number of engrafting cells must vary considerably to explain the different weights of the primary tumors at the end of the experiment. Once a primary tumor is established, the fractal dimension of its vasculature correlates with the tumor size. Furthermore, the fractal dimension of the tumor vasculature changes during treatment, indicating that the therapy affects the blood vessels’ geometry. We corroborated these findings with a quantitative histological analysis showing that blood vessel density is depleted during radiotherapy and cisplatin chemotherapy. The CaTSiT computer model reveals that chemotherapy influences the tumor’s therapeutic susceptibility and its metastatic spreading behavior.
Conclusion Using a system biological approach in combination with xenograft models and computer simulations revealed that the usage of chemotherapy and radiation therapy determines the spreading behavior by changing the blood vessel geometry of the primary tumor. PMID:29107953

  13. Experimental and numerical characterisation of the elasto-plastic properties of bovine trabecular bone and a trabecular bone analogue.

    PubMed

    Kelly, Nicola; McGarry, J Patrick

    2012-05-01

    The inelastic pressure dependent compressive behaviour of bovine trabecular bone is investigated through experimental and computational analysis. Two loading configurations are implemented, uniaxial and confined compression, providing two distinct loading paths in the von Mises-pressure stress plane. Experimental results reveal distinctive yielding followed by a constant nominal stress plateau for both uniaxial and confined compression. Computational simulation of the experimental tests using the Drucker-Prager and Mohr-Coulomb plasticity models fails to capture the confined compression behaviour of trabecular bone. The high pressure developed during confined compression does not result in plastic deformation using these formulations, and a near elastic response is computed. In contrast, the crushable foam plasticity models provide accurate simulation of the confined compression tests, with distinctive yield and plateau behaviour being predicted. The elliptical yield surfaces of the crushable foam formulations in the von Mises-pressure stress plane accurately characterise the plastic behaviour of trabecular bone. Results reveal that the hydrostatic yield stress is equal to the uniaxial yield stress for trabecular bone, demonstrating the importance of accurate characterisation and simulation of the pressure dependent plasticity. It is also demonstrated in this study that a commercially available trabecular bone analogue material, cellular rigid polyurethane foam, exhibits similar pressure dependent yield behaviour, despite having a lower stiffness and strength than trabecular bone. This study provides a novel insight into the pressure dependent yield behaviour of trabecular bone, demonstrating the inadequacy of uniaxial testing alone. For the first time, crushable foam plasticity formulations are implemented for trabecular bone. 
The enhanced understanding of the inelastic behaviour of trabecular bone established in this study will allow for more realistic simulation of orthopaedic device implantation and failure. Copyright © 2011 Elsevier Ltd. All rights reserved.
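The elliptical yield surface in the von Mises-pressure plane described above can be sketched numerically. The following is a minimal illustration only, not the authors' implementation; the 40 MPa yield stresses are hypothetical placeholder values (the study's finding that hydrostatic yield stress equals uniaxial yield stress motivates setting them equal):

```python
import numpy as np

def vm_and_pressure(stress):
    """Split a 3x3 Cauchy stress tensor into von Mises stress q and
    pressure p (p taken positive in compression)."""
    p = -np.trace(stress) / 3.0
    dev = stress + p * np.eye(3)          # deviatoric part
    q = np.sqrt(1.5 * np.sum(dev * dev))  # von Mises equivalent stress
    return q, p

def foam_yield(q, p, q_y, p_y):
    """Simplified elliptical crushable-foam yield function in the q-p plane:
    negative inside the ellipse (elastic), positive outside (yielded)."""
    return (q / q_y) ** 2 + (p / p_y) ** 2 - 1.0

# uniaxial compression at 30 MPa gives q = 30 MPa, p = 10 MPa
q, p = vm_and_pressure(np.diag([-30.0, 0.0, 0.0]))
state = foam_yield(q, p, 40.0, 40.0)      # negative: still elastic
```

Unlike a pure von Mises surface, this ellipse closes in the hydrostatic direction, so a purely hydrostatic (confined) stress state can reach yield, which is the behaviour the Drucker-Prager and Mohr-Coulomb models fail to capture here.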

  14. Toward Theory-Based Instruction in Scientific Problem Solving.

    ERIC Educational Resources Information Center

    Heller, Joan I.; And Others

    Several empirical and theoretical analyses related to scientific problem-solving are reviewed, including: detailed studies of individuals at different levels of expertise, and computer models simulating some aspects of human information processing during problem solving. Analysis of these studies has revealed many facets about the nature of the…

  15. Public Schools Energy Conservation Measures, Report Number 4: Hindman Elementary School, Hindman, Kentucky.

    ERIC Educational Resources Information Center

    American Association of School Administrators, Arlington, VA.

    Presented is a study identifying and evaluating opportunities for decreasing energy use at Hindman Elementary School, Hindman, Kentucky. Methods used in this engineering investigation include building surveys, computer simulations and cost estimates. Findings revealed that modifications to the school's boiler, temperature controls, electrical…

  16. The Dynamics of Information Search Services.

    ERIC Educational Resources Information Center

    Lindquist, Mats G.

    Computer-based information search services (ISSs) of the type that provide online literature searches are analyzed from a systems viewpoint using a continuous simulation model. The methodology applied is "system dynamics," and the system language is DYNAMO. The analysis reveals that the observed growth and stagnation of a typical ISS can…

  17. The Influence of Simulated Home and Neighbourhood Densification on Perceived Liveability

    ERIC Educational Resources Information Center

    Thomas, J. A.; Walton, D.; Lamb, S.

    2011-01-01

    This study experimentally manipulated neighbourhood density and home location to reveal the effect of these changes on perceived liveability. Two hypothetical scenarios were provided to 106 households using a Computer-Aided Personal Interview (CAPI). The first scenario examined a densification of the participant's current property, and the second…

  18. Li-Doped Ionic Liquid Electrolytes: From Bulk Phase to Interfacial Behavior

    NASA Technical Reports Server (NTRS)

    Haskins, Justin B.; Lawson, John W.

    2016-01-01

    Ionic liquids have been proposed as candidate electrolytes for high-energy density, rechargeable batteries. We present an extensive computational analysis supported by experimental comparisons of the bulk and interfacial properties of a representative set of these electrolytes as a function of Li-salt doping. We begin by investigating the bulk electrolyte using quantum chemistry and ab initio molecular dynamics to elucidate the solvation structure of Li(+). MD simulations using the polarizable force field of Borodin and coworkers were then performed, from which we obtain an array of thermodynamic and transport properties. Excellent agreement is found with experiments for diffusion, ionic conductivity, and viscosity. Combining MD simulations with electronic structure computations, we computed the electrochemical window of the electrolytes across a range of Li(+)-doping levels and comment on the role of the liquid environment. Finally, we performed a suite of simulations of these Li-doped electrolytes at ideal electrified interfaces to evaluate the differential capacitance and the equilibrium Li(+) distribution in the double layer. The magnitude of differential capacitance is in good agreement with our experiments and exhibits the characteristic camel-shaped profile. In addition, the simulations reveal Li(+) to be highly localized to the second molecular layer of the double layer, which is supported by additional computations that find this layer to be a free energy minimum with respect to Li(+) translation.
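Transport properties such as the Li(+) diffusion coefficient mentioned above are typically extracted from MD trajectories via the Einstein relation, MSD(t) ~ 6Dt in three dimensions. A toy sketch, with a synthetic random walk standing in for an MD trajectory:

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps, dt = 20000, 1.0
# 3D random walk as a stand-in for a single-ion trajectory from an MD run
traj = np.cumsum(rng.normal(0.0, 0.1, size=(n_steps, 3)), axis=0)

def msd(traj, lag):
    """Mean-squared displacement at a given time lag."""
    d = traj[lag:] - traj[:-lag]
    return np.mean(np.sum(d * d, axis=1))

lags = np.arange(10, 200, 10)
msds = np.array([msd(traj, l) for l in lags])
# Einstein relation in 3D: MSD(t) ~ 6 D t, so D comes from the slope
slope = np.polyfit(lags * dt, msds, 1)[0]
D = slope / 6.0
```

For the step size used here the analytic answer is D = 0.1**2 * 3 / 6 = 0.005 in the walk's reduced units; a real analysis would also average over all ions and check for the diffusive (linear) regime before fitting.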

  19. Preconditioning for Numerical Simulation of Low Mach Number Three-Dimensional Viscous Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.; Chima, Rodrick V.; Turkel, Eli

    1997-01-01

    A preconditioning scheme has been implemented into a three-dimensional viscous computational fluid dynamics code for turbomachine blade rows. The preconditioning allows the code, originally developed for simulating compressible flow fields, to be applied to nearly-incompressible, low Mach number flows. A brief description is given of the compressible Navier-Stokes equations for a rotating coordinate system, along with the preconditioning method employed. Details about the conservative formulation of artificial dissipation are provided, and different artificial dissipation schemes are discussed and compared. The preconditioned code was applied to a well-documented case involving the NASA large low-speed centrifugal compressor for which detailed experimental data are available for comparison. Performance and flow field data are compared for the near-design operating point of the compressor, with generally good agreement between computation and experiment. Further, significant differences between computational results for the different numerical implementations, revealing different levels of solution accuracy, are discussed.

  20. A fast, open source implementation of adaptive biasing potentials uncovers a ligand design strategy for the chromatin regulator BRD4

    NASA Astrophysics Data System (ADS)

    Dickson, Bradley M.; de Waal, Parker W.; Ramjan, Zachary H.; Xu, H. Eric; Rothbart, Scott B.

    2016-10-01

    In this communication we introduce an efficient implementation of adaptive biasing that greatly improves the speed of free energy computation in molecular dynamics simulations. We investigated the use of accelerated simulations to inform on compound design using a recently reported and clinically relevant inhibitor of the chromatin regulator BRD4 (bromodomain-containing protein 4). Benchmarking on our local compute cluster, our implementation achieves up to 2.5 times more force calls per day than plumed2. Results of five 1 μs-long simulations are presented, which reveal a conformational switch in the BRD4 inhibitor between a binding competent and incompetent state. Stabilization of the switch led to a -3 kcal/mol improvement of absolute binding free energy. These studies suggest an unexplored ligand design principle and offer new actionable hypotheses for medicinal chemistry efforts against this druggable epigenetic target class.
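The adaptive-biasing idea can be illustrated on a one-dimensional double well. This metadynamics-style sketch (not the paper's implementation, whose biasing scheme and parameters differ) deposits Gaussian hills at visited points so the accumulated bias flattens the landscape and drives barrier crossings:

```python
import numpy as np

rng = np.random.default_rng(1)

def U(x):
    return (x ** 2 - 1.0) ** 2        # double well, minima at x = +/-1

beta, dx = 4.0, 0.1
grid = np.linspace(-2.0, 2.0, 81)
bias = np.zeros_like(grid)            # adaptive biasing potential on a grid

def bias_at(x):
    return np.interp(x, grid, bias)

x, visits = -1.0, []
for step in range(20000):
    # Metropolis move on the biased potential U + V_bias
    xn = x + rng.normal(0.0, dx)
    dE = (U(xn) + bias_at(xn)) - (U(x) + bias_at(x))
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        x = xn
    if step % 10 == 0:                # deposit a small Gaussian hill at x
        bias += 0.01 * np.exp(-(grid - x) ** 2 / (2 * 0.2 ** 2))
    visits.append(x)

visits = np.array(visits)             # trajectory now samples both wells
```

The free energy estimate is (up to a constant) the negative of the accumulated bias; production implementations like the one benchmarked against plumed2 add well-tempered damping and careful grid handling.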

  1. A fast, open source implementation of adaptive biasing potentials uncovers a ligand design strategy for the chromatin regulator BRD4

    PubMed Central

    Dickson, Bradley M.; Ramjan, Zachary H.; Xu, H. Eric

    2016-01-01

    In this communication we introduce an efficient implementation of adaptive biasing that greatly improves the speed of free energy computation in molecular dynamics simulations. We investigated the use of accelerated simulations to inform on compound design using a recently reported and clinically relevant inhibitor of the chromatin regulator BRD4 (bromodomain-containing protein 4). Benchmarking on our local compute cluster, our implementation achieves up to 2.5 times more force calls per day than plumed2. Results of five 1 μs-long simulations are presented, which reveal a conformational switch in the BRD4 inhibitor between a binding competent and incompetent state. Stabilization of the switch led to a −3 kcal/mol improvement of absolute binding free energy. These studies suggest an unexplored ligand design principle and offer new actionable hypotheses for medicinal chemistry efforts against this druggable epigenetic target class. PMID:27782467

  2. Employing static excitation control and tie line reactance to stabilize wind turbine generators

    NASA Technical Reports Server (NTRS)

    Hwang, H. H.; Mozeico, H. V.; Guo, T.

    1978-01-01

    An analytical representation of a wind turbine generator is presented which employs blade pitch angle feedback control. A mathematical model was formulated. With the functioning MOD-0 wind turbine serving as a practical case study, results of computer simulations of the model as applied to the problem of dynamic stability at rated load are also presented. The effect of the tower shadow was included in the input to the system. Different configurations of the drive train, and optimal values of the tie line reactance were used in the simulations. Computer results revealed that a static excitation control system coupled with optimal values of the tie line reactance would effectively reduce oscillations of the power output, without the use of a slip clutch.
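The stabilizing effect reported above can be caricatured with a one-machine swing equation, where a lumped damping coefficient D is a crude, hypothetical stand-in for the combined effect of static excitation control and tie line reactance (all numbers below are illustrative, not MOD-0 parameters):

```python
import numpy as np

def swing(D, H=3.0, Pm=0.8, Pmax=1.5, dt=0.001, t_end=10.0):
    """Integrate delta'' = (Pm - Pmax*sin(delta) - D*delta') / (2H) with
    semi-implicit Euler; return the peak angle deviation over the last 2 s."""
    delta_eq = np.arcsin(Pm / Pmax)
    delta, omega = delta_eq + 0.3, 0.0     # perturbed initial rotor angle
    tail = []
    for k in range(int(t_end / dt)):
        acc = (Pm - Pmax * np.sin(delta) - D * omega) / (2.0 * H)
        omega += acc * dt
        delta += omega * dt
        if k * dt > t_end - 2.0:
            tail.append(abs(delta - delta_eq))
    return max(tail)
```

Raising D damps out the power-angle oscillation, qualitatively mirroring the reduced power-output oscillations the simulations found with excitation control and optimal tie line reactance.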

  3. Numerical simulation of the helium gas spin-up channel performance of the relativity gyroscope

    NASA Technical Reports Server (NTRS)

    Karr, Gerald R.; Edgell, Josephine; Zhang, Burt X.

    1991-01-01

The dependence of the spin-up system efficiency on each geometrical parameter of the spin-up channel and the exhaust passage of the Gravity Probe-B (GPB) is individually investigated. The spin-up model is coded into a computer program which simulates the spin-up process. Numerical results reveal optimal combinations of the geometrical parameters for the ultimate spin-up performance. Comparisons are also made between the numerical results and experimental data. The experimental leakage rate can only be reached when the gap between the channel lip and the rotor surface is increased beyond its physical limit. The computed rotating frequency is roughly twice as high as the measured one, although the spin-up torques match fairly well.

  4. Computational Aerodynamic Simulations of a 1484 ft/sec Tip Speed Quiet High-Speed Fan System Model for Acoustic Methods Assessment and Development

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2014-01-01

    Computational Aerodynamic simulations of a 1484 ft/sec tip speed quiet high-speed fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating points simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, which includes a core duct and a bypass duct that merge upstream of the fan system nozzle. As a result, only fan rotational speed and the system bypass ratio, set by means of a translating nozzle plug, were adjusted in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measurement values. Computed blade row flow fields at all fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the computed flow fields reveals no excessive or critical boundary layer separations or related secondary-flow problems, with the exception of the hub boundary layer at the core duct entrance. At that location a significant flow separation is present. 
However, the region of local flow recirculation extends through a mixing plane, and the particular mixing-plane model used is now known to exaggerate the recirculation. In any case, the flow separation has relatively little impact on the computed rotor and FEGV flow fields.

  5. Biomes computed from simulated climatologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Claussen, M.; Esch, M.

    1994-01-01

The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study was undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and in assessing the effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of the present climate. But there are also major discrepancies, indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to differences in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are found for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation is to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes. 15 refs., 8 figs., 2 tabs.
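A biome model of this kind maps climate variables to vegetation classes. The rule set below is a heavily simplified illustration with made-up thresholds; the actual Prentice et al. model uses additional variables such as growing degree-days and moisture indices:

```python
def biome(mean_temp_c, annual_precip_mm):
    """Toy, loosely Koeppen-like biome rules; thresholds are illustrative only."""
    if mean_temp_c < -5:
        return "tundra"
    if mean_temp_c < 3:
        return "taiga"
    if annual_precip_mm < 250:
        return "desert"
    if mean_temp_c > 20 and annual_precip_mm > 1500:
        return "tropical rain forest"
    return "temperate forest/grassland"

# diagnosing a climate model: compare biomes from observed vs simulated climate
observed = [(-10, 300), (25, 2000), (15, 100)]
simulated = [(-8, 350), (24, 1800), (12, 400)]   # hypothetical model output
mismatches = sum(biome(*o) != biome(*s) for o, s in zip(observed, simulated))
```

Counting grid cells where observed- and simulated-climate biomes disagree is the essence of using the biome model as a climate-model diagnostic.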

  6. Interleaved concatenated codes: new perspectives on approaching the Shannon limit.

    PubMed

    Viterbi, A J; Viterbi, A M; Sindhushayana, N T

    1997-09-02

    The last few years have witnessed a significant decrease in the gap between the Shannon channel capacity limit and what is practically achievable. Progress has resulted from novel extensions of previously known coding techniques involving interleaved concatenated codes. A considerable body of simulation results is now available, supported by an important but limited theoretical basis. This paper presents a computational technique which further ties simulation results to the known theory and reveals a considerable reduction in the complexity required to approach the Shannon limit.
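The "gap to the Shannon limit" can be made concrete for a binary symmetric channel, whose capacity is C = 1 - H2(p) bits per channel use; a sketch with an illustrative crossover probability and code rate:

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

# gap between an example rate-1/3 code and the channel's Shannon limit
p = 0.1
rate = 1.0 / 3.0
gap = bsc_capacity(p) - rate      # capacity ~0.531, so the gap is ~0.198
```

Approaching the limit means finding practically decodable codes whose rate pushes this gap toward zero at a target error probability, which is what interleaved concatenated codes achieved.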

  7. Theories of Spoken Word Recognition Deficits in Aphasia: Evidence from Eye-Tracking and Computational Modeling

    PubMed Central

    Mirman, Daniel; Yee, Eiling; Blumstein, Sheila E.; Magnuson, James S.

    2011-01-01

    We used eye tracking to investigate lexical processing in aphasic participants by examining the fixation time course for rhyme (e.g., carrot – parrot) and cohort (e.g., beaker – beetle) competitors. Broca’s aphasic participants exhibited larger rhyme competition effects than age-matched controls. A reanalysis of previously reported data (Yee, Blumstein, & Sedivy, 2008) confirmed that Wernicke’s aphasic participants exhibited larger cohort competition effects. Individual-level analyses revealed a negative correlation between rhyme and cohort competition effect size across both groups of aphasic participants. Computational model simulations were performed to examine which of several accounts of lexical processing deficits in aphasia might account for the observed effects. Simulation results revealed that slower deactivation of lexical competitors could account for increased cohort competition in Wernicke’s aphasic participants; auditory perceptual impairment could account for increased rhyme competition in Broca's aphasic participants; and a perturbation of a parameter controlling selection among competing alternatives could account for both patterns, as well as the correlation between the effects. In light of these simulation results, we discuss theoretical accounts that have the potential to explain the dynamics of spoken word recognition in aphasia and the possible roles of anterior and posterior brain regions in lexical processing and cognitive control. PMID:21371743

  8. Octree-based Global Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Ramirez-Guzman, L.; Juarez, A.; Bielak, J.; Salazar Monroy, E. F.

    2017-12-01

Seismological research has motivated recent efforts to construct more accurate three-dimensional (3D) velocity models of the Earth, to perform global simulations of wave propagation to validate models, and to study the interaction of seismic fields with 3D structures. However, seismogram computation at global scales has been limited by computational resources, relying primarily on traditional methods such as normal mode summation or two-dimensional numerical methods. We present an octree-based mesh finite element implementation to perform global earthquake simulations with 3D models using topography and bathymetry with a staircase approximation, as modeled by the Carnegie Mellon Finite Element Toolchain Hercules (Tu et al., 2006). To verify the implementation, we compared the synthetic seismograms computed in a spherical earth against waveforms calculated using normal mode summation for the Preliminary Reference Earth Model (PREM) for a point source representation of the 2014 Mw 7.3 Papanoa, Mexico earthquake. We considered a 3 km-thick ocean layer for stations with predominantly oceanic paths. Eigenfrequencies and eigenfunctions were computed for toroidal, radial, and spherical oscillations in the first 20 branches. Simulations are valid at frequencies up to 0.05 Hz. Matching among the waveforms computed by both approaches, especially for long period surface waves, is excellent. Additionally, we modeled the Mw 9.0 Tohoku-Oki earthquake using the USGS finite fault inversion. Topography and bathymetry from ETOPO1 are included in a mesh with more than 3 billion elements, constrained by the computational resources available. We compared estimated velocity and GPS synthetics against observations at regional and teleseismic stations of the Global Seismographic Network and discuss the differences among observations and synthetics, revealing that heterogeneity, particularly in the crust, needs to be considered.
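The octree meshing idea (refine cells only where resolution is needed) can be sketched with a toy point-driven refinement; this is an illustration of the data structure, not the Hercules toolchain itself:

```python
class Octree:
    """Minimal octree: recursively split any cell containing a seed point
    until a maximum depth, as a stand-in for wavelength-driven refinement."""

    def __init__(self, lo, hi, depth=0, max_depth=4):
        self.lo, self.hi = lo, hi
        self.depth, self.max_depth = depth, max_depth
        self.children = []

    def contains(self, p):
        return all(l <= x <= h for l, x, h in zip(self.lo, p, self.hi))

    def refine(self, points):
        inside = [p for p in points if self.contains(p)]
        if not inside or self.depth == self.max_depth:
            return
        mid = [(l + h) / 2 for l, h in zip(self.lo, self.hi)]
        for octant in range(8):           # one child per octant, by bit pattern
            lo = [self.lo[a] if not (octant >> a) & 1 else mid[a] for a in range(3)]
            hi = [mid[a] if not (octant >> a) & 1 else self.hi[a] for a in range(3)]
            child = Octree(lo, hi, self.depth + 1, self.max_depth)
            child.refine(inside)
            self.children.append(child)

    def leaves(self):
        if not self.children:
            return 1
        return sum(c.leaves() for c in self.children)
```

Each split replaces one leaf with eight; refining only along the path to a source point keeps the element count far below that of a uniformly fine mesh.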

  9. Note on the artefacts in SRIM simulation of sputtering

    NASA Astrophysics Data System (ADS)

    Shulga, V. I.

    2018-05-01

The computer simulation program SRIM, unlike other well-known programs (MARLOWE, TRIM.SP, etc.), predicts non-zero values of the sputter yield at glancing ion bombardment of smooth amorphous targets and, for heavy ions, greatly underestimates the sputter yield at normal incidence. To understand the reasons for this, the sputtering of amorphous silicon bombarded with different ions was modeled here using the author's program OKSANA. Most simulations refer to 1 keV Xe ions, and the angles of incidence cover the range from 0° (normal incidence) to almost 90°. It has been shown that SRIM improperly simulates the initial stage of the sputtering process. Some other artefacts in SRIM calculations of sputtering are also revealed and discussed.

  10. Parallel tempering simulation of the three-dimensional Edwards-Anderson model with compact asynchronous multispin coding on GPU

    NASA Astrophysics Data System (ADS)

    Fang, Ye; Feng, Sheng; Tam, Ka-Ming; Yun, Zhifeng; Moreno, Juana; Ramanujam, J.; Jarrell, Mark

    2014-10-01

Monte Carlo simulations of the Ising model play an important role in the field of computational statistical physics, and they have revealed many properties of the model over the past few decades. However, the effect of frustration due to random disorder, in particular the possible spin glass phase, remains a crucial but poorly understood problem. One of the obstacles in the Monte Carlo simulation of random frustrated systems is their long relaxation time, which makes an efficient parallel implementation on state-of-the-art computation platforms highly desirable. The Graphics Processing Unit (GPU) is such a platform that provides an opportunity to significantly enhance the computational performance and thus gain new insight into this problem. In this paper, we present optimization and tuning approaches for the CUDA implementation of the spin glass simulation on GPUs. We discuss the integration of various design alternatives, such as GPU kernel construction with minimal communication, memory tiling, and look-up tables. We present a binary data format, Compact Asynchronous Multispin Coding (CAMSC), which provides an additional 28.4% speedup compared with the traditionally used Asynchronous Multispin Coding (AMSC). Our overall design sustains a performance of 33.5 ps per spin flip attempt for simulating the three-dimensional Edwards-Anderson model with parallel tempering, which significantly improves the performance over existing GPU implementations.
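Parallel tempering itself can be sketched on a small 1D Ising chain (far simpler than the 3D Edwards-Anderson spin glass, and without multispin coding): replicas run at different temperatures and periodically attempt configuration swaps with a Metropolis-style acceptance rule:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 32
betas = np.array([0.3, 0.6, 1.0, 1.5])        # one replica per inverse temperature

def energy(s):
    return -int(np.sum(s * np.roll(s, 1)))    # 1D periodic Ising chain, J = 1

spins = rng.choice([-1, 1], size=(len(betas), N))
e_sum, n_acc = np.zeros(len(betas)), 0

for sweep in range(2000):
    for r, beta in enumerate(betas):          # Metropolis sweep in each replica
        for _ in range(N):
            i = rng.integers(N)
            dE = 2 * spins[r, i] * (spins[r, i - 1] + spins[r, (i + 1) % N])
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[r, i] *= -1
    # replica-exchange attempt between neighbouring temperatures:
    # accept with min(1, exp((beta_j - beta_i) * (E_j - E_i)))
    r = rng.integers(len(betas) - 1)
    d = (betas[r + 1] - betas[r]) * (energy(spins[r + 1]) - energy(spins[r]))
    if d >= 0 or rng.random() < np.exp(d):
        spins[[r, r + 1]] = spins[[r + 1, r]]
    if sweep >= 1000:                          # accumulate energies after burn-in
        e_sum += [energy(s) for s in spins]
        n_acc += 1

e_mean = e_sum / n_acc                         # coldest replica has lowest energy
```

The exchange moves let cold replicas escape local minima by temporarily visiting high temperatures, which is exactly the relaxation bottleneck the GPU implementation is built to accelerate.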

  11. BEM-based simulation of lung respiratory deformation for CT-guided biopsy.

    PubMed

    Chen, Dong; Chen, Weisheng; Huang, Lipeng; Feng, Xuegang; Peters, Terry; Gu, Lixu

    2017-09-01

Accurate and real-time prediction of lung and lung tumor deformation during respiration is an important consideration when performing a peripheral biopsy procedure. However, most existing work has focused on offline whole-lung simulation using 4D image data, which is not applicable to real-time image-guided biopsy with limited image resources. In this paper, we propose a patient-specific biomechanical model based on the boundary element method (BEM), computed from CT images, to estimate the respiratory motion of the local target lesion region, vessel tree and lung surface for real-time biopsy guidance. This approach pre-computes various BEM parameters to meet the requirement for real-time lung motion simulation. The resulting boundary condition at the end-inspiratory phase is obtained using a nonparametric discrete registration with convex optimization, and the simulation of the internal tissue is achieved by applying a tetrahedron-based interpolation method that depends on expert-determined feature points on the vessel tree model. A reference needle is tracked to update the simulated lung motion during biopsy guidance. We evaluate the model by applying it to respiratory motion estimation for ten patients. The average symmetric surface distance (ASSD) and the mean target registration error (TRE) are employed to evaluate the proposed model. Results reveal that it is possible to predict the lung motion with an ASSD of [Formula: see text] mm and a mean TRE of [Formula: see text] mm at largest over the entire respiratory cycle. In the CT-/electromagnetic-guided biopsy experiment, the whole process was assisted by our BEM model, and the final puncture errors in two studies were 3.1 and 2.0 mm, respectively. The experimental results reveal that both the accuracy of the simulation and its real-time performance meet the demands of clinical biopsy guidance.
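Tetrahedron-based interpolation of this kind is usually built on barycentric coordinates: a point inside a tetrahedron gets four weights that sum to one, and nodal values (e.g. displacement vectors at feature points) are blended with those weights. A standard sketch (the paper's exact scheme may differ):

```python
import numpy as np

def barycentric(p, tet):
    """Barycentric coordinates of point p inside a tetrahedron
    given as a 4x3 array of vertex coordinates."""
    T = np.column_stack([tet[1] - tet[0], tet[2] - tet[0], tet[3] - tet[0]])
    l123 = np.linalg.solve(T, p - tet[0])
    return np.concatenate([[1.0 - l123.sum()], l123])

def interpolate(p, tet, nodal_values):
    """Interpolate nodal values (scalars or vectors) at p."""
    return barycentric(p, tet) @ nodal_values

tet = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
centroid = np.array([0.25, 0.25, 0.25])   # all four weights equal 1/4 here
```

All four weights are non-negative exactly when the point lies inside the tetrahedron, which also gives the point-location test needed before interpolating.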

  12. Parametric Study of Decay of Homogeneous Isotropic Turbulence Using Large Eddy Simulation

    NASA Technical Reports Server (NTRS)

    Swanson, R. C.; Rumsey, Christopher L.; Rubinstein, Robert; Balakumar, Ponnampalam; Zang, Thomas A.

    2012-01-01

    Numerical simulations of decaying homogeneous isotropic turbulence are performed with both low-order and high-order spatial discretization schemes. The turbulent Mach and Reynolds numbers for the simulations are 0.2 and 250, respectively. For the low-order schemes we use either second-order central or third-order upwind biased differencing. For higher order approximations we apply weighted essentially non-oscillatory (WENO) schemes, both with linear and nonlinear weights. There are two objectives in this preliminary effort to investigate possible schemes for large eddy simulation (LES). One is to explore the capability of a widely used low-order computational fluid dynamics (CFD) code to perform LES computations. The other is to determine the effect of higher order accuracy (fifth, seventh, and ninth order) achieved with high-order upwind biased WENO-based schemes. Turbulence statistics, such as kinetic energy, dissipation, and skewness, along with the energy spectra from simulations of the decaying turbulence problem are used to assess and compare the various numerical schemes. In addition, results from the best performing schemes are compared with those from a spectral scheme. The effects of grid density, ranging from 32 cubed to 192 cubed, on the computations are also examined. The fifth-order WENO-based scheme is found to be too dissipative, especially on the coarser grids. However, with the seventh-order and ninth-order WENO-based schemes we observe a significant improvement in accuracy relative to the lower order LES schemes, as revealed by the computed peak in the energy dissipation and by the energy spectrum.
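Energy spectra like those used above for scheme assessment come from Fourier transforms of the velocity field; a one-dimensional sketch on a synthetic periodic signal (a real LES diagnostic would shell-average a 3D transform):

```python
import numpy as np

n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
# synthetic periodic "velocity" signal with energy at wavenumbers 4 and 10
u = np.sin(4 * x) + 0.5 * np.sin(10 * x)

uhat = np.fft.rfft(u) / n       # normalized Fourier coefficients
E = np.abs(uhat) ** 2           # modal energy E(k), k = 0 .. n/2
k = np.arange(E.size)
```

An overly dissipative scheme shows up in such a plot as a premature drop-off of E(k) at high wavenumbers relative to a spectral reference solution, which is how the fifth-order WENO scheme's excess dissipation is diagnosed.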

  13. Integrative metabolomics for characterizing unknown low-abundance metabolites by capillary electrophoresis-mass spectrometry with computer simulations.

    PubMed

    Lee, Richard; Ptolemy, Adam S; Niewczas, Liliana; Britz-McKibbin, Philip

    2007-01-15

Characterization of unknown low-abundance metabolites in biological samples is one of the most significant challenges in metabolomic research. In this report, an integrative strategy based on capillary electrophoresis-electrospray ionization-ion trap mass spectrometry (CE-ESI-ITMS) with computer simulations is examined as a multiplexed approach for studying the selective nutrient uptake behavior of E. coli within a complex broth medium. On-line sample preconcentration with desalting by CE-ESI-ITMS was performed directly without off-line sample pretreatment in order to improve detector sensitivity over 50-fold for cationic metabolites with nanomolar detection limits. The migration behavior of charged metabolites was also modeled in CE as a qualitative tool to support MS characterization based on two fundamental analyte physicochemical properties, namely, absolute mobility (muo) and acid dissociation constant (pKa). Computer simulations using Simul 5.0 were used to better understand the dynamics of analyte electromigration, as well as aiding de novo identification of unknown nutrients. There was excellent agreement between computer-simulated and experimental electropherograms for several classes of cationic metabolites as reflected by their relative migration times with an average error of <2.0%. Our studies revealed differential uptake of specific amino acids and nucleoside nutrients associated with distinct stages of bacterial growth. Herein, we demonstrate that CE can serve as an effective preconcentrator, desalter, and separator prior to ESI-MS, while providing additional qualitative information for unambiguous identification among isobaric and isomeric metabolites. The proposed strategy is particularly relevant for characterizing unknown yet biologically relevant metabolites that are not readily synthesized or commercially available.
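The migration modeling rests on two textbook relations: the effective mobility of a weak electrolyte from its absolute mobility and pKa, and the CE migration time from mobilities and instrument geometry. A sketch with hypothetical numbers (not values from the paper):

```python
def effective_mobility(mu0, pKa, pH):
    """Effective electrophoretic mobility of a weak base (BH+ <-> B + H+):
    only the protonated fraction carries charge, via Henderson-Hasselbalch."""
    frac_protonated = 1.0 / (1.0 + 10 ** (pH - pKa))
    return mu0 * frac_protonated

def migration_time(mu_eff, mu_eof, L_d, L_t, V):
    """CE migration time: t = L_d * L_t / ((mu_eff + mu_eof) * V), with
    L_d the length to the detector, L_t the total capillary length."""
    return L_d * L_t / ((mu_eff + mu_eof) * V)

# hypothetical amine: mu0 = 3e-8 m^2/(V s), pKa 9, run at pH 7
mu = effective_mobility(3e-8, 9.0, 7.0)
t = migration_time(mu, 5e-8, 0.5, 0.6, 25000.0)
```

Ranking candidate structures by predicted relative migration time against the measured electropherogram is the qualitative filter the simulations provide alongside the MS data.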

  14. Application of classical simulations for the computation of vibrational properties of free molecules.

    PubMed

    Tikhonov, Denis S; Sharapa, Dmitry I; Schwabedissen, Jan; Rybkin, Vladimir V

    2016-10-12

In this study, we investigate the ability of classical molecular dynamics (MD) and Monte Carlo (MC) simulations to model intramolecular vibrational motion. These simulations were used to compute thermally averaged geometrical structures and infrared vibrational intensities for a benchmark set previously studied by gas electron diffraction (GED): CS2, benzene, chloromethylthiocyanate, pyrazinamide and 9,12-I2-1,2-closo-C2B10H10. The MD sampling of NVT ensembles was performed using chains of Nose-Hoover thermostats (NH) as well as the generalized Langevin equation (GLE) thermostat. The performance of the theoretical models based on the classical MD and MC simulations was compared with the experimental data and also with alternative computational techniques: a conventional approach based on the Taylor expansion of the potential energy surface, path-integral MD, and MD with a quantum thermal bath (QTB) based on the GLE. A straightforward application of the classical simulations resulted, as expected, in poor accuracy of the calculated observables due to the complete neglect of quantum effects. However, the introduction of a posteriori quantum corrections significantly improved the situation. The application of these corrections to MD simulations of systems with large-amplitude motions was demonstrated for chloromethylthiocyanate. The comparison of the theoretical vibrational spectra revealed that the GLE thermostat used in this work is not applicable for this purpose. On the other hand, the NH chains yielded reasonably good results.
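One common family of a posteriori quantum corrections multiplies the classical spectrum by a frequency-dependent factor; the harmonic variant is shown below as an illustration (several alternatives exist, and the abstract does not specify which form the authors used):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J s
KB = 1.380649e-23        # Boltzmann constant, J / K
C_CM = 2.99792458e10     # speed of light, cm / s

def harmonic_correction(omega, T):
    """Harmonic quantum correction factor Q(w) = x / (1 - exp(-x)) with
    x = hbar*w / (kB*T), applied multiplicatively to a classical spectrum."""
    x = HBAR * omega / (KB * T)
    return x / (1.0 - math.exp(-x))

# a 1000 cm^-1 vibrational band at 300 K: classical intensity is boosted ~4.8x
omega = 2.0 * math.pi * C_CM * 1000.0
Q = harmonic_correction(omega, 300.0)
```

The factor tends to 1 at low frequency (classical limit) and grows for modes with hbar*w >> kB*T, where classical sampling most severely underpopulates the quantum ground-state motion.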

  15. A Structured-Inquiry Approach to Teaching Neurophysiology Using Computer Simulation

    PubMed Central

    Crisp, Kevin M.

    2012-01-01

    Computer simulation is a valuable tool for teaching the fundamentals of neurophysiology in undergraduate laboratories where time and equipment limitations restrict the amount of course content that can be delivered through hands-on interaction. However, students often find such exercises to be tedious and unstimulating. In an effort to engage students in the use of computational modeling while developing a deeper understanding of neurophysiology, an attempt was made to use an educational neurosimulation environment as the basis for a novel, inquiry-based research project. During the semester, students in the class wrote a research proposal, used the Neurodynamix II simulator to generate a large data set, analyzed their modeling results statistically, and presented their findings at the Midbrains Neuroscience Consortium undergraduate poster session. Learning was assessed in the form of a series of short term papers and two 10-min in-class writing responses to the open-ended question, “How do ion channels influence neuronal firing?”, which they completed on weeks 6 and 15 of the semester. Students’ answers to this question showed a deeper understanding of neuronal excitability after the project; their term papers revealed evidence of critical thinking about computational modeling and neuronal excitability. Suggestions for the adaptation of this structured-inquiry approach into shorter term lab experiences are discussed. PMID:23494064

  16. Distribution of recombination hotspots in the human genome--a comparison of computer simulations with real data.

    PubMed

    Mackiewicz, Dorota; de Oliveira, Paulo Murilo Castro; Moss de Oliveira, Suzana; Cebrat, Stanisław

    2013-01-01

Recombination is the main cause of genetic diversity. Thus, errors in this process can lead to chromosomal abnormalities. Recombination events are confined to narrow chromosome regions called hotspots in which characteristic DNA motifs are found. Genomic analyses have shown that both recombination hotspots and DNA motifs are distributed unevenly along human chromosomes and are much more frequent in the subtelomeric regions of chromosomes than in their central parts. Clusters of motifs roughly follow the distribution of recombination hotspots whereas single motifs show a negative correlation with the hotspot distribution. To model the phenomena related to recombination, we carried out computer Monte Carlo simulations of genome evolution. Computer simulations generated uneven distribution of hotspots with their domination in the subtelomeric regions of chromosomes. They also revealed that purifying selection eliminating defective alleles is strong enough to cause such hotspot distribution. After a sufficiently long simulation time, the structure of chromosomes reached a dynamic equilibrium, in which the number and global distribution of both hotspots and defective alleles remained statistically unchanged, while their precise positions were shifted. This resembles the dynamic structure of human and chimpanzee genomes, where hotspots change their exact locations but the global distributions of recombination events are very similar.

  17. Comparative simulation study of chemical synthesis of functional DADNE material.

    PubMed

    Liu, Min Hsien; Liu, Chuan Wen

    2017-01-01

    Amorphous molecular simulation to model the reaction species in the synthesis of chemically inert and energetic 1,1-diamino-2,2-dinitroethene (DADNE) explosive material was performed in this work. Nitromethane was selected as the starting reactant to undergo halogenation, nitration, deprotonation, intermolecular condensation, and dehydration to produce the target DADNE product. The Materials Studio (MS) Forcite program allowed fast energy calculations and reliable geometric optimization of all aqueous molecular reaction systems (0.1-0.5 M) at 283 K and 298 K. The MS Forcite-computed and Gaussian polarizable continuum model (PCM)-computed results were analyzed and compared in order to explore feasible reaction pathways under suitable conditions for the synthesis of DADNE. Through theoretical simulation, the findings revealed that synthesis was possible, and a total energy barrier of 449.6 kJ mol⁻¹ needed to be overcome in order to carry out the reaction according to the MS calculation of the energy barriers at each stage at 283 K, as shown by the reaction profiles. Local analysis of intermolecular interaction, together with calculation of the stabilization energy of each reaction system, provided information that can be used as a reference regarding molecular integrated stability. Graphical Abstract Materials Studio software has been suggested for the computation and simulation of DADNE synthesis.

  19. WE-C-217BCD-08: Rapid Monte Carlo Simulations of DQE(f) of Scintillator-Based Detectors.

    PubMed

    Star-Lack, J; Abel, E; Constantin, D; Fahrig, R; Sun, M

    2012-06-01

    Monte Carlo simulations of DQE(f) can greatly aid in the design of scintillator-based detectors by helping optimize key parameters including scintillator material and thickness, pixel size, surface finish, and septa reflectivity. However, the additional optical transport significantly increases simulation times, necessitating a large number of parallel processors to adequately explore the parameter space. To address this limitation, we have optimized the DQE(f) algorithm, reducing simulation times per design iteration to 10 minutes on a single CPU. DQE(f) is proportional to the ratio MTF(f)^2/NPS(f). The LSF-MTF simulation uses a slanted line source and is rapidly performed with relatively few gammas launched. However, the conventional NPS simulation for standard radiation exposure levels requires the acquisition of multiple flood fields (nRun), each requiring billions of input gamma photons (nGamma), many of which will scintillate, thereby producing thousands of optical photons (nOpt) per deposited MeV. The resulting execution time is proportional to the product nRun x nGamma x nOpt. In this investigation, we revisit the theoretical derivation of DQE(f), and reveal significant computation-time savings through the optimization of nRun, nGamma, and nOpt. Using GEANT4, we determine optimal values for these three variables for a GOS scintillator-amorphous silicon portal imager. Both isotropic and Mie optical scattering processes were modeled. Simulation results were validated against the literature. We found that, depending on the radiative and optical attenuation properties of the scintillator, the NPS can be accurately computed using values for nGamma below 1000, and values for nOpt below 500/MeV. nRun should remain above 200. Using these parameters, typical computation times for a complete NPS ranged from 2-10 minutes on a single CPU.
The number of launched particles and corresponding execution times for a DQE simulation can be dramatically reduced, allowing for accurate computation with modest computer hardware. NIH R01 CA138426. Several authors work for Varian Medical Systems. © 2012 American Association of Physicists in Medicine.
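
    For reference, the ratio above can be evaluated directly once the MTF and NPS curves are tabulated. A minimal NumPy sketch using the common normalization DQE(f) = S^2·MTF(f)^2/(q·NPS(f)), where S is the mean signal and q the photon fluence; the function name and arguments are illustrative, not taken from the authors' GEANT4 code:

```python
import numpy as np

def dqe(mtf, nps, mean_signal, photon_fluence):
    """Detective quantum efficiency from a presampled MTF and NPS.

    Uses the standard normalization DQE(f) = S^2 * MTF(f)^2 / (q * NPS(f)),
    with S the mean signal and q the photon fluence; both curves must be
    sampled on the same frequency axis."""
    mtf = np.asarray(mtf, dtype=float)
    nps = np.asarray(nps, dtype=float)
    return mean_signal**2 * mtf**2 / (photon_fluence * nps)

# Sanity check: an ideal detector (MTF = 1, NPS = S^2/q) has DQE = 1 at all f.
f = np.linspace(0.0, 1.0, 5)
S, q = 100.0, 1e4
mtf = np.ones_like(f)
nps = np.full_like(f, S**2 / q)
print(dqe(mtf, nps, S, q))  # → [1. 1. 1. 1. 1.]
```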

  20. Thermal-hydraulics modeling for prototype testing of the W7-X high heat flux scraper element

    DOE PAGES

    Clark, Emily; Lumsdaine, Arnold; Boscary, Jean; ...

    2017-07-28

    The long-pulse operation of the Wendelstein 7-X (W7-X) stellarator experiment is scheduled to begin in 2020. This operational phase will be equipped with water-cooled plasma facing components to allow for longer pulse durations. Certain simulated plasma scenarios have been shown to produce heat fluxes that surpass the technological limits on the edges of the divertor target elements during steady-state operation. In order to reduce the heat load on the target elements, the addition of a “scraper element” (SE) is under investigation. The SE is composed of 24 water-cooled carbon fiber reinforced carbon composite monoblock units. Multiple full-scale prototypes have been tested in the GLADIS high heat flux test facility. Previous computational studies revealed discrepancies between the simulations and experimental measurements. In this work, single-phase thermal-hydraulics modeling was performed in ANSYS CFX to identify potential causes for such discrepancies. Possible explanations investigated were the effects of a non-uniform thermal contact resistance and a potential misalignment of the monoblock fibers. While the difference between the experimental and computational results was not resolved by a non-uniform thermal contact resistance, the computational results provided insight into the potential performance of a W7-X monoblock unit. Circumferential temperature distributions highlighted the expected boiling regions of such a unit. Finally, simulations revealed that modest angles of fiber misalignment in the monoblocks result in asymmetries at the unit edges and provide temperature differences similar to the experimental results.

  2. Simulating physiological interactions in a hybrid system of mathematical models.

    PubMed

    Kretschmer, Jörn; Haunsberger, Thomas; Drost, Erick; Koch, Edmund; Möller, Knut

    2014-12-01

    Mathematical models can be deployed to simulate physiological processes of the human organism. Exploiting these simulations, reactions of a patient to changes in the therapy regime can be predicted. Based on these predictions, medical decision support systems (MDSS) can help in optimizing medical therapy. An MDSS designed to support mechanical ventilation in critically ill patients should not only consider respiratory mechanics but should also consider other systems of the human organism such as gas exchange or blood circulation. A specially designed framework allows combining three model families (respiratory mechanics, cardiovascular dynamics and gas exchange) to predict the outcome of a therapy setting. Elements of the three model families are dynamically combined to form a complex model system with interacting submodels. Tests revealed that complex model combinations are not computationally feasible. In most patients, cardiovascular physiology could be simulated by simplified models decreasing computational costs. Thus, a simplified cardiovascular model that is able to reproduce basic physiological behavior is introduced. This model purely consists of difference equations and does not require special algorithms to be solved numerically. The model is based on a beat-to-beat model which has been extended to react to intrathoracic pressure levels that are present during mechanical ventilation. The introduced reaction to intrathoracic pressure levels as found during mechanical ventilation has been tuned to mimic the behavior of a complex 19-compartment model. Tests revealed that the model is able to represent general system behavior comparable to the 19-compartment model closely. Blood pressures were calculated with a maximum deviation of 1.8 % in systolic pressure and 3.5 % in diastolic pressure, leading to a simulation error of 0.3 % in cardiac output. 
The gas exchange submodel, which reacts to changes in cardiac output, showed a resulting deviation of less than 0.1 %. Therefore, the proposed model is usable in combinations where the cardiovascular simulation does not have to be detailed. Computing costs were decreased dramatically, by a factor of 186, compared to a model combination employing the 19-compartment model.
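
    A beat-to-beat circulation model built purely from difference equations can be sketched as follows. This toy two-parameter Windkessel is hypothetical (equations and parameter values are illustrative, not the authors' simplified model or the 19-compartment model):

```python
import math

def beat_to_beat(n_beats, sv=70.0, C=1.5, R=1.0, T=0.8, p0=80.0):
    """Toy beat-to-beat arterial Windkessel as pure difference equations.

        p_sys[n]   = p_dia[n] + sv / C             # systolic upstroke
        p_dia[n+1] = p_sys[n] * exp(-T / (R * C))  # diastolic decay

    sv: stroke volume (mL), C: arterial compliance (mL/mmHg),
    R: peripheral resistance (mmHg*s/mL), T: beat period (s).
    Returns a list of (p_sys, p_dia) pairs, one per beat."""
    p_dia, beats = p0, []
    for _ in range(n_beats):
        p_sys = p_dia + sv / C
        beats.append((p_sys, p_dia))
        p_dia = p_sys * math.exp(-T / (R * C))
    return beats

# The iteration contracts toward a physiologically plausible fixed point.
p_sys, p_dia = beat_to_beat(50)[-1]
print(f"steady state ~ {p_sys:.0f}/{p_dia:.0f} mmHg")
```

    Because the map is a contraction (factor exp(-T/RC) per beat), the pressures converge to a fixed point regardless of the starting value p0.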

  3. A physics-motivated Centroidal Voronoi Particle domain decomposition method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Lin, E-mail: lin.fu@tum.de; Hu, Xiangyu Y., E-mail: xiangyu.hu@tum.de; Adams, Nikolaus A., E-mail: nikolaus.adams@tum.de

    2017-04-15

    In this paper, we propose a novel domain decomposition method for large-scale simulations in continuum mechanics by merging the concepts of Centroidal Voronoi Tessellation (CVT) and Voronoi Particle dynamics (VP). The CVT is introduced to achieve a high-level compactness of the partitioning subdomains by the Lloyd algorithm, which monotonically decreases the CVT energy. The number of computational elements between neighboring partitioning subdomains, which scales the communication effort for parallel simulations, is optimized implicitly as the generated partitioning subdomains are convex and simply connected with small aspect ratios. Moreover, Voronoi Particle dynamics employing a physical analogy with a tailored equation of state is developed, which relaxes the particle system towards the target partition with good load balance. Since the equilibrium is computed by an iterative approach, the partitioning subdomains exhibit locality and the incremental property. Numerical experiments reveal that the proposed Centroidal Voronoi Particle (CVP) based algorithm produces high-quality partitioning with high efficiency, independently of computational-element types. Thus it can be used for a wide range of applications in computational science and engineering.

  4. Multisensor surveillance data augmentation and prediction with optical multipath signal processing

    NASA Astrophysics Data System (ADS)

    Bush, G. T., III

    1980-12-01

    The spatial characteristics of an oil spill on the high seas are examined in the interest of determining whether linear-shift-invariant data processing implemented on an optical computer would be a useful tool in analyzing spill behavior. Simulations were performed on a digital computer using data obtained from a 25,000 gallon spill of soy bean oil in the open ocean. Marked changes occurred in the observed spatial frequencies when the oil spill was encountered. An optical detector may readily be developed to sound an alarm automatically when this happens. The average extent of oil spread between sequential observations was quantified by a simulation of non-holographic optical computation. Because a zero crossover was available in this computation, it may be possible to construct a system to measure automatically the amount of spread. Oil images were subjected to deconvolutional filtering to reveal the force field which acted upon the oil to cause spreading. Some features of spill-size prediction were observed. Calculations based on two sequential photos produced an image which exhibited characteristics of the third photo in that sequence.

  6. An algorithm for fast elastic wave simulation using a vectorized finite difference operator

    NASA Astrophysics Data System (ADS)

    Malkoti, Ajay; Vedanti, Nimisha; Tiwari, Ram Krishna

    2018-07-01

    Modern geophysical imaging techniques exploit the full wavefield information which can be simulated numerically. These numerical simulations are computationally expensive due to several factors, such as a large number of time steps and nodes, big size of the derivative stencil and huge model size. Besides these constraints, it is also important to reformulate the numerical derivative operator for improved efficiency. In this paper, we have introduced a vectorized derivative operator over the staggered grid with shifted coordinate systems. The operator increases the efficiency of simulation by exploiting the fact that each variable can be represented in the form of a matrix. This operator allows updating all nodes of a variable defined on the staggered grid, in a manner similar to the collocated grid scheme, thereby reducing the computational run-time considerably. Here we demonstrate an application of this operator to simulate seismic wave propagation in elastic media (Marmousi model), by discretizing the equations on a staggered grid. We have compared the performance of this operator in three programming languages, which reveals that it can increase the execution speed by a factor of at least 2-3 for Fortran and MATLAB, and by nearly 100 for Python. We have further carried out various tests in MATLAB to analyze the effect of model size and the number of time steps on total simulation run-time. We find that there is an additional, though small, computational overhead for each step, which depends on the total number of time steps used in the simulation. A MATLAB code package, 'FDwave', for the proposed simulation scheme is available upon request.
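
    The idea of updating all staggered-grid nodes at once through whole-array operations can be illustrated with a one-dimensional sketch (a hypothetical minimal example, not the FDwave operator itself):

```python
import numpy as np

def dx_staggered(f, dx):
    """Second-order staggered-grid derivative, fully vectorized.

    A single whole-array slice updates every interior node at once instead
    of looping over grid points; d[i] approximates f' at the half-node
    x_i + dx/2."""
    d = np.empty_like(f)
    d[:-1] = (f[1:] - f[:-1]) / dx   # all half-node derivatives in one shot
    d[-1] = d[-2]                    # one-sided copy at the boundary
    return d

x = np.linspace(0.0, 2.0 * np.pi, 1001)
dx = x[1] - x[0]
# d/dx sin(x) sampled at staggered points is cos(x + dx/2) to O(dx^2).
err = np.abs(dx_staggered(np.sin(x), dx)[:-1] - np.cos(x[:-1] + dx / 2)).max()
print(f"max error: {err:.2e}")
```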

  7. Can one trust quantum simulators?

    PubMed

    Hauke, Philipp; Cucchietti, Fernando M; Tagliacozzo, Luca; Deutsch, Ivan; Lewenstein, Maciej

    2012-08-01

    Various fundamental phenomena of strongly correlated quantum systems such as high-Tc superconductivity, the fractional quantum-Hall effect and quark confinement are still awaiting a universally accepted explanation. The main obstacle is the computational complexity of solving even the most simplified theoretical models which are designed to capture the relevant quantum correlations of the many-body system of interest. In his seminal 1982 paper (Feynman 1982 Int. J. Theor. Phys. 21 467), Richard Feynman suggested that such models might be solved by 'simulation' with a new type of computer whose constituent parts are effectively governed by a desired quantum many-body dynamics. Measurements on this engineered machine, now known as a 'quantum simulator,' would reveal some unknown or difficult to compute properties of a model of interest. We argue that a useful quantum simulator must satisfy four conditions: relevance, controllability, reliability and efficiency. We review the current state of the art of digital and analog quantum simulators. Whereas so far the majority of the focus, both theoretically and experimentally, has been on controllability of relevant models, we emphasize here the need for a careful analysis of reliability and efficiency in the presence of imperfections. We discuss how disorder and noise can impact these conditions, and illustrate our concerns with novel numerical simulations of a paradigmatic example: a disordered quantum spin chain governed by the Ising model in a transverse magnetic field. We find that disorder can decrease the reliability of an analog quantum simulator of this model, although large errors in local observables are introduced only for strong levels of disorder. We conclude that the answer to the question 'Can we trust quantum simulators?' is … to some extent.
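
    The paradigmatic example mentioned above, an Ising chain in a transverse field with site-dependent (disordered) fields, can be diagonalized exactly for a handful of spins. A minimal dense-matrix sketch with illustrative parameters, not the authors' simulation:

```python
import numpy as np

def ising_hamiltonian(n, J, h):
    """Dense Hamiltonian of an open transverse-field Ising chain:
        H = -J * sum_i sz_i sz_{i+1} - sum_i h[i] * sx_i
    Site-dependent fields h[i] model quenched disorder."""
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    sz = np.array([[1.0, 0.0], [0.0, -1.0]])
    I2 = np.eye(2)

    def op(single, site):
        # tensor product placing `single` at `site`, identity elsewhere
        out = np.array([[1.0]])
        for k in range(n):
            out = np.kron(out, single if k == site else I2)
        return out

    H = np.zeros((2**n, 2**n))
    for i in range(n - 1):
        H -= J * op(sz, i) @ op(sz, i + 1)
    for i in range(n):
        H -= h[i] * op(sx, i)
    return H

rng = np.random.default_rng(7)
n = 6
H_clean = ising_hamiltonian(n, J=1.0, h=np.ones(n))
H_dis = ising_hamiltonian(n, J=1.0, h=1.0 + 0.5 * rng.standard_normal(n))
e0 = np.linalg.eigvalsh(H_clean)[0]
e0_dis = np.linalg.eigvalsh(H_dis)[0]
print(f"ground energy per site: clean {e0 / n:.3f}, disordered {e0_dis / n:.3f}")
```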

  8. A framework for stochastic simulations and visualization of biological electron-transfer dynamics

    NASA Astrophysics Data System (ADS)

    Nakano, C. Masato; Byun, Hye Suk; Ma, Heng; Wei, Tao; El-Naggar, Mohamed Y.

    2015-08-01

    Electron transfer (ET) dictates a wide variety of energy-conversion processes in biological systems. Visualizing ET dynamics could provide key insight into understanding and possibly controlling these processes. We present a computational framework named VizBET to visualize biological ET dynamics, using an outer-membrane Mtr-Omc cytochrome complex in Shewanella oneidensis MR-1 as an example. Starting from X-ray crystal structures of the constituent cytochromes, molecular dynamics simulations are combined with homology modeling, protein docking, and binding free energy computations to sample the configuration of the complex as well as the change of the free energy associated with ET. This information, along with quantum-mechanical calculations of the electronic coupling, provides inputs to kinetic Monte Carlo (KMC) simulations of ET dynamics in a network of heme groups within the complex. Visualization of the KMC simulation results has been implemented as a plugin to the Visual Molecular Dynamics (VMD) software. VizBET has been used to reveal the nature of ET dynamics associated with novel nonequilibrium phase transitions in a candidate configuration of the Mtr-Omc complex due to electron-electron interactions.
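
    The kinetic Monte Carlo stage of such a framework reduces to repeatedly drawing an exponential waiting time and choosing a hop in proportion to its rate. A toy sketch with assumed rates (not the computed Mtr-Omc couplings):

```python
import numpy as np

def kmc_first_passage(rates, start, target, rng):
    """Kinetic Monte Carlo walk of one electron over a site network (sketch).

    `rates[i][j]` is the hopping rate from site i to site j (assumed
    values). Returns the first-passage time from `start` to `target`."""
    t, site = 0.0, start
    while site != target:
        out = rates[site]
        total = sum(out.values())
        t += rng.exponential(1.0 / total)   # exponential waiting time
        r, acc = rng.random() * total, 0.0
        for nxt, k in out.items():          # pick the hop proportional to k
            acc += k
            if r < acc:
                site = nxt
                break
    return t

rng = np.random.default_rng(1)
# three-site toy chain with forward-biased rates (arbitrary units)
rates = {0: {1: 2.0}, 1: {0: 1.0, 2: 2.0}}
times = [kmc_first_passage(rates, 0, 2, rng) for _ in range(2000)]
print(f"mean first-passage time ~ {np.mean(times):.2f}")
```

    For this chain the mean first-passage time has the closed form 1/2 + 1/3 + (1/3)T0, giving T0 = 1.25, which the sampled mean approaches.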

  9. Ab initio molecular dynamics simulation of LiBr association in water

    NASA Astrophysics Data System (ADS)

    Izvekov, Sergei; Philpott, Michael R.

    2000-12-01

    A computationally economical scheme which unifies the density functional description of an ionic solute and the classical description of a solvent was developed. The density functional part of the scheme comprises Car-Parrinello and related formalisms. The substantial saving in the computer time is achieved by performing the ab initio molecular dynamics of the solute electronic structure in a relatively small basis set constructed from lowest energy Kohn-Sham orbitals calculated for a single anion in vacuum, instead of using plane wave basis. The methodology permits simulation of an ionic solution for longer time scales while keeping accuracy in the prediction of the solute electronic structure. As an example the association of the Li+-Br- ion-pair system in water is studied. The results of the combined molecular dynamics simulation are compared with that obtained from the classical simulation with ion-ion interaction described by the pair potential of Born-Huggins-Mayer type. The comparison reveals an important role played by the polarization of the Br- ion in the dynamics of ion pair association.
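
    The Born-Huggins-Mayer form mentioned above combines Born repulsion with r^-6 and r^-8 dispersion terms. A sketch with purely illustrative parameters (not fitted Li+-Br- values); the Coulomb term is handled separately in practice:

```python
import numpy as np

def born_huggins_mayer(r, A, B, sigma, C, D):
    """Short-range Born-Huggins-Mayer ion-ion pair potential:
        V(r) = A*exp(B*(sigma - r)) - C/r**6 - D/r**8
    Exponential Born repulsion plus dipole-dipole and dipole-quadrupole
    dispersion; parameters here are illustrative only."""
    r = np.asarray(r, dtype=float)
    return A * np.exp(B * (sigma - r)) - C / r**6 - D / r**8

r = np.linspace(2.0, 8.0, 61)   # separation grid (angstrom)
v = born_huggins_mayer(r, A=1.0, B=3.15, sigma=2.34, C=87.0, D=130.0)
print(f"potential minimum near r = {r[v.argmin()]:.1f} A")
```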

  10. ASIS v1.0: an adaptive solver for the simulation of atmospheric chemistry

    NASA Astrophysics Data System (ADS)

    Cariolle, Daniel; Moinat, Philippe; Teyssèdre, Hubert; Giraud, Luc; Josse, Béatrice; Lefèvre, Franck

    2017-04-01

    This article reports on the development and tests of the adaptive semi-implicit scheme (ASIS) solver for the simulation of atmospheric chemistry. To solve the ordinary differential equation systems associated with the time evolution of the species concentrations, ASIS adopts a one-step linearized implicit scheme with specific treatments of the Jacobian of the chemical fluxes. It conserves mass and has a time-stepping module to control the accuracy of the numerical solution. In idealized box-model simulations, ASIS gives results similar to the higher-order implicit schemes derived from the Rosenbrock's and Gear's methods and requires less computation and run time at the moderate precision required for atmospheric applications. When implemented in the MOCAGE chemical transport model and the Laboratoire de Météorologie Dynamique Mars general circulation model, the ASIS solver performs well and reveals weaknesses and limitations of the original semi-implicit solvers used by these two models. ASIS can be easily adapted to various chemical schemes and further developments are foreseen to increase its computational efficiency, and to include the computation of the concentrations of the species in aqueous-phase in addition to gas-phase chemistry.
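
    The one-step linearized implicit idea can be illustrated on a scalar stiff problem: a single linear solve with the Jacobian replaces the Newton iteration of a fully implicit method. A generic sketch, not the ASIS code:

```python
import numpy as np

def linearized_implicit_step(y, dt, f, jac):
    """One linearized implicit (Rosenbrock-Euler) step:
        y_{n+1} = y_n + dt * (I - dt*J(y_n))^{-1} f(y_n)
    One linear solve per step, which suits stiff chemical kinetics."""
    J = jac(y)
    I = np.eye(len(y))
    return y + dt * np.linalg.solve(I - dt * J, f(y))

# Stiff linear test problem y' = -1000*y; dt is far above the explicit limit.
f = lambda y: -1000.0 * y
jac = lambda y: np.array([[-1000.0]])
y = np.array([1.0])
for _ in range(10):
    y = linearized_implicit_step(y, 0.01, f, jac)
print(y)   # decays toward zero with no blow-up
```

    Each step multiplies y by 1/11 here, so the scheme stays stable at a step size where explicit Euler would diverge violently.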

  11. A Monte Carlo simulation and setup optimization of output efficiency to PGNAA thermal neutron using 252Cf neutrons

    NASA Astrophysics Data System (ADS)

    Zhang, Jin-Zhao; Tuo, Xian-Guo

    2014-07-01

    We present the design and optimization of a prompt γ-ray neutron activation analysis (PGNAA) thermal neutron output setup based on Monte Carlo simulations using the MCNP5 computer code. In these simulations, the moderator materials, reflective materials, and structure of the PGNAA 252Cf thermal neutron output setup are optimized. The simulation results reveal that a thin layer of paraffin combined with a thick layer of heavy water provides the best moderating effect for the 252Cf neutron spectrum. Our new design shows a significantly improved performance: the thermal neutron flux and flux rate are increased by factors of 3.02 and 3.27, respectively, compared with the conventional neutron source design.

  12. Representation of Precipitation in a Decade-long Continental-Scale Convection-Resolving Climate Simulation

    NASA Astrophysics Data System (ADS)

    Leutwyler, D.; Fuhrer, O.; Ban, N.; Lapillonne, X.; Lüthi, D.; Schar, C.

    2017-12-01

    The representation of moist convection in climate models represents a major challenge, due to the small scales involved. Regional climate simulations using horizontal resolutions of O(1 km) allow deep convection to be explicitly resolved, leading to an improved representation of the water cycle. However, due to their extremely demanding computational requirements, they have so far been limited to short simulations and/or small computational domains. A new version of the Consortium for Small-Scale Modeling weather and climate model (COSMO) is capable of exploiting new supercomputer architectures employing GPU accelerators, and allows convection-resolving climate simulations on computational domains spanning continents and time periods up to one decade. We present results from a decade-long, convection-resolving climate simulation on a European-scale computational domain. The simulation has a grid spacing of 2.2 km, 1536x1536x60 grid points, covers the period 1999-2008, and is driven by the ERA-Interim reanalysis. Specifically, we present an evaluation of hourly rainfall using a wide range of data sets, including several rain-gauge networks and a remotely sensed lightning data set. Substantial improvements are found in terms of the diurnal cycles of precipitation amount, wet-hour frequency and the all-hour 99th percentile. However, the results also reveal substantial differences between regions with and without strong orographic forcing. Furthermore, we present an index for deep-convective activity based on the statistics of vertical motion. Comparison of the index with lightning data shows that the convection-resolving climate simulations are able to reproduce important features of the annual cycle of deep convection in Europe. Leutwyler D., D. Lüthi, N. Ban, O. Fuhrer, and C. Schär (2017): Evaluation of the Convection-Resolving Climate Modeling Approach on Continental Scales, J. Geophys. Res. Atmos., 122, doi:10.1002/2016JD026013.
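
    The three rainfall metrics evaluated above (diurnal cycle of amount, wet-hour frequency, all-hour 99th percentile) are straightforward to compute from an hourly series. A sketch with an assumed 0.1 mm/h wet threshold:

```python
import numpy as np

def diurnal_stats(hourly, wet_threshold=0.1):
    """Diurnal-cycle metrics from an hourly precipitation series (sketch;
    the wet-hour threshold is an assumption, not the study's value).

    `hourly` must cover whole days. Returns the hour-of-day mean amount,
    the hour-of-day wet-hour frequency, and the all-hour 99th percentile."""
    h = np.asarray(hourly, dtype=float).reshape(-1, 24)   # days x hours
    mean_amount = h.mean(axis=0)
    wet_freq = (h >= wet_threshold).mean(axis=0)
    return mean_amount, wet_freq, np.percentile(h, 99)

# Synthetic record: 10 days with 1 mm/h falling only at 15:00 local time,
# mimicking an afternoon convective peak.
rec = np.zeros((10, 24))
rec[:, 15] = 1.0
mean_amt, wet_freq, p99 = diurnal_stats(rec.ravel())
print(mean_amt[15], wet_freq[15], p99)   # → 1.0 1.0 1.0
```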

  13. Interleaved concatenated codes: New perspectives on approaching the Shannon limit

    PubMed Central

    Viterbi, A. J.; Viterbi, A. M.; Sindhushayana, N. T.

    1997-01-01

    The last few years have witnessed a significant decrease in the gap between the Shannon channel capacity limit and what is practically achievable. Progress has resulted from novel extensions of previously known coding techniques involving interleaved concatenated codes. A considerable body of simulation results is now available, supported by an important but limited theoretical basis. This paper presents a computational technique which further ties simulation results to the known theory and reveals a considerable reduction in the complexity required to approach the Shannon limit. PMID:11038568
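
    The interleaving step underlying such concatenated schemes can be illustrated with a simple row-in/column-out block interleaver:

```python
def block_interleave(bits, rows, cols):
    """Row-in/column-out block interleaver (sketch). Writing by rows and
    reading by columns disperses a burst of channel errors across many
    codewords of the inner code."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(bits, rows, cols):
    # The inverse permutation is interleaving with swapped dimensions.
    return block_interleave(bits, cols, rows)

data = list(range(12))
tx = block_interleave(data, rows=3, cols=4)
print(tx)   # → [0, 4, 8, 1, 5, 9, 2, 6, 10, 3, 7, 11]
assert block_deinterleave(tx, 3, 4) == data
```

    A burst of three consecutive channel errors in `tx` lands in three different rows after deinterleaving, i.e. at most one error per inner codeword.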

  14. Absolute Helmholtz free energy of highly anharmonic crystals: theory vs Monte Carlo.

    PubMed

    Yakub, Lydia; Yakub, Eugene

    2012-04-14

    We discuss the problem of the quantitative theoretical prediction of the absolute free energy for classical highly anharmonic solids. The Helmholtz free energy of the Lennard-Jones (LJ) crystal is calculated accurately while accounting for both the anharmonicity of atomic vibrations and the pair and triple correlations in displacements of the atoms from their lattice sites. Comparison with the most precise computer simulation data on the sublimation and melting lines revealed that the theoretical predictions are in excellent agreement with Monte Carlo simulation data over the whole range of temperatures and densities studied.

  15. Using the CAE technologies of engineering analysis for designing steam turbines at ZAO Ural Turbine Works

    NASA Astrophysics Data System (ADS)

    Goloshumova, V. N.; Kortenko, V. V.; Pokhoriler, V. L.; Kultyshev, A. Yu.; Ivanovskii, A. A.

    2008-08-01

    We describe the experience ZAO Ural Turbine Works specialists gained from mastering the series of CAD/CAE/CAM/PDM technologies, which are modern software tools of computer-aided engineering. We also present the results obtained from mathematical simulation of the process through which high- and intermediate-pressure rotors are heated, for revealing the most thermally stressed zones, as well as the results from mathematical simulation of a new design of turbine cylinder shells for improving the maneuverability of these turbines.

  16. Constant pH simulations of pH responsive polymers

    NASA Astrophysics Data System (ADS)

    Sharma, Arjun; Smith, J. D.; Walters, Keisha B.; Rick, Steven W.

    2016-12-01

    Polyacidic polymers can change structure over a narrow range of pH in a competition between the hydrophobic effect, which favors a compact state, and electrostatic repulsion, which favors an extended state. Constant pH molecular dynamics computer simulations of poly(methacrylic acid) reveal that there are two types of structural changes, one local and one global, which make up the overall response. The local structural response depends on the tacticity of the polymer and leads to different cooperative effects for polymers with different stereochemistries, demonstrating both positive and negative cooperativities.

  17. Homogeneous nucleation and microstructure evolution in million-atom molecular dynamics simulation

    PubMed Central

    Shibuta, Yasushi; Oguchi, Kanae; Takaki, Tomohiro; Ohno, Munekazu

    2015-01-01

    Homogeneous nucleation from an undercooled iron melt is investigated by the statistical sampling of million-atom molecular dynamics (MD) simulations performed on a graphics processing unit (GPU). Fifty independent instances of isothermal MD calculations with one million atoms in a quasi-two-dimensional cell over a nanosecond reveal that the nucleation rate and the incubation time of nucleation as functions of temperature have characteristic shapes with a nose at the critical temperature. This indicates that thermally activated homogeneous nucleation occurs spontaneously in MD simulations without any inducing factor, whereas most previous studies have employed factors such as pressure, surface effect, and continuous cooling to induce nucleation. Moreover, further calculations over ten nanoseconds capture the microstructure evolution on the order of tens of nanometers from the atomistic viewpoint and the grain growth exponent is directly estimated. Our novel approach based on the concept of “melting pots in a supercomputer” is opening a new phase in computational metallurgy with the aid of rapid advances in computational environments. PMID:26311304

  18. Numerical simulation of a full-loop circulating fluidized bed under different operating conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Yupeng; Musser, Jordan M.; Li, Tingwen

    Both experimental and computational studies of the fluidization of high-density polyethylene (HDPE) particles in a small-scale full-loop circulating fluidized bed are conducted. Experimental measurements of pressure drop are taken at different locations along the bed. The solids circulation rate is measured with an advanced Particle Image Velocimetry (PIV) technique. The bed height of the quasi-static region in the standpipe is also measured. Comparative numerical simulations are performed with a Computational Fluid Dynamics solver utilizing a Discrete Element Method (CFD-DEM). This paper reports a detailed and direct comparison between CFD-DEM results and experimental data for realistic gas-solid fluidization in a full-loop circulating fluidized bed system. The comparison reveals good agreement with respect to system component pressure drop and inventory height in the standpipe. In addition, the effect of different drag laws applied within the CFD simulation is examined and compared with experimental results.

  19. Evaluating the hydraulic and transport properties of peat soil using pore network modeling and X-ray micro computed tomography

    NASA Astrophysics Data System (ADS)

    Gharedaghloo, Behrad; Price, Jonathan S.; Rezanezhad, Fereidoun; Quinton, William L.

    2018-06-01

    Micro-scale properties of peat pore space and their influence on hydraulic and transport properties of peat soils have been given little attention so far. Characterizing the variation of these properties in a peat profile can increase our knowledge of the processes controlling contaminant transport through peatlands. As opposed to the common macro-scale (or bulk) representation of groundwater flow and transport processes, a pore network model (PNM) simulates flow and transport processes within individual pores. Here, a pore network modeling code capable of simulating advective and diffusive transport processes through a 3D unstructured pore network was developed; its predictive performance was evaluated by comparing its results to empirical values and to the results of computational fluid dynamics (CFD) simulations. This is the first time that peat pore networks have been extracted from X-ray micro-computed tomography (μCT) images of peat deposits and peat pore characteristics evaluated in a 3D approach. Water flow and solute transport were modeled in the unstructured pore networks mapped directly from μCT images. The modeling results were processed to determine the bulk properties of peat deposits. Results portray the commonly observed decrease in hydraulic conductivity with depth, which was attributed to the reduction of pore radius and increase in pore tortuosity. The increase in pore tortuosity with depth was associated with more decomposed peat soil and decreasing pore coordination number with depth, which extended the flow path of fluid particles. Results also revealed that hydraulic conductivity is isotropic locally, but becomes anisotropic after upscaling to the core scale; this suggests the anisotropy of peat hydraulic conductivity observed at the core and field scales is due to the strong heterogeneity in the vertical dimension that is imposed by the layered structure of peat soils.
Transport simulations revealed that for a given solute, the effective diffusion coefficient decreases with depth due to the corresponding increase of diffusional tortuosity. Longitudinal dispersivity of peat was also computed from advection-dominated transport simulations; it is similar to the empirical values reported for the same peat soil, is not sensitive to soil depth, and does not vary much along the soil profile.
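    The depth trends described above can be summarized with one common bulk-scale relation in which porosity and tortuosity control the effective diffusion coefficient, D_eff = D0·φ/τ². A minimal sketch (this form and the profile numbers are illustrative, not the paper's parameterization):

```python
# One common bulk-scale relation: D_eff = D0 * porosity / tortuosity**2.
# The profile values below are illustrative, not the paper's measurements.
def effective_diffusion(d0, porosity, tortuosity):
    return d0 * porosity / tortuosity ** 2

profile = [(0.1, 0.90, 1.1),   # (depth in m, porosity, diffusional tortuosity)
           (0.5, 0.85, 1.4),
           (1.0, 0.80, 1.8)]
d_eff = [effective_diffusion(2.0e-9, phi, tau) for _, phi, tau in profile]
```

    With tortuosity growing and porosity falling down the profile, the effective diffusion coefficient decreases monotonically with depth, the trend the transport simulations revealed.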

  20. Spent nuclear fuel assembly inspection using neutron computed tomography

    NASA Astrophysics Data System (ADS)

    Pope, Chad Lee

    The research presented here focuses on spent nuclear fuel assembly inspection using neutron computed tomography. Experimental measurements involving neutron beam transmission through a spent nuclear fuel assembly serve as benchmark measurements for an MCNP simulation model. Comparison of measured results to simulation results shows good agreement. Generation of tomography images from MCNP tally results was accomplished using adapted versions of built-in MATLAB algorithms. Multiple fuel assembly models were examined to provide a broad set of conclusions. Tomography images revealing assembly geometric information including the fuel element lattice structure and missing elements can be obtained using high energy neutrons. A projection difference technique was developed which reveals the substitution of unirradiated fuel elements for irradiated fuel elements, using high energy neutrons. More subtle material differences such as altering the burnup of individual elements can be identified with lower energy neutrons provided the scattered neutron contribution to the image is limited. The research results show that neutron computed tomography can be used to inspect spent nuclear fuel assemblies for the purpose of identifying anomalies such as missing elements or substituted elements. The ability to identify anomalies in spent fuel assemblies can be used to deter diversion of material by increasing the risk of early detection as well as improve reprocessing facility operations by confirming the spent fuel configuration is as expected or allowing segregation if anomalies are detected.

  1. Evaluating variability with atomistic simulations: the effect of potential and calculation methodology on the modeling of lattice and elastic constants

    NASA Astrophysics Data System (ADS)

    Hale, Lucas M.; Trautt, Zachary T.; Becker, Chandler A.

    2018-07-01

    Atomistic simulations using classical interatomic potentials are powerful investigative tools linking atomic structures to dynamic properties and behaviors. It is well known that different interatomic potentials produce different results, thus making it necessary to characterize potentials based on how they predict basic properties. Doing so makes it possible to compare existing interatomic models in order to select those best suited for specific use cases, and to identify any limitations of the models that may lead to unrealistic responses. While the methods for obtaining many of these properties are often thought of as simple calculations, there are many underlying aspects that can lead to variability in the reported property values. For instance, multiple methods may exist for computing the same property and values may be sensitive to certain simulation parameters. Here, we introduce a new high-throughput computational framework that encodes various simulation methodologies as Python calculation scripts. Three distinct methods for evaluating the lattice and elastic constants of bulk crystal structures are implemented and used to evaluate the properties across 120 interatomic potentials, 18 crystal prototypes, and all possible combinations of unique lattice site and elemental model pairings. Analysis of the results reveals which potentials and crystal prototypes are sensitive to the calculation methods and parameters, and it assists with the verification of potentials, methods, and molecular dynamics software. The results, calculation scripts, and computational infrastructure are self-contained and openly available to support researchers in performing meaningful simulations.
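    One of the methodological choices such a framework must encode is how an elastic constant is extracted; a common route is the curvature of the energy-strain curve. A minimal central-difference sketch (names and units are illustrative):

```python
def elastic_constant(strain, e_minus, e_zero, e_plus, volume):
    """Curvature of the energy-strain curve by central differences:
    C = (1/V) * d2E/de2 ≈ (E(+e) - 2*E(0) + E(-e)) / (V * e**2)."""
    return (e_plus - 2.0 * e_zero + e_minus) / (volume * strain ** 2)
```

    Other routes (e.g., fitting stress-strain responses) can give slightly different values for the same potential depending on strain magnitude and fitting range, which is exactly the kind of method and parameter sensitivity the study quantifies.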

  2. Response of selected binomial coefficients to varying degrees of matrix sparseness and to matrices with known data interrelationships

    USGS Publications Warehouse

    Archer, A.W.; Maples, C.G.

    1989-01-01

    Numerous departures from ideal relationships are revealed by Monte Carlo simulations of widely accepted binomial coefficients. For example, simulations incorporating varying levels of matrix sparseness (presence of zeros indicating lack of data) and computation of expected values reveal that not only are all common coefficients influenced by zero data, but also that some coefficients do not discriminate between sparse or dense matrices (few zero data). Such coefficients computationally merge mutually shared and mutually absent information and do not exploit all the information incorporated within the standard 2 × 2 contingency table; therefore, the commonly used formulae for such coefficients are more complicated than the actual range of values produced. Other coefficients do differentiate between mutual presences and absences; however, a number of these coefficients do not demonstrate a linear relationship to matrix sparseness. Finally, simulations using nonrandom matrices with known degrees of row-by-row similarities signify that several coefficients either do not display a reasonable range of values or are nonlinear with respect to known relationships within the data. Analyses with nonrandom matrices yield clues as to the utility of certain coefficients for specific applications. For example, coefficients such as Jaccard, Dice, and Baroni-Urbani and Buser are useful if correction of sparseness is desired, whereas the Russell-Rao coefficient is useful when sparseness correction is not desired. © 1989 International Association for Mathematical Geology.
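    The coefficients named above can be written directly in terms of the standard 2 × 2 contingency counts. A minimal sketch using the textbook formulas (variable names are illustrative):

```python
from math import sqrt

def contingency(x, y):
    """2 x 2 contingency counts for two binary vectors:
    a = mutual presences, b and c = mismatches, d = mutual absences."""
    a = sum(1 for i, j in zip(x, y) if i and j)
    b = sum(1 for i, j in zip(x, y) if i and not j)
    c = sum(1 for i, j in zip(x, y) if not i and j)
    d = sum(1 for i, j in zip(x, y) if not i and not j)
    return a, b, c, d

def jaccard(a, b, c, d):
    return a / (a + b + c)                    # ignores mutual absences d

def dice(a, b, c, d):
    return 2 * a / (2 * a + b + c)            # also ignores d

def russell_rao(a, b, c, d):
    return a / (a + b + c + d)                # dilutes a by the zeros in d

def baroni_urbani_buser(a, b, c, d):
    s = sqrt(a * d)
    return (a + s) / (a + b + c + s)          # mixes presences and absences
```

    Note how Jaccard and Dice never use d (mutual absences) while Russell-Rao divides by it, which is precisely the sparseness sensitivity the Monte Carlo simulations probe.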

  3. Cost-effectiveness of breast cancer screening policies using simulation.

    PubMed

    Gocgun, Y; Banjevic, D; Taghipour, S; Montgomery, N; Harvey, B J; Jardine, A K S; Miller, A B

    2015-08-01

    In this paper, we study breast cancer screening policies using computer simulation. We developed a multi-state Markov model for breast cancer progression, considering both the screening and treatment stages of breast cancer. The parameters of our model were estimated through data from the Canadian National Breast Cancer Screening Study as well as data in the relevant literature. Using computer simulation, we evaluated various screening policies to study the impact of mammography screening for age-based subpopulations in Canada. We also performed sensitivity analysis to examine the impact of certain parameters on number of deaths and total costs. The analysis comparing screening policies reveals that a policy in which women belonging to the 40-49 age group are not screened, whereas those belonging to the 50-59 and 60-69 age groups are screened once every 5 years, outperforms others with respect to cost per life saved. Our analysis also indicates that increasing the screening frequencies for the 50-59 and 60-69 age groups decreases mortality, and that the average number of deaths generally decreases with an increase in screening frequency. We found that screening annually for all age groups is associated with the highest costs per life saved. Our analysis thus reveals that cost per life saved increases with an increase in screening frequency. Copyright © 2015 Elsevier Ltd. All rights reserved.
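    The comparison metric is simply total policy cost divided by lives saved. A sketch of the bookkeeping (the costs and counts below are invented for illustration, not the study's estimates):

```python
# All costs and life counts below are invented for illustration; they are
# NOT the study's estimates.
policies = {
    "no screening 40-49, every 5y for 50-69": {"cost": 1.0e8, "lives_saved": 900},
    "every 2y for 50-69":                     {"cost": 2.5e8, "lives_saved": 1100},
    "annual, all age groups":                 {"cost": 6.0e8, "lives_saved": 1500},
}

def cost_per_life_saved(p):
    return p["cost"] / p["lives_saved"]

best = min(policies, key=lambda name: cost_per_life_saved(policies[name]))
```

    The metric can rank a cheaper, less frequent policy above an aggressive one even when the latter saves more lives in absolute terms, which is how annual all-ages screening can reduce mortality most yet carry the highest cost per life saved.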

  4. The Lagrangian Ensemble metamodel for simulating plankton ecosystems

    NASA Astrophysics Data System (ADS)

    Woods, J. D.

    2005-10-01

    This paper presents a detailed account of the Lagrangian Ensemble (LE) metamodel for simulating plankton ecosystems. It uses agent-based modelling to describe the life histories of many thousands of individual plankters. The demography of each plankton population is computed from those life histories. So too is bio-optical and biochemical feedback to the environment. The resulting “virtual ecosystem” is a comprehensive simulation of the plankton ecosystem. It is based on phenotypic equations for individual micro-organisms. LE modelling differs significantly from population-based modelling. The latter uses prognostic equations to compute demography and biofeedback directly. LE modelling diagnoses them from the properties of individual micro-organisms, whose behaviour is computed from prognostic equations. That indirect approach permits the ecosystem to adjust gracefully to changes in exogenous forcing. The paper starts with theory: it defines the Lagrangian Ensemble metamodel and explains how LE code performs a number of computations “behind the curtain”. They include budgeting chemicals, and deriving biofeedback and demography from individuals. The next section describes the practice of LE modelling. It starts with designing a model that complies with the LE metamodel. Then it describes the scenario for exogenous properties that provide the computation with initial and boundary conditions. These procedures differ significantly from those used in population-based modelling. The next section shows how LE modelling is used in research, teaching and planning. The practice depends largely on hindcasting to overcome the limits to predictability of weather forecasting. The scientific method explains observable ecosystem phenomena in terms of finer-grained processes that cannot be observed, but which are controlled by the basic laws of physics, chemistry and biology. What-If? 
Prediction (WIP), used for planning, extends hindcasting by adding events that describe natural or man-made hazards and remedial actions. Verification is based on the Ecological Turing Test, which takes account of uncertainties in the observed and simulated versions of a target ecological phenomenon. The rest of the paper is devoted to a case study designed to show what LE modelling offers the biological oceanographer. The case study is presented in two parts. The first documents the WB model (Woods & Barkmann, 1994) and scenario used to simulate the ecosystem in a mesocosm moored in deep water off the Azores. The second part illustrates the emergent properties of that virtual ecosystem. The behaviour and development of an individual plankton lineage are revealed by an audit trail of the agent used in the computation. The fields of environmental properties reveal the impact of biofeedback. The fields of demographic properties show how changes in individuals cumulatively affect the birth and death rates of their population. This case study documents the virtual ecosystem used by Woods, Perilli and Barkmann (2005; hereafter WPB) to investigate the stability of simulations created by the Lagrangian Ensemble metamodel. The Azores virtual ecosystem was created and analysed on the Virtual Ecology Workbench (VEW) which is described briefly in the Appendix.

  5. Adaptive screening for depression--recalibration of an item bank for the assessment of depression in persons with mental and somatic diseases and evaluation in a simulated computer-adaptive test environment.

    PubMed

    Forkmann, Thomas; Kroehne, Ulf; Wirtz, Markus; Norra, Christine; Baumeister, Harald; Gauggel, Siegfried; Elhan, Atilla Halil; Tennant, Alan; Boecker, Maren

    2013-11-01

    This study conducted a simulation study for computer-adaptive testing based on the Aachen Depression Item Bank (ADIB), which was developed for the assessment of depression in persons with somatic diseases. Prior to computer-adaptive test simulation, the ADIB was newly calibrated. Recalibration was performed in a sample of 161 patients treated for a depressive syndrome, 103 patients from cardiology, and 103 patients from otorhinolaryngology (mean age 44.1, SD=14.0; 44.7% female) and was cross-validated in a sample of 117 patients undergoing rehabilitation for cardiac diseases (mean age 58.4, SD=10.5; 24.8% women). Unidimensionality of the item bank was checked and a Rasch analysis was performed that evaluated local dependency (LD), differential item functioning (DIF), item fit and reliability. CAT simulation was conducted with the total sample and additional simulated data. Recalibration resulted in a strictly unidimensional item bank with 36 items, showing good Rasch model fit (item fit residuals<|2.5|) and no DIF or LD. CAT simulation revealed that 13 items on average were necessary to estimate depression in the range of -2 and +2 logits when terminating at SE≤0.32 and 4 items if using SE≤0.50. Receiver Operating Characteristics analysis showed that θ estimates based on the CAT algorithm have good criterion validity with regard to depression diagnoses (Area Under the Curve≥.78 for all cut-off criteria). The recalibration of the ADIB succeeded and the simulation studies conducted suggest that it has good screening performance in the samples investigated and that it may reasonably add to the improvement of depression assessment. © 2013.
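    Behind the reported item counts is a standard CAT loop: administer the most informative remaining item, accumulate Fisher information, and stop once SE = 1/√(information) falls below the threshold. A minimal sketch for the Rasch model (the item bank below is synthetic; the actual counts depend on item targeting and the estimation method used in the study):

```python
import math

def rasch_p(theta, b):
    """Rasch model: probability of endorsing an item of difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def items_to_reach_se(theta, difficulties, se_stop):
    """Greedy CAT: give the most informative remaining item, accumulate
    Fisher information I = p*(1-p), stop when SE = 1/sqrt(I) <= se_stop."""
    remaining = list(difficulties)
    info, used = 0.0, 0
    while remaining:
        b = max(remaining,
                key=lambda d: rasch_p(theta, d) * (1 - rasch_p(theta, d)))
        remaining.remove(b)
        p = rasch_p(theta, b)
        info += p * (1 - p)
        used += 1
        if 1.0 / math.sqrt(info) <= se_stop:
            break
    return used

bank = [-3.0 + 6.0 * k / 35 for k in range(36)]   # synthetic 36-item bank
n_se50 = items_to_reach_se(0.0, bank, 0.50)
n_se32 = items_to_reach_se(0.0, bank, 0.32)
```

    A stricter SE threshold always needs at least as many items as a looser one, which is why terminating at SE≤0.32 consumed more items than SE≤0.50 in the simulations.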

  6. Can one trust quantum simulators?

    NASA Astrophysics Data System (ADS)

    Hauke, Philipp; Cucchietti, Fernando M.; Tagliacozzo, Luca; Deutsch, Ivan; Lewenstein, Maciej

    2012-08-01

    Various fundamental phenomena of strongly correlated quantum systems such as high-Tc superconductivity, the fractional quantum-Hall effect and quark confinement are still awaiting a universally accepted explanation. The main obstacle is the computational complexity of solving even the most simplified theoretical models which are designed to capture the relevant quantum correlations of the many-body system of interest. In his seminal 1982 paper (Feynman 1982 Int. J. Theor. Phys. 21 467), Richard Feynman suggested that such models might be solved by ‘simulation’ with a new type of computer whose constituent parts are effectively governed by a desired quantum many-body dynamics. Measurements on this engineered machine, now known as a ‘quantum simulator,’ would reveal some unknown or difficult to compute properties of a model of interest. We argue that a useful quantum simulator must satisfy four conditions: relevance, controllability, reliability and efficiency. We review the current state of the art of digital and analog quantum simulators. Whereas so far the majority of the focus, both theoretically and experimentally, has been on controllability of relevant models, we emphasize here the need for a careful analysis of reliability and efficiency in the presence of imperfections. We discuss how disorder and noise can impact these conditions, and illustrate our concerns with novel numerical simulations of a paradigmatic example: a disordered quantum spin chain governed by the Ising model in a transverse magnetic field. We find that disorder can decrease the reliability of an analog quantum simulator of this model, although large errors in local observables are introduced only for strong levels of disorder. We conclude that the answer to the question ‘Can we trust quantum simulators?’ is … to some extent.

  7. Low-order modeling of internal heat transfer in biomass particle pyrolysis

    DOE PAGES

    Wiggins, Gavin M.; Daw, C. Stuart; Ciesielski, Peter N.

    2016-05-11

    We present a computationally efficient, one-dimensional simulation methodology for biomass particle heating under conditions typical of fast pyrolysis. Our methodology is based on identifying the rate limiting geometric and structural factors for conductive heat transport in biomass particle models with realistic morphology to develop low-order approximations that behave appropriately. Comparisons of transient temperature trends predicted by our one-dimensional method with three-dimensional simulations of woody biomass particles reveal good agreement, if the appropriate equivalent spherical diameter and bulk thermal properties are used. Here, we conclude that, for particle sizes and heating regimes typical of fast pyrolysis, it is possible to simulate biomass particle heating with reasonable accuracy and minimal computational overhead, even when variable size, aspherical shape, anisotropic conductivity, and complex, species-specific internal pore geometry are incorporated.
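    An "equivalent spherical diameter" maps an irregular particle onto the one-dimensional sphere model. One common definition is the diameter of the sphere with the same volume as the particle (whether this is the exact equivalence the authors adopt is not stated in the abstract):

```python
from math import pi

def equivalent_spherical_diameter(particle_volume):
    """Diameter of the sphere whose volume equals the particle's volume;
    one common (assumed) definition of the equivalent spherical diameter."""
    return (6.0 * particle_volume / pi) ** (1.0 / 3.0)
```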

  8. Examination of the nature of lattice matched III V semiconductor interfaces using computer simulated molecular beam epitaxial growth I. AC/BC interfaces

    NASA Astrophysics Data System (ADS)

    Thomsen, M.; Ghaisas, S. V.; Madhukar, A.

    1987-07-01

    A previously developed computer simulation of molecular beam epitaxial growth of III-V semiconductors based on the configuration dependent reactive incorporation (CDRI) model is extended to allow for two different cation species. Attention is focused on examining the nature of interfaces formed in lattice matched quantum well structures of the form AC/BC/AC(100). We consider cation species with substantially different effective diffusion lengths, as is the case with Al and Ga during the growth of their respective As compounds. The degree of intermixing occurring at the interface is seen to be dependent upon, among other growth parameters, the pressure of the group V species during growth. Examination of an intraplanar order parameter at the interfaces reveals the existence of short range clustering of the cation species.

  9. Biomaterial science meets computational biology.

    PubMed

    Hutmacher, Dietmar W; Little, J Paige; Pettet, Graeme J; Loessner, Daniela

    2015-05-01

    There is a pressing need for a predictive tool capable of revealing a holistic understanding of fundamental elements in the normal and pathological cell physiology of organoids in order to decipher the mechanoresponse of cells. Therefore, the integration of a systems bioengineering approach into a validated mathematical model is necessary to develop a new simulation tool. This tool can only be innovative by combining biomaterials science with computational biology. Systems-level and multi-scale experimental data are incorporated into a single framework, thus representing both single cells and collective cell behaviour. Such a computational platform needs to be validated in order to discover key mechano-biological factors associated with cell-cell and cell-niche interactions.

  10. Monte Carlo simulations on atropisomerism of thienotriazolodiazepines applicable to slow transition phenomena using potential energy surfaces by ab initio molecular orbital calculations.

    PubMed

    Morikami, Kenji; Itezono, Yoshiko; Nishimoto, Masahiro; Ohta, Masateru

    2014-01-01

    Compounds with a medium-sized flexible ring often show atropisomerism that is caused by the high-energy barriers between long-lived conformers that can be isolated and often have different biological properties to each other. In this study, the frequency of the transition between the two stable conformers, aS and aR, of thienotriazolodiazepine compounds with flexible 7-membered rings was estimated computationally by Monte Carlo (MC) simulations and validated experimentally by NMR experiments. To estimate the energy barriers for transitions as precisely as possible, the potential energy (PE) surfaces used in the MC simulations were calculated by molecular orbital (MO) methods. To accomplish the MC simulations with the MO-based PE surfaces in a practical central processing unit (CPU) time, the MO-based PE of each conformer was pre-calculated and stored before the MC simulations, and then only referred to during the MC simulations. The activation energies for transitions calculated by the MC simulations agreed well with the experimental ΔG determined by the NMR experiments. The analysis of the transition trajectories of the MC simulations revealed that the transition occurred not only through the transition states, but also through many different transition paths. Our computational methods gave us quantitative estimates of atropisomerism of the thienotriazolodiazepine compounds in a practical period of time, and the method could be applicable for other slow-dynamics phenomena that cannot be investigated by other atomistic simulations.

  11. All-atom molecular dynamics of virus capsids as drug targets

    DOE PAGES

    Perilla, Juan R.; Hadden, Jodi A.; Goh, Boon Chong; ...

    2016-04-29

    Virus capsids are protein shells that package the viral genome. Although their morphology and biological functions can vary markedly, capsids often play critical roles in regulating viral infection pathways. A detailed knowledge of virus capsids, including their dynamic structure, interactions with cellular factors, and the specific roles that they play in the replication cycle, is imperative for the development of antiviral therapeutics. The following Perspective introduces an emerging area of computational biology that focuses on the dynamics of virus capsids and capsid–protein assemblies, with particular emphasis on the effects of small-molecule drug binding on capsid structure, stability, and allosteric pathways. When performed at chemical detail, molecular dynamics simulations can reveal subtle changes in virus capsids induced by drug molecules a fraction of their size. Finally, the current challenges of performing all-atom capsid–drug simulations are discussed, along with an outlook on the applicability of virus capsid simulations to reveal novel drug targets.

  13. Self-Organization of Metal Nanoparticles in Light: Electrodynamics-Molecular Dynamics Simulations and Optical Binding Experiments.

    PubMed

    McCormack, Patrick; Han, Fei; Yan, Zijie

    2018-02-01

    Light-driven self-organization of metal nanoparticles (NPs) can lead to unique optical matter systems, yet simulation of such self-organization (i.e., optical binding) is a complex computational problem that increases nonlinearly with system size. Here we show that a combined electrodynamics-molecular dynamics simulation technique can simulate the trajectories and predict stable configurations of silver NPs in optical fields. The simulated dynamic equilibrium of a two-NP system matches the probability density of oscillations for two optically bound NPs obtained experimentally. The predicted stable configurations for up to eight NPs are further compared to experimental observations of silver NP clusters formed by optical binding in a Bessel beam. All configurations are confirmed to form in real systems, including pentagonal clusters with five-fold symmetry. Our combined simulations and experiments have revealed a diverse optical matter system formed by anisotropic optical binding interactions, providing a new strategy to discover artificial materials.

  14. An Investigation of Molecular Docking and Molecular Dynamic Simulation on Imidazopyridines as B-Raf Kinase Inhibitors.

    PubMed

    Xie, Huiding; Li, Yupeng; Yu, Fang; Xie, Xiaoguang; Qiu, Kaixiong; Fu, Jijun

    2015-11-16

    B-Raf kinase is a key target in current cancer treatment. Recently, a group of imidazopyridines has been reported as B-Raf kinase inhibitors. In order to investigate the interaction between this group of inhibitors and B-Raf kinase, molecular docking, molecular dynamic (MD) simulation and binding free energy (ΔGbind) calculation were performed in this work. Molecular docking was carried out to identify the key residues in the binding site, and MD simulations were performed to determine the detailed binding mode. The results obtained from MD simulation reveal that the binding site is stable during the MD simulations, and some hydrogen bonds (H-bonds) in MD simulations are different from H-bonds in the docking mode. Based on the obtained MD trajectories, ΔGbind was computed by using Molecular Mechanics Generalized Born Surface Area (MM-GBSA), and the obtained energies are consistent with the activities. An energetic analysis reveals that both electrostatic and van der Waals contributions are important to ΔGbind, and the unfavorable polar solvation contribution results in the instability of the inhibitor with the lowest activity. These results are expected to aid understanding of the binding between B-Raf and imidazopyridines and to provide useful information for the design of potential B-Raf inhibitors.
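    An MM-GBSA estimate is an average over trajectory snapshots of the gas-phase molecular mechanics terms plus polar (GB) and nonpolar (surface-area) solvation terms. A sketch of the bookkeeping (the per-frame numbers are invented; in practice the components come from trajectory post-processing tools, not from a script like this):

```python
# Hypothetical per-frame MM-GBSA components in kcal/mol; invented for
# illustration, not taken from the study.
frames = [
    {"vdw": -45.2, "ele": -12.1, "gb": 30.4, "surf": -4.8},
    {"vdw": -44.7, "ele": -13.0, "gb": 31.1, "surf": -4.9},
    {"vdw": -46.0, "ele": -11.5, "gb": 29.8, "surf": -4.7},
]

def mmgbsa_dg(frames):
    """ΔGbind ≈ <ΔEvdw + ΔEele + ΔGpolar(GB) + ΔGnonpolar(SA)> over frames
    (the entropy term -TΔS is often omitted, as it is here)."""
    totals = [f["vdw"] + f["ele"] + f["gb"] + f["surf"] for f in frames]
    return sum(totals) / len(totals)
```

    Decomposing the average by term (rather than summing first) is what lets an energetic analysis attribute weak binding to, say, an unfavorable polar solvation contribution.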

  15. All-Atom Multiscale Molecular Dynamics Theory and Simulation of Self-Assembly, Energy Transfer and Structural Transition in Nanosystems

    NASA Astrophysics Data System (ADS)

    Espinosa Duran, John Michael

    The study of nanosystems and their emergent properties requires the development of multiscale computational models, theories and methods that preserve atomic and femtosecond resolution, to reveal details that cannot be resolved experimentally today. Considering this, three long-timescale phenomena were studied using molecular dynamics and multiscale methods: self-assembly of organic molecules on graphite, energy transfer in nanosystems, and structural transition in vault nanoparticles. Molecular dynamics simulations of the self-assembly of alkoxybenzonitriles with different tail lengths on graphite were performed to learn about intermolecular interactions and phases exhibited by self-organized materials. This is important for the design of ordered self-assembled organic photovoltaic materials with greater efficiency than the disordered blends. Simulations revealed surface dynamical behaviors that cannot be resolved experimentally today due to the lack of spatiotemporal resolution. Atom-resolved structures predicted by simulations agreed with scanning tunneling microscopy images and unit cell measurements. Then, a multiscale theory based on the energy density as a field variable is developed to study energy transfer in nanoscale systems. For applications such as photothermal microscopy or cancer phototherapy, it is necessary to understand how energy is transferred to and from nanosystems. This multiscale theory could be applied in this context and is tested here for cubic nanoparticles immersed in water, with energy being transferred to and from the nanoparticle. The theory predicts the energy transfer dynamics and reveals phenomena that cannot be described by current phenomenological theories. Finally, temperature-triggered structural transitions were revealed for vault nanoparticles using molecular dynamics and multiscale simulations. The vault is a football-shaped supramolecular assembly, very distinct from the commonly observed icosahedral viruses. 
It has very promising applications in drug delivery and has been extensively studied experimentally. Sub-microsecond multiscale simulations at 310 K on the vault revealed the opening and closing of fractures near the shoulder while preserving the overall structure. This fracture mechanism could explain the uptake and release of small drugs while maintaining the overall structure. Higher temperature simulations show the generation of large fractures near the waist, which enables interaction of the external medium with the inner vault residues. Simulation results agreed with microscopy and spectroscopy measurements, and revealed new structures and mechanisms.

  16. Improving Students' Understanding and Perception of Cell Theory in School Biology Using a Computer-Based Instruction Simulation Program

    ERIC Educational Resources Information Center

    Kiboss, Joel; Wekesa, Eric; Ndirangu, Mwangi

    2006-01-01

    A survey by the Kenya National Examination Council (KNEC) revealed that students' academic performance in, and interest in, secondary school biology have been generally poor. This has been attributed to the current methods of instruction and the lack of instructional resources amenable to the study and proper understanding of such complex areas as cell…

  17. Effectiveness of a Computer-Mediated Simulations Program in School Biology on Pupils' Learning Outcomes in Cell Theory

    ERIC Educational Resources Information Center

    Kiboss, Joel K.; Ndirangu, Mwangi; Wekesa, Eric W.

    2004-01-01

    Biology knowledge and understanding are important not only for converting the loftiest dreams into reality for a better life for individuals but also for preparing secondary pupils for such fields as agriculture, medicine, biotechnology, and genetic engineering. But a recent study has revealed that many aspects of school science (biology…

  18. Tree and forest effects on air quality and human health in the United States

    Treesearch

    David J. Nowak; Satoshi Hirabayashi; Allison Bodine; Eric Greenfield

    2014-01-01

    Trees remove air pollution by the interception of particulate matter on plant surfaces and the absorption of gaseous pollutants through the leaf stomata. However, the magnitude and value of the effects of trees and forests on air quality and human health across the United States remain unknown. Computer simulations with local environmental data reveal that trees and...

  19. Development of a parallel FE simulator for modeling the whole trans-scale failure process of rock from meso- to engineering-scale

    NASA Astrophysics Data System (ADS)

    Li, Gen; Tang, Chun-An; Liang, Zheng-Zhao

    2017-01-01

    Multi-scale, high-resolution modeling of the rock failure process is a powerful means in modern rock mechanics of revealing complex failure mechanisms and evaluating engineering risks. However, multi-scale continuous modeling of rock, from deformation and damage through to failure, places high demands on the design, implementation scheme and computational capacity of the numerical software system. This study develops a parallel finite element procedure, a parallel rock failure process analysis (RFPA) simulator capable of modeling the whole trans-scale failure process of rock. Based on a statistical meso-damage mechanical method, the RFPA simulator can construct heterogeneous rock models with multiple mechanical properties and represent the trans-scale propagation of cracks; the stress and strain fields are solved by a parallel finite element method (FEM) solver for the damage evolution analysis of each representative volume element. This paper describes the theoretical basis of the approach and provides the details of the parallel implementation on a Windows-Linux interactive platform. A numerical model is built to test the parallel performance of the FEM solver. Numerical simulations are then carried out on a laboratory-scale uniaxial compression test, a field-scale net fracture spacing example, and an engineering-scale rock slope example. The simulation results indicate that relatively high speedup and computational efficiency can be achieved by the parallel FEM solver with a reasonable boot process. In the laboratory-scale simulation, well-known physical phenomena, such as the macroscopic fracture pattern and stress-strain responses, are reproduced. In the field-scale simulation, the formation of net fracture spacing, from initiation and propagation to saturation, is revealed completely. In the engineering-scale simulation, the whole progressive failure process of the rock slope is modeled well. The parallel FE simulator developed in this study is thus an efficient tool for modeling the whole trans-scale failure process of rock from meso- to engineering-scale.

  20. A comparison of traditional physical laboratory and computer-simulated laboratory experiences in relation to engineering undergraduate students' conceptual understandings of a communication systems topic

    NASA Astrophysics Data System (ADS)

    Javidi, Giti

    2005-07-01

    This study was designed to investigate an alternative to the use of traditional physical laboratory activities in a communication systems course. Specifically, it examined whether computer simulation is as effective as physical laboratory activities in teaching college-level electronics engineering students the concepts of signal transmission, modulation and demodulation. Eighty undergraduate engineering students participated in the study, which was conducted at a southeastern four-year university. The students were randomly assigned to two groups, which were compared on understanding of the concepts, retention of the concepts, completion time of the lab experiments and perception of the laboratory experiments. The physical group (n = 40) conducted laboratory experiments using equipment in a controlled electronics laboratory. The simulation group (n = 40) conducted similar experiments using a simulation program in a controlled PC laboratory. Scores on a validated conceptual test were collected once at the completion of the treatment and again three weeks later. Attitude surveys and a qualitative study were administered at the completion of the treatment. The findings revealed significant differences, in favor of the simulation group, between the two groups on both the conceptual post-test and the follow-up test. The findings also revealed a significant correlation between the simulation group's attitude toward the simulation program and its post-test scores. Moreover, there was a significant difference between the two groups in attitude toward their laboratory experience, in favor of the simulation group, and a significant difference in lab completion time, also in favor of the simulation group. At the same time, the qualitative research uncovered several issues not explored by the quantitative research. It was concluded that incorporating the recommendations from the qualitative research into the laboratory pedagogy, especially incorporating hardware experience to avoid a deficit in hands-on skills, should help improve students' experience regardless of the environment in which the laboratory is conducted.

  1. Computational analysis of an aortic valve jet

    NASA Astrophysics Data System (ADS)

    Shadden, Shawn C.; Astorino, Matteo; Gerbeau, Jean-Frédéric

    2009-11-01

    In this work we employ a coupled FSI scheme using an immersed boundary method to simulate flow through a realistic, deformable, 3D aortic valve model. These data were used to compute Lagrangian coherent structures (LCS), which revealed flow separation from the valve leaflets during systole and, correspondingly, the boundary between the jet of ejected fluid and the regions of separated, recirculating flow. The advantages of computing LCS in multi-dimensional FSI models of the aortic valve are twofold. First, the quality and effectiveness of existing clinical indices used to measure aortic jet size can be tested against the accurate measure of jet area derived from LCS. Second, as an ultimate goal, a reliable computational framework for the assessment of aortic valve stenosis could be developed.

  2. Computational Aerodynamic Modeling of Small Quadcopter Vehicles

    NASA Technical Reports Server (NTRS)

    Yoon, Seokkwan; Ventura Diaz, Patricia; Boyd, D. Douglas; Chan, William M.; Theodore, Colin R.

    2017-01-01

    High-fidelity computational simulations have been performed which focus on rotor-fuselage and rotor-rotor aerodynamic interactions of small quad-rotor vehicle systems. The three-dimensional unsteady Navier-Stokes equations are solved on overset grids using high-order accurate schemes, dual-time stepping, low Mach number preconditioning, and hybrid turbulence modeling. Computational results for isolated rotors are shown to compare well with available experimental data. Computational results in hover reveal the differences between a conventional configuration where the rotors are mounted above the fuselage and an unconventional configuration where the rotors are mounted below the fuselage. Complex flow physics in forward flight is investigated. The goal of this work is to demonstrate that understanding of interactional aerodynamics can be an important factor in design decisions regarding rotor and fuselage placement for next-generation multi-rotor drones.

  3. Efficient Characterization of Protein Cavities within Molecular Simulation Trajectories: trj_cavity.

    PubMed

    Paramo, Teresa; East, Alexandra; Garzón, Diana; Ulmschneider, Martin B; Bond, Peter J

    2014-05-13

    Protein cavities and tunnels are critical in determining phenomena such as ligand binding, molecular transport, and enzyme catalysis. Molecular dynamics (MD) simulations enable the exploration of the flexibility and conformational plasticity of protein cavities, extending the information available from static experimental structures relevant to, for example, drug design. Here, we present a new tool (trj_cavity) implemented within the GROMACS ( www.gromacs.org ) framework for the rapid identification and characterization of cavities detected within MD trajectories. trj_cavity is optimized for usability and computational efficiency and is applicable to the time-dependent analysis of any cavity topology, and optional specialized descriptors can be used to characterize, for example, protein channels. Its novel grid-based algorithm performs an efficient neighbor search whose calculation time is linear with system size, and a comparison of performance with other widely used cavity analysis programs reveals an orders-of-magnitude improvement in the computational cost. To demonstrate its potential for revealing novel mechanistic insights, trj_cavity has been used to analyze long-time scale simulation trajectories for three diverse protein cavity systems. This has helped to reveal, respectively, the lipid binding mechanism in the deep hydrophobic cavity of a soluble mite-allergen protein, Der p 2; a means for shuttling carbohydrates between the surface-exposed substrate-binding and catalytic pockets of a multidomain, membrane-proximal pullulanase, PulA; and the structural basis for selectivity in the transmembrane pore of a voltage-gated sodium channel (NavMs), embedded within a lipid bilayer environment. trj_cavity is available for download under an open-source license ( http://sourceforge.net/projects/trjcavity ). A simplified, GROMACS-independent version may also be compiled.
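    The linear scaling described above comes from binning atoms into a uniform grid whose cell edge is at least the search cutoff, so each query inspects only the 27 cells surrounding the query point. Below is a minimal sketch of that idea (an illustration of the general technique, not trj_cavity's actual C/GROMACS implementation):

```python
from collections import defaultdict

def build_grid(coords, cell):
    """Bin 3D points into a uniform grid; cost is linear in the number of points."""
    grid = defaultdict(list)
    for i, (x, y, z) in enumerate(coords):
        grid[(int(x // cell), int(y // cell), int(z // cell))].append(i)
    return grid

def neighbors(coords, grid, cell, point, cutoff):
    """Indices of points within `cutoff` of `point`.
    Complete only if cell >= cutoff, since just 27 adjacent cells are scanned."""
    cx, cy, cz = (int(c // cell) for c in point)
    hits = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                for i in grid.get((cx + dx, cy + dy, cz + dz), []):
                    if sum((a - b) ** 2 for a, b in zip(coords[i], point)) <= cutoff ** 2:
                        hits.append(i)
    return hits
```

    Because each atom is touched a constant number of times per query, the cost grows linearly with system size, in contrast to an O(n^2) all-pairs distance scan.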

  4. Molecular electronics: insight from first-principles transport simulations.

    PubMed

    Paulsson, Magnus; Frederiksen, Thomas; Brandbyge, Mads

    2010-01-01

    Conduction properties of nanoscale contacts can be studied using first-principles simulations. Such calculations give insight into details behind the conductance that are not readily available in experiments. For example, we may learn how the bonding conditions of a molecule to the electrodes affect the electronic transport. Here we describe key computational ingredients and discuss these in relation to simulations for scanning tunneling microscopy (STM) experiments with C60 molecules, where the experimental geometry is well characterized. We then show how molecular dynamics simulations may be combined with transport calculations to study more irregular situations, such as the evolution of a nanoscale contact with the mechanically controllable break-junction technique. Finally we discuss calculations of inelastic electron tunnelling spectroscopy as a characterization technique that reveals information about the atomic arrangement and transport channels.

  5. Confirmation of a realistic reactor model for BNCT dosimetry at the TRIGA Mainz

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziegner, Markus, E-mail: Markus.Ziegner.fl@ait.ac.at; Schmitz, Tobias; Hampel, Gabriele

    2014-11-01

    Purpose: In order to build up a reliable dose monitoring system for boron neutron capture therapy (BNCT) applications at the TRIGA reactor in Mainz, a computer model of the entire reactor was established, simulating the radiation field by means of the Monte Carlo method. The impact of different source definition techniques was compared and the model was validated by experimental fluence and dose determinations. Methods: The depletion calculation code ORIGEN2 was used to compute the burn-up and relevant material composition of each burned fuel element from the day of first reactor operation to its current core. The material composition of the current core was used in a MCNP5 model of the initial core developed earlier. To perform calculations for the region outside the reactor core, the model was expanded to include the thermal column and compared with the previously established ATTILA model. Subsequently, the computational model was simplified in order to reduce the calculation time. Both simulation models were validated by experiments with different setups using alanine dosimetry and gold activation measurements with two different types of phantoms. Results: The MCNP5 simulated neutron spectrum and source strength are found to be in good agreement with the previous ATTILA model, whereas the photon production is much lower. Both MCNP5 simulation models predict all experimental dose values with an accuracy of about 5%. The simulations reveal that a Teflon environment favorably reduces the gamma dose component as compared to a polymethyl methacrylate phantom. Conclusions: A computer model for BNCT dosimetry was established, allowing the prediction of dosimetric quantities without further calibration and within a reasonable computation time for clinical applications. The good agreement between the MCNP5 simulations and experiments demonstrates that the ATTILA model overestimates the gamma dose contribution.
The detailed model can be used for the planning of structural modifications in the thermal column irradiation channel or for the use of irradiation sites other than the thermal column, e.g., the beam tubes.

  6. Modeling the dynamics of chromosomal alteration progression in cervical cancer: A computational model

    PubMed Central

    2017-01-01

    Computational modeling has been applied to simulate the heterogeneity of cancer behavior. The development of Cervical Cancer (CC) is a process in which the cell acquires dynamic behavior from non-deleterious and deleterious mutations, exhibiting chromosomal alterations as a manifestation of this dynamic. To further determine the progression of chromosomal alterations in precursor lesions and CC, we introduce a computational model to study the dynamics of deleterious and non-deleterious mutations as an outcome of tumor progression. The analysis of chromosomal alterations mediated by our model reveals that multiple deleterious mutations are more frequent in precursor lesions than in CC. Cells with lethal deleterious mutations would be eliminated, which would mitigate cancer progression; on the other hand, cells with non-deleterious mutations would become dominant, which could predispose them to cancer progression. The study of somatic alterations through computer simulations of cancer progression provides a feasible pathway for insights into the transformation of cell mechanisms in humans. During cancer progression, tumors may acquire new phenotype traits, such as the ability to invade and metastasize or to become clinically important when they develop drug resistance. Non-deleterious chromosomal alterations contribute to this progression. PMID:28723940

  7. Structural anomaly and dynamic heterogeneity in cycloether/water binary mixtures: Signatures from composition dependent dynamic fluorescence measurements and computer simulations

    NASA Astrophysics Data System (ADS)

    Indra, Sandipa; Guchhait, Biswajit; Biswas, Ranjit

    2016-03-01

    We have performed steady-state UV-visible absorption and time-resolved fluorescence measurements and computer simulations to explore the cosolvent mole fraction induced changes in structural and dynamical properties of water/dioxane (Diox) and water/tetrahydrofuran (THF) binary mixtures. Diox is a quadrupolar solvent whereas THF is a dipolar one, although both are cyclic molecules and cycloethers. The focus here is on whether these cycloethers can induce stiffening and transition of the water H-bond network structure and, if they do, whether such structural modification differentiates the chemical nature (dipolar or quadrupolar) of the cosolvent molecules. Composition dependent measured fluorescence lifetimes and rotation times of a dissolved dipolar solute (Coumarin 153, C153) suggest a cycloether mole-fraction (XTHF/Diox) induced structural transition for both of these aqueous binary mixtures in the regime 0.1 ≤ XTHF/Diox ≤ 0.2, with no specific dependence on the chemical nature. Interestingly, absorption measurements reveal stiffening of the water H-bond structure in the presence of both cycloethers at nearly the same mole fraction, XTHF/Diox ˜ 0.05. Measurements near the critical solution temperature or concentration indicate that solution criticality plays no role in the anomalous structural changes. Evidence of cycloether aggregation at very dilute concentrations has been found. Simulated radial distribution functions reflect abrupt changes in the respective peak heights at those mixture compositions around which the fluorescence measurements revealed a structural transition. Simulated water coordination numbers (for a dissolved C153) and numbers of H-bonds also exhibit minima around these cosolvent concentrations. In addition, several dynamic heterogeneity parameters have been simulated for both mixtures to explore the effects of structural transition and the chemical nature of the cosolvent on the heterogeneous dynamics of these systems.
Simulated four-point dynamic susceptibility suggests formation of clusters inducing local heterogeneity in the solution structure.
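    A radial distribution function of the kind analyzed above can be computed directly from simulation coordinates. The sketch below (a generic illustration, not the authors' analysis code) histograms minimum-image pair distances in a cubic box and normalizes by the ideal-gas shell population:

```python
import math

def radial_distribution(coords, box, r_max, dr):
    """g(r) from pair distances under the minimum-image convention,
    normalized by ideal-gas shell populations (cubic box of side `box`)."""
    n = len(coords)
    nbins = int(round(r_max / dr))
    hist = [0] * nbins
    for i in range(n):
        for j in range(i + 1, n):
            d2 = 0.0
            for a, b in zip(coords[i], coords[j]):
                dx = a - b
                dx -= box * round(dx / box)          # minimum-image convention
                d2 += dx * dx
            r = math.sqrt(d2)
            if r < r_max:
                hist[min(int(r / dr), nbins - 1)] += 2   # each pair counted for both atoms
    rho = n / box ** 3                               # number density
    return [h / (n * rho * 4.0 * math.pi * ((k + 1) ** 3 - k ** 3) * dr ** 3 / 3.0)
            for k, h in enumerate(hist)]
```

    For a uniform fluid, g(r) tends to 1 at large r; peaks mark preferred coordination shells, which is why their heights track the structural transitions discussed above.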

  8. Improving Simulated Annealing by Recasting it as a Non-Cooperative Game

    NASA Technical Reports Server (NTRS)

    Wolpert, David; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theoretic field of COllective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved "as a side-effect". Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed game-theory-motivated algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting improves simulated annealing by several orders of magnitude for spin glass relaxation and bin-packing.
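    For reference, the conventional simulated annealing that the paper takes as its starting point can be sketched as below; the COIN recasting then replaces each variable with a self-interested player, which is not reproduced here. The 1D spin chain with random couplings is a toy stand-in for the spin-glass relaxation benchmark (a 1D chain is frustration-free, so its ground-state energy is exactly -(N-1)):

```python
import math, random

def anneal(energy, state, neighbor, t0=2.0, t_min=1e-3, alpha=0.95, sweeps=100):
    """Plain simulated annealing: Metropolis acceptance with geometric cooling."""
    e = energy(state)
    best, best_e = state, e
    t = t0
    while t > t_min:
        for _ in range(sweeps):
            cand = neighbor(state)
            de = energy(cand) - e
            if de <= 0 or random.random() < math.exp(-de / t):   # Metropolis rule
                state, e = cand, e + de
                if e < best_e:
                    best, best_e = state, e
        t *= alpha                                               # geometric cooling
    return best, best_e

# Toy objective: 1D spin chain with random +/-1 couplings.
random.seed(0)
N = 20
J = [random.choice((-1.0, 1.0)) for _ in range(N - 1)]

def energy(s):
    return -sum(J[i] * s[i] * s[i + 1] for i in range(N - 1))

def flip(s):
    s = list(s)
    i = random.randrange(len(s))
    s[i] = -s[i]          # single spin flip as the move set
    return s

start = [random.choice((-1, 1)) for _ in range(N)]
spins, e = anneal(energy, start, flip)
```

    In the COIN view each spin would instead maximize a private utility shaped so that self-interested play lowers the global energy; the baseline above is what that construction modifies.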

  9. Unique migration of a dental needle into the parapharyngeal space: successful removal by an intraoral approach and simulation for tracking visibility in X-ray fluoroscopy.

    PubMed

    Okumura, Yuri; Hidaka, Hiroshi; Seiji, Kazumasa; Nomura, Kazuhiro; Takata, Yusuke; Suzuki, Takahiro; Katori, Yukio

    2015-02-01

    The first objective was to describe a novel case of migration of a broken dental needle into the parapharyngeal space. The second was to demonstrate the importance of simulation in elucidating the visibility of such a thin needle under X-ray fluoroscopy. Clinical case records (including computed tomography [CT] and surgical approaches) were reviewed, and a simulation experiment using a head phantom was conducted with the same settings applied intraoperatively. A 36-year-old man was referred after failure to locate a broken 31-G dental needle. Computed tomography revealed migration of the needle into the parapharyngeal space. Intraoperative X-ray fluoroscopy failed to identify the needle, so a steel wire was applied as a reference during X-ray to locate the foreign body. The needle was successfully removed using an intraoral approach with tonsillectomy under surgical microscopy. The simulation showed that the dental needle could be identified only after applying an appropriate compensating filter, in contrast with the steel wire. Meticulous preoperative simulation of the visual identification of dental needle foreign bodies is mandatory. Intraoperative radiography and an intraoral approach with tonsillectomy under surgical microscopy offer benefits for accessing the parapharyngeal space, specifically for cases medial to the great vessels. © The Author(s) 2014.

  10. Non-Adiabatic Molecular Dynamics Methods for Materials Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furche, Filipp; Parker, Shane M.; Muuronen, Mikko J.

    2017-04-04

    The flow of radiative energy in light-driven materials such as photosensitizer dyes or photocatalysts is governed by non-adiabatic transitions between electronic states and cannot be described within the Born-Oppenheimer approximation commonly used in electronic structure theory. The non-adiabatic molecular dynamics (NAMD) methods based on Tully surface hopping and time-dependent density functional theory developed in this project have greatly extended the range of molecular materials that can be tackled by NAMD simulations. New algorithms to compute molecular excited state and response properties efficiently were developed. Fundamental limitations of common non-linear response methods were discovered and characterized. Methods for accurate computations of vibronic spectra of materials such as black absorbers were developed and applied. It was shown that open-shell TDDFT methods capture bond breaking in NAMD simulations, a longstanding challenge for single-reference molecular dynamics simulations. The methods developed in this project were applied to study the photodissociation of acetaldehyde and revealed that non-adiabatic effects are experimentally observable in fragment kinetic energy distributions. Finally, the project enabled the first detailed NAMD simulations of photocatalytic water oxidation by titania nanoclusters, uncovering the mechanism of this fundamentally important reaction for fuel generation and storage.

  11. Optimizing isotope substitution in graphene for thermal conductivity minimization by genetic algorithm driven molecular simulations

    NASA Astrophysics Data System (ADS)

    Davies, Michael; Ganapathysubramanian, Baskar; Balasubramanian, Ganesh

    2017-03-01

    We present results from a computational framework integrating genetic algorithm and molecular dynamics simulations to systematically design isotope engineered graphene structures for reduced thermal conductivity. In addition to the effect of mass disorder, our results reveal the importance of atomic distribution on thermal conductivity for the same isotopic concentration. Distinct groups of isotope-substituted graphene sheets are identified based on the atomic composition and distribution. Our results show that in structures with equiatomic compositions, the enhanced scattering by lattice vibrations results in lower thermal conductivities due to the absence of isotopic clusters.
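    The genetic-algorithm side of such a framework can be sketched generically: genomes are 0/1 site-occupancy vectors (which lattice sites carry the heavy isotope), and in the actual study the fitness would be a thermal conductivity evaluated by molecular dynamics. The cheap analytic fitness in the usage line below is a stand-in so the sketch is self-contained:

```python
import random

def genetic_search(fitness, n_sites, pop_size=30, gens=40, p_mut=0.05, seed=1):
    """Minimal GA: tournament selection, one-point crossover, bit-flip mutation.
    `fitness` is minimized; in the paper it would be an MD-computed conductivity."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_sites)] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = sorted(pop, key=fitness)[:2]            # elitism: carry over the two best
        while len(nxt) < pop_size:
            a = min(rng.sample(pop, 3), key=fitness)  # tournament of three
            b = min(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_sites)           # one-point crossover
            child = [bit ^ (rng.random() < p_mut)     # bit-flip mutation
                     for bit in a[:cut] + b[cut:]]
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)

# Stand-in fitness: hit a target isotope count of 10 out of 20 sites.
best = genetic_search(lambda g: abs(sum(g) - 10), n_sites=20)
```

    The expensive part in practice is the fitness call; the GA loop itself is generic, which is why such frameworks couple it to an external MD engine.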

  12. Program scheme using common source lines in channel stacked NAND flash memory with layer selection by multilevel operation

    NASA Astrophysics Data System (ADS)

    Kim, Do-Bin; Kwon, Dae Woong; Kim, Seunghyun; Lee, Sang-Ho; Park, Byung-Gook

    2018-02-01

    To obtain a high channel boosting potential and reduce program disturbance in channel stacked NAND flash memory with layer selection by multilevel (LSM) operation, a new program scheme using a boosted common source line (CSL) is proposed. The proposed scheme is achieved by applying a proper bias to each layer through its own CSL. Technology computer-aided design (TCAD) simulations are performed to verify the validity of the new method in LSM. The TCAD simulations reveal that the program disturbance characteristics are effectively improved by the proposed scheme.

  13. Are metastases from metastases clinically relevant? Computer modelling of cancer spread in a case of hepatocellular carcinoma.

    PubMed

    Bethge, Anja; Schumacher, Udo; Wree, Andreas; Wedemann, Gero

    2012-01-01

    Metastasis formation remains an enigmatic process, and one of the main questions is whether metastases are able to generate further metastases. Different models have been proposed to answer this question; however, their clinical significance remains unclear. A computer model was therefore developed that permits quantitative comparison of the different models with clinical data and that additionally predicts the outcome of treatment interventions. The computer model is based on a discrete event simulation approach. On the basis of a case of an untreated patient with hepatocellular carcinoma and its multiple metastases in the liver, it was evaluated whether metastases are able to metastasise and, in particular, whether late disseminated tumour cells are still capable of forming metastases. Additionally, the resection of the primary tumour was simulated. The simulation results were compared with clinical data. The results reveal that the number of metastases varies significantly between scenarios in which metastases metastasise and scenarios in which they do not. In contrast, the total tumour mass is nearly unaffected by the two different modes of metastasis formation. Furthermore, the results provide evidence that metastasis formation is an early event and that late disseminated tumour cells are still capable of forming metastases. The simulations also allow estimation of how long resection of the primary tumour delays the patient's death. The simulation results indicate that, for this particular case of hepatocellular carcinoma, late metastases, i.e., metastases from metastases, are irrelevant in terms of total tumour mass; hence metastases seeded from metastases are clinically irrelevant in our model system. Only the first metastases seeded from the primary tumour contribute significantly to the tumour burden and thus cause the patient's death.
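    The two competing hypotheses can be illustrated with a deliberately simplified, deterministic growth-and-seeding model (not the authors' discrete event simulator): every lesion grows logistically and sheds new lesions at a rate proportional to its cell count, with a switch controlling whether only the primary seeds. All parameter values here are arbitrary illustrations:

```python
def simulate(t_end, dt=1.0, r=0.1, K=1e6, shed=2e-6, allow_secondary=True):
    """Toy comparison of the seeding hypotheses: if allow_secondary is False,
    only the primary tumour (index 0) sheds new metastases."""
    lesions = [1.0]                       # cell counts; index 0 is the primary
    for _ in range(int(t_end / dt)):
        seeds = sum(int(shed * n * dt)    # deterministic expected seeding per step
                    for i, n in enumerate(lesions) if i == 0 or allow_secondary)
        lesions = [n + r * n * (1.0 - n / K) * dt for n in lesions]  # logistic growth
        lesions += [1.0] * seeds          # each seed starts as a single cell
    return len(lesions) - 1, sum(lesions)  # (metastasis count, total tumour mass)
```

    Because secondary lesions only add seeds on top of an unchanged primary trajectory, the metastasis count with secondary seeding enabled is never lower than without it, mirroring the paper's comparison of the two scenarios.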

  14. Experimentally valid predictions of muscle force and EMG in models of motor-unit function are most sensitive to neural properties.

    PubMed

    Keenan, Kevin G; Valero-Cuevas, Francisco J

    2007-09-01

    Computational models of motor-unit populations are the objective implementations of the hypothesized mechanisms by which neural and muscle properties give rise to electromyograms (EMGs) and force. However, the variability/uncertainty of the parameters used in these models--and how they affect predictions--confounds assessing these hypothesized mechanisms. We perform a large-scale computational sensitivity analysis on the state-of-the-art computational model of surface EMG, force, and force variability by combining a comprehensive review of published experimental data with Monte Carlo simulations. To exhaustively explore model performance and robustness, we ran numerous iterative simulations each using a random set of values for nine commonly measured motor neuron and muscle parameters. Parameter values were sampled across their reported experimental ranges. Convergence after 439 simulations found that only 3 simulations met our two fitness criteria: approximating the well-established experimental relations for the scaling of EMG amplitude and force variability with mean force. An additional 424 simulations preferentially sampling the neighborhood of those 3 valid simulations converged to reveal 65 additional sets of parameter values for which the model predictions approximate the experimentally known relations. We find the model is not sensitive to muscle properties but very sensitive to several motor neuron properties--especially peak discharge rates and recruitment ranges. Therefore to advance our understanding of EMG and muscle force, it is critical to evaluate the hypothesized neural mechanisms as implemented in today's state-of-the-art models of motor unit function. We discuss experimental and analytical avenues to do so as well as new features that may be added in future implementations of motor-unit models to improve their experimental validity.
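    The Monte Carlo screening strategy described above, sampling parameters uniformly across their reported experimental ranges and keeping only the sets whose model output meets the fitness criteria, can be sketched as follows. The model and parameter names are hypothetical stand-ins, not the actual motor-unit model:

```python
import random

def monte_carlo_screen(model, ranges, criterion, n=2000, seed=0):
    """Sample parameter sets uniformly over their experimental ranges and
    keep those whose model output satisfies the fitness criterion."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n):
        params = {k: rng.uniform(lo, hi) for k, (lo, hi) in ranges.items()}
        if criterion(model(params)):
            accepted.append(params)
    return accepted

# Hypothetical stand-in model: output depends strongly on a "neural" parameter
# and only weakly on a "muscle" parameter.
ranges = {"peak_rate": (20.0, 50.0), "fibre_length": (0.8, 1.2)}
model = lambda p: p["peak_rate"] * (1.0 + 0.05 * (p["fibre_length"] - 1.0))
ok = monte_carlo_screen(model, ranges, lambda y: 30.0 <= y <= 35.0)
```

    Sensitivity then shows up in the accepted sets: the influential parameter is confined to a narrow sub-range, while the insensitive one still spans nearly its full range, which is the pattern the study reports for neural versus muscle properties.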

  15. Dynamics of Numerics & Spurious Behaviors in CFD Computations. Revised

    NASA Technical Reports Server (NTRS)

    Yee, Helen C.; Sweby, Peter K.

    1997-01-01

    The global nonlinear behavior of finite discretizations for constant time steps and fixed or adaptive grid spacings is studied using tools from dynamical systems theory. Detailed analysis of commonly used temporal and spatial discretizations for simple model problems is presented. The role of dynamics in understanding the long-time behavior of numerical integration, and the nonlinear stability, convergence, and reliability of time-marching approaches for obtaining steady-state numerical solutions in computational fluid dynamics (CFD), is explored. The study is complemented with examples of spurious behavior observed in steady and unsteady CFD computations. The CFD examples were chosen to illustrate non-apparent spurious behavior that was difficult to detect without extensive grid and temporal refinement studies and some knowledge of dynamical systems theory. These studies revealed the various dangers of misinterpreting numerical simulations of realistic complex flows that are constrained by available computing power. In large-scale computations, where the physics of the problem under study is not well understood and numerical simulations are the only viable means of solution, extreme care must be taken in both the computation and the interpretation of the numerical data. The goal of this paper is to explore the important role that dynamical systems theory can play in understanding the global nonlinear behavior of numerical algorithms and in identifying the sources of numerical uncertainty in CFD.

  16. A molecular dynamics approach for predicting the glass transition temperature and plasticization effect in amorphous pharmaceuticals.

    PubMed

    Gupta, Jasmine; Nunes, Cletus; Jonnalagadda, Sriramakamal

    2013-11-04

    The objectives of this study were: (i) to develop an in silico technique, based on molecular dynamics (MD) simulations, to predict the glass transition temperatures (Tg) of amorphous pharmaceuticals; (ii) to computationally study the effect of a plasticizer on Tg; and (iii) to investigate intermolecular interactions using the radial distribution function (RDF). Amorphous sucrose and water were selected as the model compound and plasticizer, respectively. MD simulations were performed using the COMPASS force field and the isothermal-isobaric ensemble. The specific volumes of the amorphous cells were computed over the temperature range 440-265 K. The characteristic "kink" observed in the volume-temperature curves, in conjunction with regression analysis, defined the Tg. The MD-computed Tg values were 367 K, 352 K and 343 K for amorphous sucrose containing 0%, 3% and 5% w/w water, respectively. The MD technique thus effectively simulated the plasticizing effect of water, and the corresponding Tg values were in reasonable agreement with theoretical models and literature reports. The RDF measurements revealed strong hydrogen-bond interactions between the sucrose hydroxyl oxygens and the water oxygen. Steric effects led to weak interactions between the sucrose acetal oxygens and the water oxygen. MD is thus a powerful predictive tool for probing temperature and water effects on the stability of amorphous systems during drug development.
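    The "kink" analysis can be illustrated with a two-segment least-squares fit: every candidate breakpoint splits the volume-temperature data into two straight lines, and the breakpoint with the smallest total squared residual marks Tg. This is a generic sketch with synthetic data, not the authors' regression procedure:

```python
def find_kink(T, V):
    """Two-segment linear fit: return the temperature whose breakpoint
    minimizes the combined squared residual of the two line fits."""
    def sse(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)          # least-squares slope
        return sum((y - my - b * (x - mx)) ** 2 for x, y in zip(xs, ys))
    k = min(range(2, len(T) - 2),                   # keep >= 3 points per segment
            key=lambda k: sse(T[:k + 1], V[:k + 1]) + sse(T[k:], V[k:]))
    return T[k]

# Synthetic V-T data: shallow slope below a 360 K "Tg", steeper slope above.
T = list(range(265, 445, 5))
V = [0.60 + 0.0002 * (t - 265) if t <= 360
     else 0.60 + 0.0002 * 95 + 0.0008 * (t - 360) for t in T]
```

    On real simulation output the two branches are noisy rather than exactly linear, so the residual minimum is shallow and the regression windows matter; the principle is the same.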

  17. An investigation of the inelastic behaviour of trabecular bone during the press-fit implantation of a tibial component in total knee arthroplasty.

    PubMed

    Kelly, N; Cawley, D T; Shannon, F J; McGarry, J P

    2013-11-01

    The stress distribution and plastic deformation of peri-prosthetic trabecular bone during press-fit tibial component implantation in total knee arthroplasty are investigated using experimental and finite element techniques. It is revealed that the computed stress distribution, implantation force and plastic deformation in the trabecular bone are highly dependent on the plasticity formulation implemented. By incorporating pressure dependent yielding using a crushable foam plasticity formulation to simulate the trabecular bone during implantation, highly localised stress concentrations and plastic deformation are computed at the bone-implant interface. If pressure dependent yielding is neglected, as in a traditional von Mises plasticity formulation, a significantly different stress distribution and implantation force are computed in the peri-prosthetic trabecular bone. The results of the study highlight the importance of: (i) simulating the insertion process of press-fit stem implantation; (ii) implementing a pressure dependent plasticity formulation, such as the crushable foam plasticity formulation, for the trabecular bone; and (iii) incorporating friction at the implant-bone interface during stem insertion. Simulation of the press-fit implantation process with an appropriate pressure dependent plasticity formulation should be implemented in the design and assessment of arthroplasty prostheses. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.

  18. MODPATH-LGR; documentation of a computer program for particle tracking in shared-node locally refined grids by using MODFLOW-LGR

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, R.T.; Mehl, Steffen W.; Hill, Mary C.

    2011-01-01

    The computer program described in this report, MODPATH-LGR, is designed to allow simulation of particle tracking in locally refined grids. The locally refined grids are simulated by using MODFLOW-LGR, which is based on MODFLOW-2005, the three-dimensional groundwater-flow model published by the U.S. Geological Survey. The documentation includes brief descriptions of the methods used and detailed descriptions of the required input files and how the output files are typically used. The code for this model is available for downloading from the World Wide Web from a U.S. Geological Survey software repository. The repository is accessible from the U.S. Geological Survey Water Resources Information Web page at http://water.usgs.gov/software/ground_water.html. The performance of the MODPATH-LGR program has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program by using the email address available on the Web site. Updates might occasionally be made to this document and to the MODPATH-LGR program, and users should check the Web site periodically.

  19. Computationally Guided Design of Polymer Electrolytes for Battery Applications

    NASA Astrophysics Data System (ADS)

    Wang, Zhen-Gang; Webb, Michael; Savoie, Brett; Miller, Thomas

    We develop an efficient computational framework for guiding the design of polymer electrolytes for Li battery applications. Short-timescale molecular dynamics (MD) simulations are employed to identify key structural and dynamic features in the solvation and motion of Li ions, such as the structure of the solvation shells, the spatial distribution of solvation sites, and the polymer segmental mobility. Comparative studies on six polyester-based polymers and polyethylene oxide (PEO) yield good agreement with experimental data on the ion conductivities, and reveal significant differences in the ion diffusion mechanism between PEO and the polyesters. The molecular insights from the MD simulations are used to build a chemically specific coarse-grained model in the spirit of the dynamic bond percolation model of Druger, Ratner and Nitzan. We apply this coarse-grained model to characterize Li ion diffusion in several existing and yet-to-be synthesized polyethers that differ by oxygen content and backbone stiffness. Good agreement is obtained between the predictions of the coarse-grained model and long-timescale atomistic MD simulations, thus providing validation of the model. Our study predicts higher Li ion diffusivity in poly(trimethylene oxide-alt-ethylene oxide) than in PEO. These results demonstrate the potential of this computational framework for rapid screening of new polymer electrolytes based on ion diffusivity.

  20. Efficiency of the neighbor-joining method in reconstructing deep and shallow evolutionary relationships in large phylogenies.

    PubMed

    Kumar, S; Gadagkar, S R

    2000-12-01

    The neighbor-joining (NJ) method is widely used in reconstructing large phylogenies because of its computational speed and its high accuracy in phylogenetic inference, as revealed in computer simulation studies. However, most computer simulation studies have quantified the overall performance of the NJ method in terms of the percentage of branches inferred correctly or the percentage of replications in which the correct tree is recovered. We have examined other aspects of its performance, such as the relative efficiency in correctly reconstructing shallow (close to the external branches of the tree) and deep branches in large phylogenies; the contribution of zero-length branches to topological errors in the inferred trees; and the influence of increasing the tree size (number of sequences), evolutionary rate, and sequence length on the efficiency of the NJ method. Results show that the correct reconstruction of deep branches is no more difficult than that of shallower branches. The presence of zero-length branches in realized trees contributes significantly to the overall error observed in the NJ tree, especially in large phylogenies or slowly evolving genes. Furthermore, tree size does not influence the efficiency of NJ in reconstructing shallow and deep branches in our simulation study, in which the evolutionary process is assumed to be homogeneous in all lineages.
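    For reference, the NJ join criterion evaluated in such studies can be sketched compactly: at each step the pair minimizing Q(i, j) = (n - 2) d(i, j) - r(i) - r(j) is joined. The toy below omits branch-length estimation and uses a small hand-made additive distance matrix (illustrative, not the study's data):

```python
import itertools

def neighbor_joining(D, labels):
    """Minimal neighbor-joining: repeatedly join the pair minimizing the
    Q-criterion, returning a nested-tuple tree topology (no branch lengths)."""
    D = {frozenset(p): d for p, d in D.items()}
    nodes = list(labels)
    while len(nodes) > 2:
        n = len(nodes)
        r = {i: sum(D[frozenset((i, j))] for j in nodes if j != i) for i in nodes}
        # Q(i, j) = (n - 2) d(i, j) - r(i) - r(j); join the minimizing pair
        i, j = min(itertools.combinations(nodes, 2),
                   key=lambda p: (n - 2) * D[frozenset(p)] - r[p[0]] - r[p[1]])
        new = (i, j)
        for k in nodes:
            if k not in (i, j):
                D[frozenset((new, k))] = 0.5 * (D[frozenset((i, k))]
                                                + D[frozenset((j, k))]
                                                - D[frozenset((i, j))])
        nodes = [k for k in nodes if k not in (i, j)] + [new]
    return tuple(nodes)

# Additive toy distances consistent with the topology ((A,B),(C,D))
dist = {("A", "B"): 2, ("A", "C"): 6, ("A", "D"): 6,
        ("B", "C"): 6, ("B", "D"): 6, ("C", "D"): 2}
print(neighbor_joining(dist, ["A", "B", "C", "D"]))  # → (('A', 'B'), ('C', 'D'))
```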

  1. Site Identification by Ligand Competitive Saturation (SILCS) simulations for fragment-based drug design.

    PubMed

    Faller, Christina E; Raman, E Prabhu; MacKerell, Alexander D; Guvench, Olgun

    2015-01-01

    Fragment-based drug design (FBDD) involves screening low molecular weight molecules ("fragments") that correspond to functional groups found in larger drug-like molecules to determine their binding to target proteins or nucleic acids. Based on the principle of thermodynamic additivity, two fragments that bind nonoverlapping nearby sites on the target can be combined to yield a new molecule whose binding free energy is the sum of those of the fragments. Experimental FBDD approaches, like NMR and X-ray crystallography, have proven very useful but can be expensive in terms of time, materials, and labor. Accordingly, a variety of computational FBDD approaches have been developed that provide different levels of detail and accuracy. The Site Identification by Ligand Competitive Saturation (SILCS) method of computational FBDD uses all-atom explicit-solvent molecular dynamics (MD) simulations to identify fragment binding. The target is "soaked" in an aqueous solution with multiple fragments having different identities. The resulting computational competition assay reveals what small molecule types are most likely to bind which regions of the target. From SILCS simulations, 3D probability maps of fragment binding called "FragMaps" can be produced. Based on the probabilities relative to bulk, SILCS FragMaps can be used to determine "Grid Free Energies (GFEs)," which provide per-atom contributions to fragment binding affinities. For essentially no additional computational overhead relative to the production of the FragMaps, GFEs can be used to compute Ligand Grid Free Energies (LGFEs) for arbitrarily complex molecules, and these LGFEs can be used to rank-order the molecules in accordance with binding affinities.
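    The FragMap-to-GFE conversion described above is, at its core, a Boltzmann inversion of voxel occupancies relative to bulk, and an LGFE sums the GFE contributions at the voxels a ligand's atoms occupy. The sketch below uses hypothetical occupancy counts; the actual SILCS normalization and atom-typing details are more involved:

```python
import math

KT = 0.593  # kcal/mol at ~298 K

def grid_free_energies(counts, bulk_count):
    """Boltzmann-invert fragment occupancy counts into per-voxel grid
    free energies: GFE = -kT * ln(count / bulk_count)."""
    return {v: -KT * math.log(c / bulk_count)
            for v, c in counts.items() if c > 0}

def ligand_gfe(atom_voxels, gfe):
    """LGFE: sum of GFE contributions at the voxels a ligand occupies
    (voxels absent from the map contribute nothing in this sketch)."""
    return sum(gfe.get(v, 0.0) for v in atom_voxels)

# Hypothetical FragMap counts: voxel (1,2,3) is 4x enriched over bulk,
# voxel (0,0,0) is 4x depleted
counts = {(1, 2, 3): 400, (1, 2, 4): 100, (0, 0, 0): 25}
gfe = grid_free_energies(counts, bulk_count=100)
print(round(ligand_gfe([(1, 2, 3), (0, 0, 0)], gfe), 3))
```

    Enriched voxels get negative (favorable) GFEs and depleted voxels positive ones, so the two hypothetical contributions above cancel.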

  2. Computation material science of structural-phase transformation in casting aluminium alloys

    NASA Astrophysics Data System (ADS)

    Golod, V. M.; Dobosh, L. Yu

    2017-04-01

    Successive stages of computer simulation of casting-microstructure formation under non-equilibrium crystallization conditions in multicomponent aluminum alloys are presented. On the basis of computational thermodynamics and of heat transfer during solidification of macroscale shaped castings, the boundary conditions of local heat exchange are specified for mesoscale modeling of non-equilibrium solid-phase formation and of component redistribution between phases during coalescence of secondary dendrite branches. Computer analysis of structural-phase transitions is based on the principle of the additive physico-chemical effect of the alloy components during the diffusional-capillary morphological evolution of the dendrite structure, and on the local dendrite heterogeneity, whose stochastic nature and extent are revealed by metallographic study and by Monte Carlo modeling. The integrated computational materials science tools for alloy research are focused on and applied to analysis of the multiple-factor system of casting processes and prediction of the casting microstructure.

  3. Computer-based simulation training to improve learning outcomes in mannequin-based simulation exercises.

    PubMed

    Curtin, Lindsay B; Finn, Laura A; Czosnowski, Quinn A; Whitman, Craig B; Cawley, Michael J

    2011-08-10

    To assess the impact of computer-based simulation on the achievement of student learning outcomes during mannequin-based simulation. Participants were randomly assigned to rapid response teams of 5-6 students, and teams were then randomly assigned to complete either the computer-based or the mannequin-based simulation cases first. In both simulations, students used their critical thinking skills and selected interventions independent of facilitator input. A predetermined rubric was used to record and assess students' performance in the mannequin-based simulations. Feedback and student performance scores were generated by the software in the computer-based simulations. More of the teams in the group that completed the computer-based simulation before the mannequin-based simulation achieved the primary outcome for the exercise, which was survival of the simulated patient (41.2% vs. 5.6%). The majority of students (>90%) recommended the continuation of simulation exercises in the course. Students in both groups felt the computer-based simulation should be completed prior to the mannequin-based simulation. The use of computer-based simulation prior to mannequin-based simulation improved the achievement of learning goals and outcomes. In addition to improving participants' skills, completing the computer-based simulation first may improve participants' confidence during the more real-life setting achieved in the mannequin-based simulation.

  4. Developability assessment of clinical drug products with maximum absorbable doses.

    PubMed

    Ding, Xuan; Rose, John P; Van Gelder, Jan

    2012-05-10

    Maximum absorbable dose refers to the maximum amount of an orally administered drug that can be absorbed in the gastrointestinal tract. Maximum absorbable dose, or D(abs), has proved to be an important parameter for quantifying the absorption potential of drug candidates. The purpose of this work is to validate the use of D(abs) in a developability assessment context, and to establish an appropriate protocol and interpretation criteria for this application. Three methods for calculating D(abs) were compared by assessing how well the methods predicted the absorption limit for a set of real clinical candidates. D(abs) was calculated for these clinical candidates by means of a simple equation and two computer simulation programs, GastroPlus and a program developed at Eli Lilly and Company. Results from single dose escalation studies in Phase I clinical trials were analyzed to identify the maximum absorbable doses for these compounds. Compared to the clinical results, the equation and both simulation programs provide conservative estimates of D(abs), but in general the D(abs) values from the computer simulations are more accurate, giving the simulations a clear advantage for developability assessment. Computer simulations also revealed the complex behavior associated with absorption saturation and suggested that in most cases the D(abs) limit is not likely to be reached in a typical clinical dose range. On the basis of the validation findings, an approach is proposed for assessing absorption potential, and best practices are discussed for the use of D(abs) estimates to inform clinical formulation development strategies. Copyright © 2012 Elsevier B.V. All rights reserved.
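    The "simple equation" is not spelled out in the abstract; a commonly cited textbook form of maximum absorbable dose is the product of aqueous solubility, the absorption rate constant, the small-intestinal fluid volume, and the transit time. The sketch below assumes that form, with illustrative defaults (250 mL fluid volume, 270 min transit); none of these values are taken from the study:

```python
def max_absorbable_dose(solubility_mg_per_ml, ka_per_min,
                        volume_ml=250.0, transit_min=270.0):
    """Textbook-style estimate (assumed form, not necessarily the
    study's equation): D_abs = S * ka * V_SI * T_SI."""
    return solubility_mg_per_ml * ka_per_min * volume_ml * transit_min

# Hypothetical compound: solubility 0.01 mg/mL, ka 0.05 min^-1
print(round(max_absorbable_dose(0.01, 0.05), 2))  # → 33.75 (mg)
```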

  5. Activation pathway of Src kinase reveals intermediate states as novel targets for drug design

    PubMed Central

    Shukla, Diwakar; Meng, Yilin; Roux, Benoît; Pande, Vijay S.

    2014-01-01

    Unregulated activation of Src kinases leads to aberrant signaling, uncontrolled growth, and differentiation of cancerous cells. Reaching a complete mechanistic understanding of large scale conformational transformations underlying the activation of kinases could greatly help in the development of therapeutic drugs for the treatment of these pathologies. In principle, the nature of conformational transition could be modeled in silico via atomistic molecular dynamics simulations, although this is very challenging due to the long activation timescales. Here, we employ a computational paradigm that couples transition pathway techniques and Markov state model-based massively distributed simulations for mapping the conformational landscape of c-src tyrosine kinase. The computations provide the thermodynamics and kinetics of kinase activation for the first time, and help identify key structural intermediates. Furthermore, the presence of a novel allosteric site in an intermediate state of c-src that could be potentially utilized for drug design is predicted. PMID:24584478
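    The paper's pipeline (transition-path techniques plus massively distributed simulations) is far beyond a snippet, but the Markov-state-model step at its core reduces to counting lagged transitions between discretized conformational states, then reading thermodynamics off the stationary distribution and kinetics off the transition probabilities. A minimal sketch on a hypothetical two-state toy trajectory:

```python
import numpy as np

def msm_from_traj(traj, n_states, lag=1):
    """Estimate a simple Markov state model from a discrete trajectory:
    row-stochastic transition matrix T from lagged transition counts
    (assumes every state is visited), plus its stationary distribution."""
    C = np.zeros((n_states, n_states))
    for a, b in zip(traj[:-lag], traj[lag:]):
        C[a, b] += 1.0
    T = C / C.sum(axis=1, keepdims=True)
    # Stationary distribution: left eigenvector of T for the top eigenvalue
    w, v = np.linalg.eig(T.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    return T, pi / pi.sum()

# Hypothetical trajectory (state 0 = inactive-like, 1 = intermediate-like)
traj = [0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0]
T, pi = msm_from_traj(traj, 2)
print(T[0, 0])  # probability of remaining in state 0 per lag step
print(pi)       # equilibrium populations (thermodynamics)
```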

  6. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery.

    PubMed

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level.

  7. Quantum annealing versus classical machine learning applied to a simplified computational biology problem

    PubMed Central

    Li, Richard Y.; Di Felice, Rosa; Rohs, Remo; Lidar, Daniel A.

    2018-01-01

    Transcription factors regulate gene expression, but how these proteins recognize and specifically bind to their DNA targets is still debated. Machine learning models are effective means to reveal interaction mechanisms. Here we studied the ability of a quantum machine learning approach to predict binding specificity. Using simplified datasets of a small number of DNA sequences derived from actual binding affinity experiments, we trained a commercially available quantum annealer to classify and rank transcription factor binding. The results were compared to state-of-the-art classical approaches for the same simplified datasets, including simulated annealing, simulated quantum annealing, multiple linear regression, LASSO, and extreme gradient boosting. Despite technological limitations, we find a slight advantage in classification performance and nearly equal ranking performance using the quantum annealer for these fairly small training data sets. Thus, we propose that quantum annealing might be an effective method to implement machine learning for certain computational biology problems. PMID:29652405

  8. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery

    PubMed Central

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level. PMID:26963526

  9. Performance analysis of a parallel Monte Carlo code for simulating solar radiative transfer in cloudy atmospheres using CUDA-enabled NVIDIA GPU

    NASA Astrophysics Data System (ADS)

    Russkova, Tatiana V.

    2017-11-01

    One tool to improve the performance of Monte Carlo methods for numerical simulation of light transport in the Earth's atmosphere is parallel technology. A new algorithm oriented to parallel execution on a CUDA-enabled NVIDIA graphics processor is discussed. The efficiency of parallelization is analyzed on the basis of calculating the upward and downward fluxes of solar radiation for both vertically homogeneous and inhomogeneous models of the atmosphere. The results of testing the new code under various atmospheric conditions, including continuous single-layered and multilayered clouds and selective molecular absorption, are presented. The results of testing the code using video cards with different compute capability are analyzed. It is shown that the changeover of computing from conventional PCs to the architecture of graphics processors gives more than a hundredfold increase in performance and fully reveals the capabilities of the technology used.
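    The abstract does not detail the algorithm, but the per-photon kernel that such codes parallelize (one GPU thread per photon) can be sketched serially. This toy traces photons through a homogeneous plane-parallel slab with isotropic scattering; it is a generic illustration, not the paper's model:

```python
import math
import random

def transmitted_fraction(tau_star, omega0, n_photons=20000, seed=1):
    """Monte Carlo estimate of the fraction of photons transmitted through
    a homogeneous plane-parallel slab of optical depth tau_star, with
    single-scattering albedo omega0 and isotropic scattering (toy model).
    mu is the direction cosine along the slab normal."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        tau, mu = 0.0, 1.0                 # enter at the top, heading down
        while True:
            # Sample free path to the next event; advance vertical optical depth
            tau += mu * -math.log(1.0 - rng.random())
            if tau >= tau_star:            # exited through the bottom
                transmitted += 1
                break
            if tau < 0.0:                  # escaped back out the top
                break
            if rng.random() > omega0:      # absorbed
                break
            mu = 2.0 * rng.random() - 1.0  # isotropic scattering direction
    return transmitted / n_photons

print(transmitted_fraction(0.0, 1.0))  # zero optical depth: all pass → 1.0
```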

  10. Using Computational Cognitive Modeling to Diagnose Possible Sources of Aviation Error

    NASA Technical Reports Server (NTRS)

    Byrne, M. D.; Kirlik, Alex

    2003-01-01

    We present a computational model of a closed-loop pilot-aircraft-visual scene-taxiway system created to shed light on possible sources of taxi error. Creating the cognitive aspects of the model using ACT-R required us to conduct studies with subject matter experts to identify the experiential adaptations pilots bring to taxiing. Five decision strategies were found, ranging from cognitively intensive but precise to fast and frugal but robust. We provide evidence for the model by comparing its behavior to a NASA Ames Research Center simulation of Chicago O'Hare surface operations. Decision horizons were highly variable; the model selected the most accurate strategy given the time available. We found a signature in the simulation data of the use of globally robust heuristics to cope with short decision horizons, as revealed by errors occurring most frequently at atypical taxiway geometries or clearance routes. These data provided empirical support for the model.

  11. Computational model of polarized actin cables and cytokinetic actin ring formation in budding yeast

    PubMed Central

    Tang, Haosu; Bidone, Tamara C.

    2015-01-01

    The budding yeast actin cables and contractile ring are important for polarized growth and division, revealing basic aspects of cytoskeletal function. To study these formin-nucleated structures, we built a 3D computational model with actin filaments represented as beads connected by springs. Polymerization by formins at the bud tip and bud neck, crosslinking, severing, and myosin pulling, are included. Parameter values were estimated from prior experiments. The model generates actin cable structures and dynamics similar to those of wild type and formin deletion mutant cells. Simulations with increased polymerization rate result in long, wavy cables. Simulated pulling by type V myosin stretches actin cables. Increasing the affinity of actin filaments for the bud neck together with reduced myosin V pulling promotes the formation of a bundle of antiparallel filaments at the bud neck, which we suggest as a model for the assembly of actin filaments to the contractile ring. PMID:26538307

  12. A computer simulation of the plasma leakage through a vascular prosthesis made of expanded polytetrafluoroethylene.

    PubMed

    Tabata, R; Kobayashi, T; Mori, A; Matsuno, S; Watarida, S; Onoe, M; Sugita, T; Shiraisi, S; Nojima, T

    1993-04-01

    We explored the blood-retaining mechanism of a vascular prosthesis made of expanded polytetrafluoroethylene through analysis of its structure and physicochemical properties. Plasma leakage through this vascular prosthesis was simulated by computer to explore its etiology. These examinations disclosed that leakage is dependent upon the inner pressure and the density of fibers. In other words, the study revealed that the mean distance between fibers constituting the wall of the expanded polytetrafluoroethylene vascular prosthesis is increased by tension (that is, inner pressure), resulting in an increased probability of leakage. It was additionally found that a thin membrane is formed on the polytetrafluoroethylene surface if blood in contact with the surface is dried. This membrane was found to reduce the water-repelling property of polytetrafluoroethylene and to make it impossible to preserve the inter-fiber liquid surface, thus causing leakage through the expanded polytetrafluoroethylene vascular prosthesis.

  13. Conformational Heterogeneity of Bax Helix 9 Dimer for Apoptotic Pore Formation

    NASA Astrophysics Data System (ADS)

    Liao, Chenyi; Zhang, Zhi; Kale, Justin; Andrews, David W.; Lin, Jialing; Li, Jianing

    2016-07-01

    Helix α9 of Bax protein can dimerize in the mitochondrial outer membrane (MOM) and lead to apoptotic pores. However, it remains unclear how different conformations of the dimer contribute to pore formation on the molecular level. We have therefore investigated various conformational states of the α9 dimer in a MOM model using computer simulations supplemented with site-specific mutagenesis and crosslinking of the α9 helices. Our data not only confirmed that the membrane environment is critical for α9 stability and dimerization, but also revealed the distinct lipid-binding preferences of the dimer in different conformational states. In our proposed pathway, a crucial iso-parallel dimer that mediates the conformational transition was discovered computationally and validated experimentally. The corroborating evidence from simulations and experiments suggests that helix α9 assists Bax activation via dimer heterogeneity and interactions with specific MOM lipids, which eventually facilitate proteolipidic pore formation in apoptosis regulation.

  14. A fractional calculus perspective of distributed propeller design

    NASA Astrophysics Data System (ADS)

    Tenreiro Machado, J.; Galhano, Alexandra M.

    2018-02-01

    A new generation of aircraft with distributed propellers leads to operational performances superior to those exhibited by standard designs. Computational simulations and experimental tests show a reduction of fuel consumption and noise. This paper proposes an analogy between aerodynamics and electrical circuits. The model reveals properties similar to those of fractional-order systems and gives a deeper insight into the dynamics of multi-propeller coupling.

  15. Infection Threshold for an Epidemic Model in Site and Bond Percolation Worlds

    NASA Astrophysics Data System (ADS)

    Sakisaka, Yukio; Yoshimura, Jin; Takeuchi, Yasuhiro; Sugiura, Koji; Tainaka, Kei-ichi

    2010-02-01

    We investigate an epidemic model on a square lattice with two protection treatments: prevention and quarantine. To explore the effects of both treatments, we apply the site and bond percolations. Computer simulations reveal that the threshold between endemic and disease-free phases can be represented by a single scaling law. The mean-field theory qualitatively predicts such infection dynamics and the scaling law.
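    The site/bond setup above can be illustrated with a toy lattice outbreak model: sites are susceptible with probability p_site (prevention removes sites) and each transmission attempt across a bond succeeds with probability p_bond (quarantine removes bonds); below the combined threshold the infected cluster stays finite. The parameters and dynamics here are illustrative, not the paper's exact model:

```python
import random

def outbreak_size(L, p_site, p_bond, seed=0):
    """Toy SIR-style outbreak on an L x L lattice. A site is susceptible
    with probability p_site (site percolation); each transmission attempt
    across a lattice bond succeeds with probability p_bond (bond
    percolation). Returns the fraction of the lattice ever infected."""
    rng = random.Random(seed)
    alive = {(x, y) for x in range(L) for y in range(L)
             if rng.random() < p_site}
    start = (L // 2, L // 2)
    alive.add(start)                      # index case is always present
    infected, frontier = {start}, [start]
    while frontier:
        x, y = frontier.pop()
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nb in alive and nb not in infected and rng.random() < p_bond:
                infected.add(nb)
                frontier.append(nb)
    return len(infected) / (L * L)

# Illustrative parameters on either side of the threshold
print(outbreak_size(101, 0.95, 0.2))  # sub-threshold: outbreak stays tiny
print(outbreak_size(101, 0.95, 0.9))  # super-threshold: spans the lattice
```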

  16. Investigation of the effect of reducing scan resolution on simulated information-augmented sawing

    Treesearch

    Suraphan Thawornwong; Luis G. Occena; Daniel L. Schmoldt

    2000-01-01

    In the past few years, computed tomography (CT) scanning technology has been applied to the detection of internal defects in hardwood logs for the purpose of obtaining a priori information that can be used to arrive at better log breakdown or sawing decisions. Since today sawyers cannot even see the inside of the log until the log faces are revealed...

  17. Lumber value differences from reduced CT spatial resolution and simulated log sawing

    Treesearch

    Suraphan Thawornwong; Luis G. Occena; Daniel L. Schmoldt

    2003-01-01

    In the past few years, computed tomography (CT) scanning technology has been applied to the detection of internal defects in hardwood logs for the purpose of obtaining a priori information that can be used to arrive at better log sawing decisions. Because sawyers currently cannot even see the inside of a log until the log faces are revealed by sawing, there is little...

  18. A Numerical Study on Microwave Coagulation Therapy

    DTIC Science & Technology

    2013-01-01

    hepatocellular carcinoma (small size liver tumor). Through extensive numerical simulations, we reveal the mathematical relationships between some critical parameters in the therapy, including input power, frequency, temperature, and regions of impact. It is shown that these relationships can be approximated using simple polynomial functions. Compared to solutions of partial differential equations, these functions are significantly easier to compute and simpler to analyze for engineering design and clinical

  19. Ratio of produced gas to produced water from DOE's EDNA Delcambre No. 1 geopressured-geothermal aquifer gas well test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, L.A.; Randolph, P.L.

    1979-01-01

    A paper presented by the Institute of Gas Technology (IGT) at the Third Geopressured-Geothermal Energy Conference hypothesized that the high ratio of produced gas to produced water from the No. 1 sand in the Edna Delcambre No. 1 well was due to free gas trapped in pores by imbibition over geological time. This hypothesis was examined in relation to preliminary test data which reported only average gas to water ratios over the roughly 2-day steps in flow rate. Subsequent public release of detailed test data revealed substantial departures from the previously reported computer simulation results. Also, data now in the public domain reveal the existence of a gas cap on the aquifer tested. This paper describes IGT's efforts to match the observed gas/water production with computer simulation. Two models for the occurrence and production of gas in excess of that dissolved in the brine have been used. One model considers the gas to be dispersed in pores by imbibition, and the other model considers the gas as a nearby free gas cap above the aquifer. The studies revealed that the dispersed gas model characteristically gave the wrong shape to the gas/water ratio plots, such that no reasonable match to the flow data could be achieved. The free gas cap model gave a characteristically better shape to the production plots and could provide an approximate fit to the data if the edge of the free gas cap is only about 400 feet from the well. Because the geological structure maps indicate the free gas cap to be several thousand feet away and the computer simulation results match the distance to the nearby Delcambre Nos. 4 and 4A wells, it appears that the source of the excess free gas in the test of the No. 1 sand may be from these nearby wells. The gas source is probably a separate gas zone and is brought into contact with the No. 1 sand via a conduit around the No. 4 well.

  20. Close encounters with DNA

    PubMed Central

    Maffeo, C.; Yoo, J.; Comer, J.; Wells, D. B.; Luan, B.; Aksimentiev, A.

    2014-01-01

    Over the past ten years, the all-atom molecular dynamics method has grown in the scale of both systems and processes amenable to it and in its ability to make quantitative predictions about the behavior of experimental systems. The field of computational DNA research is no exception, witnessing a dramatic increase in the size of systems simulated with atomic resolution, the duration of individual simulations and the realism of the simulation outcomes. In this topical review, we describe the hallmark physical properties of DNA from the perspective of all-atom simulations. We demonstrate the amazing ability of such simulations to reveal the microscopic physical origins of experimentally observed phenomena and we review the frustrating limitations associated with imperfections of present atomic force fields and inadequate sampling. The review is focused on the following four physical properties of DNA: effective electric charge, response to an external mechanical force, interaction with other DNA molecules and behavior in an external electric field. PMID:25238560

  1. Real-Time Model and Simulation Architecture for Half- and Full-Bridge Modular Multilevel Converters

    NASA Astrophysics Data System (ADS)

    Ashourloo, Mojtaba

    This work presents an equivalent model and simulation architecture for real-time electromagnetic transient analysis of either half-bridge or full-bridge modular multilevel converters (MMCs) with 400 sub-modules (SMs) per arm. The proposed CPU/FPGA-based architecture is optimized for parallel implementation of the presented MMC model on the FPGA and benefits from a high-throughput floating-point computational engine. The developed real-time simulation architecture is capable of simulating MMCs with 400 SMs per arm with a time step of 825 nanoseconds. To address the difficulties of implementing the sorting process, a modified Odd-Even Bubble sorting is presented in this work. The comparison of the results under various test scenarios reveals that the proposed real-time simulator reproduces the system responses of its corresponding off-line counterpart in the PSCAD/EMTDC program.
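    The baseline behind "modified Odd-Even Bubble sorting" is standard odd-even transposition sort (the paper's specific modification is not described in the abstract). Its appeal for FPGA implementation is that all compare-exchanges within a phase touch disjoint pairs and can execute in parallel. A reference software version:

```python
def odd_even_sort(a):
    """Plain odd-even transposition sort. Within each phase, every
    compare-exchange touches a disjoint pair of elements, so a hardware
    implementation can run all of them concurrently."""
    a = list(a)
    n = len(a)
    swapped = True
    while swapped:
        swapped = False
        for start in (1, 0):               # odd phase, then even phase
            for i in range(start, n - 1, 2):
                if a[i] > a[i + 1]:
                    a[i], a[i + 1] = a[i + 1], a[i]
                    swapped = True
    return a

# E.g. ordering per-SM capacitor voltages (illustrative values)
print(odd_even_sort([402.1, 399.8, 401.0, 400.5]))  # → [399.8, 400.5, 401.0, 402.1]
```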

  2. Close encounters with DNA.

    PubMed

    Maffeo, C; Yoo, J; Comer, J; Wells, D B; Luan, B; Aksimentiev, A

    2014-10-15

    Over the past ten years, the all-atom molecular dynamics method has grown in the scale of both systems and processes amenable to it and in its ability to make quantitative predictions about the behavior of experimental systems. The field of computational DNA research is no exception, witnessing a dramatic increase in the size of systems simulated with atomic resolution, the duration of individual simulations and the realism of the simulation outcomes. In this topical review, we describe the hallmark physical properties of DNA from the perspective of all-atom simulations. We demonstrate the amazing ability of such simulations to reveal the microscopic physical origins of experimentally observed phenomena. We also discuss the frustrating limitations associated with imperfections of present atomic force fields and inadequate sampling. The review is focused on the following four physical properties of DNA: effective electric charge, response to an external mechanical force, interaction with other DNA molecules and behavior in an external electric field.

  3. Computational Models of Protein Kinematics and Dynamics: Beyond Simulation

    PubMed Central

    Gipson, Bryant; Hsu, David; Kavraki, Lydia E.; Latombe, Jean-Claude

    2016-01-01

    Physics-based simulation represents a powerful method for investigating the time-varying behavior of dynamic protein systems at high spatial and temporal resolution. Such simulations, however, can be prohibitively difficult or lengthy for large proteins or when probing the lower-resolution, long-timescale behaviors of proteins generally. Importantly, not all questions about a protein system require full space and time resolution to produce an informative answer. For instance, by avoiding the simulation of uncorrelated, high-frequency atomic movements, a larger, domain-level picture of protein dynamics can be revealed. The purpose of this review is to highlight the growing body of complementary work that goes beyond simulation. In particular, this review focuses on methods that address kinematics and dynamics, as well as those that address larger organizational questions and can quickly yield useful information about the long-timescale behavior of a protein. PMID:22524225

  4. Quantum simulator review

    NASA Astrophysics Data System (ADS)

    Bednar, Earl; Drager, Steven L.

    2007-04-01

    The objective of quantum information processing is to harness the paradigm shift offered by quantum computing to solve classically hard, computationally challenging problems. Computationally challenging problems of interest include rapid image processing, rapid optimization of logistics, information protection, secure distributed simulation, and massively parallel computation. Currently, one important problem is that quantum computers are difficult to realize due to poor scalability and high error rates. We have therefore supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators that run on classical computers for developing and testing new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator and concludes with summary comments on both simulators.
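
    Neither Quantum eXpress nor QuIDD Pro is shown here; as orientation, the core task such simulators perform can be sketched as a toy state-vector simulator that applies a single-qubit gate to an n-qubit amplitude vector. Everything below is an illustrative sketch, not code from either simulator:

```python
import math

def apply_gate(state, gate, target):
    """Apply a 2x2 single-qubit gate to qubit `target` of a state vector."""
    new_state = [0j] * len(state)
    for i, _ in enumerate(state):
        bit = (i >> target) & 1
        for b in (0, 1):
            # basis index with the target bit forced to b
            j = (i & ~(1 << target)) | (b << target)
            new_state[i] += gate[bit][b] * state[j]
    return new_state

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>)/sqrt(2)
S = 1.0 / math.sqrt(2.0)
HADAMARD = [[S, S], [S, -S]]
plus_state = apply_gate([1.0 + 0j, 0j], HADAMARD, 0)
```

    The exponential growth of the state vector with qubit count is the reason research simulators such as QuIDD Pro use compressed data structures rather than the dense vector used here.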

  5. Large eddy simulation for atmospheric boundary layer flow over flat and complex terrains

    NASA Astrophysics Data System (ADS)

    Han, Yi; Stoellinger, Michael; Naughton, Jonathan

    2016-09-01

    In this work, we present Large Eddy Simulation (LES) results of atmospheric boundary layer (ABL) flow over complex terrain with neutral stratification using the OpenFOAM-based simulator for on/offshore wind farm applications (SOWFA). The complete work flow to investigate the LES for the ABL over real complex terrain is described including meteorological-tower data analysis, mesh generation and case set-up. New boundary conditions for the lateral and top boundaries are developed and validated to allow inflow and outflow as required in complex terrain simulations. The turbulent inflow data for the terrain simulation is generated using a precursor simulation of a flat and neutral ABL. Conditionally averaged met-tower data is used to specify the conditions for the flat precursor simulation and is also used for comparison with the simulation results of the terrain LES. A qualitative analysis of the simulation results reveals boundary layer separation and recirculation downstream of a prominent ridge that runs across the simulation domain. Comparisons of mean wind speed, standard deviation and direction between the computed results and the conditionally averaged tower data show a reasonable agreement.
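
    The precursor-simulation setup is not reproduced here; for orientation, the mean profile that neutral-ABL precursor runs are typically initialized with and compared against is the surface-layer log law, u(z) = (u*/kappa) ln(z/z0). A sketch with illustrative parameter values (not the paper's site data):

```python
import math

def log_law_wind_speed(z, u_star, z0, kappa=0.40):
    """Mean wind speed at height z (m) in a neutral surface layer.

    u(z) = (u*/kappa) * ln(z/z0)
    u_star : friction velocity (m/s); z0 : aerodynamic roughness length (m).
    Returns 0 at or below the roughness height.
    """
    if z <= z0:
        return 0.0
    return (u_star / kappa) * math.log(z / z0)
```

    For example, with u_star = 0.4 m/s and z0 = 0.1 m the profile can be evaluated at met-tower instrument heights for comparison with conditionally averaged data.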

  6. Revealing the distribution of transmembrane currents along the dendritic tree of a neuron from extracellular recordings

    PubMed Central

    Cserpán, Dorottya; Meszéna, Domokos; Wittner, Lucia; Tóth, Kinga; Ulbert, István; Somogyvári, Zoltán

    2017-01-01

    Revealing the current source distribution along the neuronal membrane is a key step on the way to understanding neural computations; however, the experimental and theoretical tools to achieve sufficient spatiotemporal resolution for the estimation remain to be established. Here, we address this problem using extracellularly recorded potentials with arbitrarily distributed electrodes for a neuron of known morphology. We use simulations of models with varying complexity to validate the proposed method and to give recommendations for experimental applications. The method is applied to in vitro data from rat hippocampus. PMID:29148974
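
    The estimation method itself is not reproduced in the abstract; the forward model underneath such methods, for a homogeneous medium, is the point-source formula V = I/(4*pi*sigma*r), and with as many electrodes as unknown sources the currents follow from a linear solve. A toy two-source sketch (geometry, units, and conductivity are illustrative assumptions, not the paper's setup):

```python
import math

def potential(electrodes, sources, currents, sigma=0.3):
    """Extracellular potential at each electrode from point current sources.

    V_e = sum_s I_s / (4*pi*sigma*r_es) in a homogeneous medium (sigma in S/m).
    """
    out = []
    for e in electrodes:
        v = 0.0
        for s, i_s in zip(sources, currents):
            v += i_s / (4.0 * math.pi * sigma * math.dist(e, s))
        out.append(v)
    return out

def solve_two_sources(electrodes, sources, measured, sigma=0.3):
    """Recover two source currents from two electrode potentials (2x2 solve)."""
    # lead-field matrix A[e][s] = 1 / (4*pi*sigma*r_es)
    A = [[1.0 / (4.0 * math.pi * sigma * math.dist(e, s)) for s in sources]
         for e in electrodes]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    i0 = (measured[0] * A[1][1] - measured[1] * A[0][1]) / det
    i1 = (A[0][0] * measured[1] - A[1][0] * measured[0]) / det
    return [i0, i1]
```

    With more electrodes than sources, the same lead-field matrix would instead be inverted in a least-squares sense, which is where regularization and the validation simulations described in the abstract become essential.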

  7. Skin hydration analysis by experiment and computer simulations and its implications for diapered skin.

    PubMed

    Saadatmand, M; Stone, K J; Vega, V N; Felter, S; Ventura, S; Kasting, G; Jaworska, J

    2017-11-01

    Experimental work on skin hydration is technologically challenging and mostly limited to observations where environmental conditions are constant. In some cases, such as diapered baby skin, such work is practically unfeasible, yet it is important to understand the potential effects of diapering on skin condition. To overcome this challenge, in part, we developed a computer simulation model of reversible transient skin hydration effects. The skin hydration model of Li et al. (Chem Eng Sci, 138, 2015, 164) was further developed to simulate transient exposure conditions in which relative humidity (RH), wind velocity, and air and skin temperatures can be any function of time. Computer simulations of evaporative water loss (EWL) decay after different occlusion times were compared with experimental data to calibrate the model. Next, we used the model to investigate EWL and stratum corneum (SC) thickness in different diapering scenarios. Key results from the experimental work were: (1) for occlusions by RH=100% and by free water lasting longer than 30 minutes, the absorbed amount of water is almost the same; (2) longer occlusion times result in higher water absorption by the SC. The EWL decay and skin water content predictions were in agreement with experimental data. Simulations also revealed that skin under occlusion hydrates mainly because the outflux is blocked, not because it absorbs water from the environment. Further, simulations demonstrated that the hydration level is sensitive to time, RH, and/or free water on the skin. In simulated diapering scenarios, skin maintained a hydration content very close to the baseline (no-diaper) conditions for the entire duration of a 24-hour period. Different diapers/diaper technologies are known to have different profiles in terms of their ability to provide wetness protection, which can result in consumer-noticeable differences in wetness. Simulation results based on published literature using data from a number of different diapers suggest that diapered skin hydrates within ranges considered reversible. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Big Data Processing for a Central Texas Groundwater Case Study

    NASA Astrophysics Data System (ADS)

    Cantu, A.; Rivera, O.; Martínez, A.; Lewis, D. H.; Gentle, J. N., Jr.; Fuentes, G.; Pierce, S. A.

    2016-12-01

    As computational methods improve, scientists are able to expand the level and scale of experimental simulation and testing completed for case studies. This study presents a comparative analysis of multiple models for the Barton Springs segment of the Edwards aquifer. Several numerical simulations using state-mandated MODFLOW models, run on Stampede, a high-performance computing system housed at the Texas Advanced Computing Center, were performed for multiple-scenario testing. One goal of this multidisciplinary project is to visualize and compare the output data of the groundwater model using the statistical programming language R to find revealing data patterns produced by different pumping scenarios. Presenting the data in a friendly post-processing format is also covered in this paper. Visualizing the data and creating workflows applicable to managing the data are tasks performed after data extraction. The resulting analyses provide an example of how supercomputing can be used to accelerate evaluation of scientific uncertainty and geological knowledge in relation to policy and management decisions. Understanding the aquifer's behavior helps policy makers avoid negative impacts on endangered species and environmental services, and aids in maximizing the aquifer yield.

  9. Statistical benchmark for BosonSampling

    NASA Astrophysics Data System (ADS)

    Walschaers, Mattia; Kuipers, Jack; Urbina, Juan-Diego; Mayer, Klaus; Tichy, Malte Christopher; Richter, Klaus; Buchleitner, Andreas

    2016-03-01

    Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church-Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows us to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects.
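
    The paper's statistical quantifiers are not reproduced here, but the source of the classical intractability is concrete: BosonSampling output probabilities are governed by matrix permanents, and the best known general exact classical algorithms, such as Ryser's formula, still scale exponentially in the matrix size. A sketch of Ryser's formula for illustration:

```python
from itertools import combinations

def permanent(a):
    """Matrix permanent via Ryser's formula, O(2^n * n^2).

    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * prod_i sum_{j in S} a[i][j].
    The exponential cost is exactly why large BosonSampling instances
    defy brute-force classical simulation.
    """
    n = len(a)
    total = 0
    for k in range(1, n + 1):
        sign = (-1) ** k
        for cols in combinations(range(n), k):
            prod = 1
            for row in a:
                prod *= sum(row[j] for j in cols)
            total += sign * prod
    return ((-1) ** n) * total
```

    The benchmark proposed in the paper sidesteps this cost precisely by relying on classically computable statistical features rather than full output probabilities.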

  10. Improved atomistic simulation of diffusion and sorption in metal oxides

    NASA Astrophysics Data System (ADS)

    Skouras, E. D.; Burganos, V. N.; Payatakes, A. C.

    2001-01-01

    Gas diffusion and sorption on the surface of metal oxides are investigated using atomistic simulations that make use of two different force fields for the description of the intramolecular and intermolecular interactions. Molecular dynamics (MD) and Monte Carlo (MC) computations are presented, and estimates of the mean residence time, Henry's constant, and the heat of adsorption are provided for various common gases (CO, CO2, O2, CH4, Xe) and semiconducting substrates that hold promise for gas sensor applications (SnO2, BaTiO3). Comparison is made between the performance of a simple, first-generation force field (Universal) and a more detailed, second-generation field (COMPASS) under the same conditions and the same assumptions regarding the generation of the working configurations. It is found that the two force fields yield qualitatively similar results in all cases examined here. However, direct comparison with experimental data reveals that the accuracy of the COMPASS-based computations is not only higher than that of the first-generation force field but exceeds even that of published specialized methods based on ab initio computations.

  11. Extending atomistic simulation timescale in solid/liquid systems: crystal growth from solution by a parallel-replica dynamics and continuum hybrid method.

    PubMed

    Lu, Chun-Yaung; Voter, Arthur F; Perez, Danny

    2014-01-28

    Deposition of solid material from solution is ubiquitous in nature. However, due to the inherent complexity of such systems, this process is comparatively much less understood than deposition from a gas or vacuum. Further, the accurate atomistic modeling of such systems is computationally expensive, therefore leaving many intriguing long-timescale phenomena out of reach. We present an atomistic/continuum hybrid method for extending the simulation timescales of dynamics at solid/liquid interfaces. We demonstrate the method by simulating the deposition of Ag on Ag (001) from solution with a significant speedup over standard MD. The results reveal specific features of diffusive deposition dynamics, such as a dramatic increase in the roughness of the film.

  12. Gradient gravitational search: An efficient metaheuristic algorithm for global optimization.

    PubMed

    Dash, Tirtharaj; Sahu, Prabhat K

    2015-05-30

    The adaptation of novel techniques developed in the field of computational chemistry to solve the concerned problems for large and flexible molecules is taking center stage with regard to efficient algorithms, computational cost, and accuracy. In this article, the gradient-based gravitational search (GGS) algorithm, which uses analytical gradients for a fast minimization to the next local minimum, is reported. Its efficiency as a metaheuristic approach has also been compared with that of Gradient Tabu Search and other algorithms such as Gravitational Search, Cuckoo Search, and Backtracking Search for global optimization. Moreover, the GGS approach has also been applied to computational chemistry problems, namely finding the minimum potential energy of two-dimensional and three-dimensional off-lattice protein models. The simulation results reveal the relative stability and physical accuracy of the protein models with efficient computational cost. © 2015 Wiley Periodicals, Inc.
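
    The GGS algorithm itself is not detailed in the abstract; its defining ingredient, following the analytical gradient downhill to the nearest local minimum, can be sketched as plain gradient descent. The double-well test function below is a hypothetical illustration, not one of the paper's protein models:

```python
def gradient_descent(grad, x0, lr=0.01, tol=1e-10, max_iter=100000):
    """Follow the analytical gradient downhill to the nearest local minimum."""
    x = x0
    for _ in range(max_iter):
        x_new = x - lr * grad(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# double-well potential V(x) = (x^2 - 1)^2 with local minima at x = -1 and x = +1
grad_v = lambda x: 4.0 * x * (x * x - 1.0)
x_min = gradient_descent(grad_v, x0=0.5)  # slides into the nearer basin
```

    A metaheuristic such as GGS wraps a global exploration strategy around exactly this kind of local descent, so that the search is not trapped in whichever basin the starting point happens to lie in.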

  13. (Un)Folding Mechanisms of the FBP28 WW Domain in Explicit Solvent Revealed by Multiple Rare Event Simulation Methods

    PubMed Central

    Juraszek, Jarek; Bolhuis, Peter G.

    2010-01-01

    We report a numerical study of the (un)folding routes of the truncated FBP28 WW domain at ambient conditions using a combination of four advanced rare event molecular simulation techniques. We explore the free energy landscape of the native state, the unfolded state, and possible intermediates, with replica exchange molecular dynamics. Subsequent application of bias-exchange metadynamics yields three tentative unfolding pathways at room temperature. Using these paths to initiate a transition path sampling simulation reveals the existence of two major folding routes, differing in the formation order of the two main hairpins, and in hydrophobic side-chain interactions. Having established that the hairpin strand separation distances can act as reasonable reaction coordinates, we employ metadynamics to compute the unfolding barriers and find that the barrier with the lowest free energy corresponds with the most likely pathway found by transition path sampling. The unfolding barrier at 300 K is ∼17 kBT ≈ 42 kJ/mol, in agreement with the experimental unfolding rate constant. This work shows that combining several powerful simulation techniques provides a more complete understanding of the kinetic mechanism of protein folding. PMID:20159161
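
    The quoted barrier of ∼17 kBT ≈ 42 kJ/mol at 300 K can be checked directly from the Boltzmann and Avogadro constants (exact values in the 2019 SI):

```python
K_B = 1.380649e-23   # Boltzmann constant, J/K (exact, SI 2019)
N_A = 6.02214076e23  # Avogadro constant, 1/mol (exact, SI 2019)

def kbt_to_kj_per_mol(n_kbt, temperature_k):
    """Convert an energy of n_kbt * k_B * T into kJ/mol."""
    return n_kbt * K_B * temperature_k * N_A / 1000.0

barrier = kbt_to_kj_per_mol(17.0, 300.0)  # per-molecule barrier scaled to a mole
```

    The result, about 42.4 kJ/mol, matches the figure stated in the abstract.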

  14. MEFA (multiepitope fusion antigen)-Novel Technology for Structural Vaccinology, Proof from Computational and Empirical Immunogenicity Characterization of an Enterotoxigenic Escherichia coli (ETEC) Adhesin MEFA

    PubMed Central

    Duan, Qiangde; Lee, Kuo Hao; Nandre, Rahul M; Garcia, Carolina; Chen, Jianhan; Zhang, Weiping

    2017-01-01

    Vaccine development often encounters the challenge of virulence heterogeneity. Enterotoxigenic Escherichia coli (ETEC) bacteria producing immunologically heterogeneous virulence factors are a leading cause of children’s diarrhea and travelers’ diarrhea. Currently, we do not have licensed vaccines against ETEC bacteria. While conventional methods continue to make progress, they face challenges; new computational and structure-based approaches are being explored to accelerate ETEC vaccine development. In this study, we applied a structural vaccinology concept to construct a structure-based multiepitope fusion antigen (MEFA) carrying representative epitopes of the seven most important ETEC adhesins [CFA/I, CFA/II (CS1–CS3), CFA/IV (CS4–CS6)], simulated the antigenic structure of the CFA/I/II/IV MEFA with computational atomistic modeling and simulation, characterized immunogenicity in mouse immunization, and examined the potential of structure-informed vaccine design for ETEC vaccine development. A tag-less recombinant MEFA protein (CFA/I/II/IV MEFA) was effectively expressed and extracted. Molecular dynamics simulations indicated that this MEFA immunogen maintained a stable secondary structure and presented epitopes on the protein surface. Empirical data showed that mice immunized with the tag-less CFA/I/II/IV MEFA developed strong antigen-specific antibody responses, and mouse serum antibodies significantly inhibited in vitro adherence of bacteria expressing these seven adhesins. These results revealed congruence of antigen immunogenicity between computational simulation and empirical mouse immunization and indicated that this tag-less CFA/I/II/IV MEFA is potentially an antigen for a broadly protective ETEC vaccine, suggesting a potential application of MEFA-based structural vaccinology for vaccine design against ETEC and likely other pathogens. PMID:28944092

  15. Retinal Image Simulation of Subjective Refraction Techniques.

    PubMed

    Perches, Sara; Collados, M Victoria; Ares, Jorge

    2016-01-01

    Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., patient-response-guided refraction) is the most commonly used approach. In this context, this paper's main goal is to present simulation software that implements, in a virtual manner, various subjective refraction techniques, including Jackson's Cross-Cylinder test (JCC), all relying on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed on multifocal-contact-lens wearers. The results reveal the software's usefulness in simulating the retinal image quality that a particular visual compensation provides. Moreover, it can help to deepen insight into and improve existing refraction techniques, and it can be used for simulated training.

  16. Comparison of DAC and MONACO DSMC Codes with Flat Plate Simulation

    NASA Technical Reports Server (NTRS)

    Padilla, Jose F.

    2010-01-01

    Various implementations of the direct simulation Monte Carlo (DSMC) method exist in academia, government and industry. By comparing implementations, deficiencies and merits of each can be discovered. This document reports comparisons between DSMC Analysis Code (DAC) and MONACO. DAC is NASA's standard DSMC production code and MONACO is a research DSMC code developed in academia. These codes have various differences; in particular, they employ distinct computational grid definitions. In this study, DAC and MONACO are compared by having each simulate a blunted flat plate wind tunnel test, using an identical volume mesh. Simulation expense and DSMC metrics are compared. In addition, flow results are compared with available laboratory data. Overall, this study revealed that both codes, excluding grid adaptation, performed similarly. For parallel processing, DAC was generally more efficient. As expected, code accuracy was mainly dependent on physical models employed.

  17. A breakthrough for experiencing and understanding simulated physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1988-01-01

    The use of computer simulation in physics research is discussed, focusing on improvements to graphic workstations. Simulation capabilities and applications of enhanced visualization tools are outlined. The elements of an ideal computer simulation are presented and the potential for improving various simulation elements is examined. The interface between the human and the computer and simulation models are considered. Recommendations are made for changes in computer simulation practices and applications of simulation technology in education.

  18. Computational Evaluation of Mg–Salen Compounds as Subsurface Fluid Tracers: Molecular Dynamics Simulations in Toluene–Water Mixtures and Clay Mineral Nanopores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greathouse, Jeffery A.; Boyle, Timothy J.; Kemp, Richard A.

    Molecular tracers that can be selectively placed underground and uniquely identified at the surface using simple on-site spectroscopic methods would significantly enhance subsurface fluid monitoring capabilities. To ensure their widespread utility, the solubility of these tracers must be easily tuned to oil- or water-wet conditions, and their propensity to adsorb onto subsurface rock and/or mineral phases must be reduced or eliminated. In this work, molecular dynamics simulations were used to investigate the relative solubilities and mineral surface adsorption properties of three candidate tracer compounds comprising Mg–salen derivatives of varying degrees of hydrophilic character. Simulations in water–toluene liquid mixtures indicate that the partitioning of each Mg–salen compound relative to the interface is strongly influenced by the degree of hydrophobicity of the compound. Simulations of these complexes in fluid-filled mineral nanopores containing neutral (kaolinite) and negatively charged (montmorillonite) mineral surfaces reveal that adsorption tendencies depend upon a variety of parameters, including tracer chemical properties, mineral surface type, and solvent type (water or toluene). Simulation snapshots and averaged density profiles reveal insight into the solvation and adsorption mechanisms that control the partitioning of these complexes in mixed liquid phases and nanopore environments. As a result, this work demonstrates the utility of molecular simulation in the design and screening of molecular tracers for use in subsurface applications.

  19. Computational Evaluation of Mg–Salen Compounds as Subsurface Fluid Tracers: Molecular Dynamics Simulations in Toluene–Water Mixtures and Clay Mineral Nanopores

    DOE PAGES

    Greathouse, Jeffery A.; Boyle, Timothy J.; Kemp, Richard A.

    2018-04-11

    Molecular tracers that can be selectively placed underground and uniquely identified at the surface using simple on-site spectroscopic methods would significantly enhance subsurface fluid monitoring capabilities. To ensure their widespread utility, the solubility of these tracers must be easily tuned to oil- or water-wet conditions, and their propensity to adsorb onto subsurface rock and/or mineral phases must be reduced or eliminated. In this work, molecular dynamics simulations were used to investigate the relative solubilities and mineral surface adsorption properties of three candidate tracer compounds comprising Mg–salen derivatives of varying degrees of hydrophilic character. Simulations in water–toluene liquid mixtures indicate that the partitioning of each Mg–salen compound relative to the interface is strongly influenced by the degree of hydrophobicity of the compound. Simulations of these complexes in fluid-filled mineral nanopores containing neutral (kaolinite) and negatively charged (montmorillonite) mineral surfaces reveal that adsorption tendencies depend upon a variety of parameters, including tracer chemical properties, mineral surface type, and solvent type (water or toluene). Simulation snapshots and averaged density profiles reveal insight into the solvation and adsorption mechanisms that control the partitioning of these complexes in mixed liquid phases and nanopore environments. As a result, this work demonstrates the utility of molecular simulation in the design and screening of molecular tracers for use in subsurface applications.

  20. A study of deoxyribonucleotide metabolism and its relation to DNA synthesis. Supercomputer simulation and model-system analysis.

    PubMed

    Heinmets, F; Leary, R H

    1991-06-01

    A model system (1) was established to analyze purine and pyrimidine metabolism. This system has been expanded to include macrosimulation of DNA synthesis and the study of its regulation by terminal deoxynucleoside triphosphates (dNTPs) via a complex set of interactions. Computer experiments reveal that our model exhibits adequate and reasonable sensitivity in terms of dNTP pool levels and rates of DNA synthesis when inputs to the system are varied. These simulation experiments reveal that, in order to achieve maximum DNA synthesis (in terms of purine metabolism), a proper balance is required between guanine and adenine input into this metabolic system. Excessive inputs become inhibitory to DNA synthesis. In addition, studies are carried out on rates of DNA synthesis when various parameters are changed quantitatively. The current system is formulated as a set of 110 differential equations.
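
    The 110-equation system is not given in the abstract; the style of such pool-level simulations can be illustrated with a hypothetical single-pool toy model (the rate law and constants below are invented for illustration, not taken from the paper) integrated with forward Euler:

```python
def simulate_pool(influx, k_syn, k_deg, dt=0.001, t_end=10.0):
    """Forward-Euler integration of a toy metabolite pool:

        d[pool]/dt = influx - (k_syn + k_deg) * [pool]

    where k_syn drains the pool into synthesis (e.g. DNA) and k_deg into
    degradation. The level approaches the steady state influx/(k_syn+k_deg).
    Returns the final pool level.
    """
    pool = 0.0
    t = 0.0
    while t < t_end:
        pool += dt * (influx - (k_syn + k_deg) * pool)
        t += dt
    return pool
```

    A full metabolic model chains many such balance equations together, which is how varying a single input (say, guanine) can propagate through the coupled pools and ultimately inhibit the simulated DNA synthesis rate.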

  1. Symplectic molecular dynamics simulations on specially designed parallel computers.

    PubMed

    Borstnik, Urban; Janezic, Dusanka

    2005-01-01

    We have developed a computer program for molecular dynamics (MD) simulation that implements the Split Integration Symplectic Method (SISM) and is designed to run on specialized parallel computers. The MD integration is performed by the SISM, which analytically treats high-frequency vibrational motion and thus enables the use of longer simulation time steps. The low-frequency motion is treated numerically on specially designed parallel computers, which decreases the computational time of each simulation time step. Together, these approaches make each step cheaper and reduce the number of steps needed, enabling fast MD simulations. We study the computational performance of MD simulation of molecular systems on specialized computers and provide a comparison to standard personal computers. The combination of the SISM with two specialized parallel computers is an effective way to increase the speed of MD simulations up to 16-fold over a single PC processor.
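
    The SISM's analytic treatment of high-frequency motion is not reproduced here; the hallmark property of symplectic integrators in general, long-time energy conservation, can be seen with the standard velocity Verlet scheme on a harmonic oscillator (a generic textbook sketch, not the SISM itself):

```python
def velocity_verlet(x, v, force, mass, dt, n_steps):
    """Symplectic velocity Verlet integration; returns the final (x, v)."""
    f = force(x)
    for _ in range(n_steps):
        v_half = v + 0.5 * dt * f / mass   # half-kick
        x = x + dt * v_half                # drift
        f = force(x)                       # recompute force at new position
        v = v_half + 0.5 * dt * f / mass   # half-kick
    return x, v

# harmonic oscillator: F = -k*x, energy E = 0.5*m*v^2 + 0.5*k*x^2
k, m = 1.0, 1.0
spring = lambda x: -k * x
x, v = velocity_verlet(1.0, 0.0, spring, m, dt=0.05, n_steps=10000)
energy = 0.5 * m * v * v + 0.5 * k * x * x
```

    Over roughly 80 oscillation periods the energy stays within a small bounded band of its initial value of 0.5, rather than drifting, which is the property that makes symplectic schemes like the SISM suitable for long MD runs.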

  2. Energy Efficiency Challenges of 5G Small Cell Networks.

    PubMed

    Ge, Xiaohu; Yang, Jing; Gharavi, Hamid; Sun, Yang

    2017-05-01

    The deployment of a large number of small cells poses new challenges to energy efficiency, which has often been ignored in fifth generation (5G) cellular networks. While massive multiple-input multiple-output (MIMO) will reduce the transmission power at the expense of higher computational cost, the question remains whether computation or transmission power is more important for the energy efficiency of 5G small cell networks. Thus, the main objective of this paper is to investigate the computation power based on the Landauer principle. Simulation results reveal that more than 50% of the energy is consumed by the computation power at 5G small cell base stations (BSs). Moreover, the computation power of a 5G small cell BS can approach 800 watts when massive MIMO (e.g., 128 antennas) is deployed to transmit high-volume traffic. This clearly indicates that computation power optimization can play a major role in the energy efficiency of small cell networks.
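
    The paper's computation-power model is not reproduced here, but the Landauer principle it builds on sets a simple thermodynamic floor: erasing one bit at temperature T costs at least k_B*T*ln 2. A sketch (the bit-operation rate in the example is an illustrative assumption, not a figure from the paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_joules(temperature_k):
    """Minimum energy to erase one bit at temperature T (Landauer principle)."""
    return K_B * temperature_k * math.log(2.0)

def min_power_watts(bit_erasures_per_second, temperature_k=300.0):
    """Thermodynamic lower bound on power for a given rate of bit erasures."""
    return bit_erasures_per_second * landauer_limit_joules(temperature_k)
```

    At 300 K the limit is about 2.9e-21 J per bit, so even 1e20 bit erasures per second cost only ~0.3 W at the thermodynamic floor; real BS hardware sits many orders of magnitude above this bound, which is where the optimization opportunity the paper points to comes from.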

  3. Energy Efficiency Challenges of 5G Small Cell Networks

    PubMed Central

    Ge, Xiaohu; Yang, Jing; Gharavi, Hamid; Sun, Yang

    2017-01-01

    The deployment of a large number of small cells poses new challenges to energy efficiency, which has often been ignored in fifth generation (5G) cellular networks. While massive multiple-input multiple-output (MIMO) will reduce the transmission power at the expense of higher computational cost, the question remains whether computation or transmission power is more important for the energy efficiency of 5G small cell networks. Thus, the main objective of this paper is to investigate the computation power based on the Landauer principle. Simulation results reveal that more than 50% of the energy is consumed by the computation power at 5G small cell base stations (BSs). Moreover, the computation power of a 5G small cell BS can approach 800 watts when massive MIMO (e.g., 128 antennas) is deployed to transmit high-volume traffic. This clearly indicates that computation power optimization can play a major role in the energy efficiency of small cell networks. PMID:28757670

  4. Large eddy simulation in a rotary blood pump: Viscous shear stress computation and comparison with unsteady Reynolds-averaged Navier-Stokes simulation.

    PubMed

    Torner, Benjamin; Konnigk, Lucas; Hallier, Sebastian; Kumar, Jitendra; Witte, Matthias; Wurm, Frank-Hendrik

    2018-06-01

    Numerical flow analysis (computational fluid dynamics) in combination with the prediction of blood damage is an important procedure to investigate the hemocompatibility of a blood pump, since blood trauma due to shear stresses remains a problem in these devices. Today, the numerical damage prediction is conducted using unsteady Reynolds-averaged Navier-Stokes simulations. Investigations with large eddy simulations are rarely performed for blood pumps. Hence, the aim of the study is to examine the viscous shear stresses of a large eddy simulation in a blood pump and compare the results with an unsteady Reynolds-averaged Navier-Stokes simulation. The simulations were carried out at two operation points of a blood pump. The flow was simulated on a 100M element mesh for the large eddy simulation and a 20M element mesh for the unsteady Reynolds-averaged Navier-Stokes simulation. As a first step, the large eddy simulation was verified by analyzing internal dissipative losses within the pump. Then, the pump characteristics and mean and turbulent viscous shear stresses were compared between the two simulation methods. The verification showed that the large eddy simulation is able to reproduce the significant portion of dissipative losses, which is a global indication that the equivalent viscous shear stresses are adequately resolved. The comparison with the unsteady Reynolds-averaged Navier-Stokes simulation revealed that the hydraulic parameters were in agreement, but differences for the shear stresses were found. The results show the potential of the large eddy simulation as a high-quality comparative case to check the suitability of a chosen Reynolds-averaged Navier-Stokes setup and turbulence model. Furthermore, the results suggest that large eddy simulations are superior to unsteady Reynolds-averaged Navier-Stokes simulations when instantaneous stresses are applied for the blood damage prediction.

  5. State estimation of stochastic non-linear hybrid dynamic system using an interacting multiple model algorithm.

    PubMed

    Elenchezhiyan, M; Prakash, J

    2015-09-01

    In this work, state estimation schemes for non-linear hybrid dynamic systems subjected to stochastic state disturbances and random measurement errors are formulated using interacting multiple-model (IMM) algorithms. In order to compute both the discrete modes and the continuous state estimates of a hybrid dynamic system, either an IMM extended Kalman filter (IMM-EKF) or an IMM-based derivative-free Kalman filter is proposed in this study. The efficacy of the proposed IMM-based state estimation schemes is demonstrated by conducting Monte Carlo simulation studies on a two-tank hybrid system and a switched non-isothermal continuous stirred tank reactor system. Extensive simulation studies reveal that the proposed IMM-based state estimation schemes are able to generate fairly accurate continuous state estimates and discrete modes. The simulation studies further reveal that, both in the presence and in the absence of sensor bias, the proposed IMM unscented Kalman filter (IMM-UKF) based simultaneous state and parameter estimation scheme outperforms the multiple-model UKF (MM-UKF) based scheme. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
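The mixing, mode-matched filtering, and combination steps of an IMM cycle can be sketched for a scalar system (a minimal illustration with assumed noise levels and mode dynamics, not the paper's two-tank or reactor models):

```python
import numpy as np

def kalman_step(x, P, z, a, q, r):
    """One predict/update step of a scalar Kalman filter for x' = a*x + w."""
    xp, Pp = a * x, a * P * a + q                # predict
    S = Pp + r                                   # innovation variance (H = 1)
    K = Pp / S                                   # Kalman gain
    lik = np.exp(-0.5 * (z - xp) ** 2 / S) / np.sqrt(2 * np.pi * S)
    return xp + K * (z - xp), (1.0 - K) * Pp, lik

def imm_step(xs, Ps, mu, z, trans, modes, q=0.01, r=0.1):
    """One IMM cycle: mix, filter per mode, update mode probabilities."""
    M = len(modes)
    c = trans.T @ mu                             # predicted mode probabilities
    w = (trans * mu[:, None]) / c[None, :]       # mixing weights w[i, j]
    x0 = w.T @ xs                                # mixed initial states
    P0 = np.array([np.sum(w[:, j] * (Ps + (xs - x0[j]) ** 2)) for j in range(M)])
    xs_n, Ps_n, liks = np.zeros(M), np.zeros(M), np.zeros(M)
    for j, a in enumerate(modes):                # mode-matched filtering
        xs_n[j], Ps_n[j], liks[j] = kalman_step(x0[j], P0[j], z, a, q, r)
    mu_n = liks * c
    mu_n = mu_n / mu_n.sum()                     # posterior mode probabilities
    return xs_n, Ps_n, mu_n, mu_n @ xs_n         # last item: combined estimate
```

Each filter in the bank assumes one discrete mode; the posterior mode probabilities `mu_n` identify which mode best explains the data while the weighted sum gives the continuous estimate.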

  6. Natural gas content of geopressured aquifers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Randolph, Philip L.

    1977-01-01

    It is hypothesized that free, but immobile, natural gas is trapped in pores in geopressured aquifers and that this gas becomes mobile as aquifer pressure is reduced by water production. Computer simulation reveals that this hypothesis is a plausible explanation for the high gas/water ratio observed from the No. 1 sand in the Edna Delcambre No. 1 well. In this Delcambre well test, the gas/water ratio increased from the solution gas value of less than 20 SCF/bbl to more than 50 SCF/bbl during production of 32,000 barrels of water in 10 days. Bottom hole pressure was reduced from 10,846 to 9,905 psia. The computer simulation reveals that such increased gas production requires that the relative permeability to gas (k_rg) increase from less than 10^-4 to about 10^-3 due to a decrease in the fractional water saturation of pores (S_w) of only about 0.001. Further, assuming drainage relative permeabilities as calculated by the method of A.T. Corey, the initial gas saturation of pores must be greater than 0.065. Means for achieving these initial conditions during geological time will be qualitatively discussed, and the effect of trapped gas upon long-term production will be described.
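The Corey-style drainage relative permeability curves referred to above can be sketched in a few lines (the residual water saturation and exponents are illustrative assumptions, not values from the Delcambre study):

```python
def corey_drainage(Sw, Swr=0.3):
    """Corey-type drainage relative permeabilities vs. water saturation Sw.

    Swr: assumed residual water saturation (illustrative only).
    Returns (krw, krg) for the wetting (water) and non-wetting (gas) phases.
    """
    Se = (Sw - Swr) / (1.0 - Swr)                # effective saturation
    Se = min(max(Se, 0.0), 1.0)
    krw = Se ** 4                                # wetting phase (water)
    krg = (1.0 - Se) ** 2 * (1.0 - Se ** 2)      # non-wetting phase (gas)
    return krw, krg
```

Near full water saturation krg is tiny and rises steeply as S_w decreases, which is the kind of sensitivity the simulation above relies on.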

  7. A computational study of the topology of vortex breakdown

    NASA Technical Reports Server (NTRS)

    Spall, Robert E.; Gatski, Thomas B.

    1991-01-01

    A fully three-dimensional numerical simulation of vortex breakdown using the unsteady, incompressible Navier-Stokes equations has been performed. Solutions to four distinct types of breakdown are identified and compared with experimental results. The computed solutions include weak helical, double helix, spiral, and bubble-type breakdowns. The topological structure of the various breakdowns as well as their interrelationship are studied. The data reveal that the asymmetric modes of breakdown may be subject to additional breakdowns as the vortex core evolves in the streamwise direction. The solutions also show that the freestream axial velocity distribution has a significant effect on the position and type of vortex breakdown.

  8. Aspects of GPU performance in algorithms with random memory access

    NASA Astrophysics Data System (ADS)

    Kashkovsky, Alexander V.; Shershnev, Anton A.; Vashchenkov, Pavel V.

    2017-10-01

    The numerical code for solving the Boltzmann equation on a hybrid computational cluster using the Direct Simulation Monte Carlo (DSMC) method showed that on Tesla K40 accelerators computational performance drops dramatically as the percentage of occupied GPU memory increases. Testing revealed that memory access time increases tens of times after a certain critical percentage of memory is occupied. Moreover, this appears to be a problem common to all NVidia GPUs, arising from their architecture. A few modifications of the numerical algorithm were suggested to overcome this problem. One of them, based on splitting the memory into "virtual" blocks, resulted in a 2.5-times speedup.
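The "virtual block" remedy described above can be sketched as follows: a flat index space is split into fixed-size blocks so that storage grows, and is addressed, block by block rather than as one huge allocation. The block size and class name are illustrative assumptions, not the paper's implementation:

```python
BLOCK_SIZE = 4096  # assumed block size, for illustration

class BlockedStore:
    """Flat index space backed by fixed-size 'virtual' blocks."""
    def __init__(self):
        self.blocks = []                       # blocks allocated on demand

    def _locate(self, i):
        return divmod(i, BLOCK_SIZE)           # (block index, offset in block)

    def write(self, i, value):
        b, off = self._locate(i)
        while len(self.blocks) <= b:           # grow by whole blocks only
            self.blocks.append([0.0] * BLOCK_SIZE)
        self.blocks[b][off] = value

    def read(self, i):
        b, off = self._locate(i)
        return self.blocks[b][off]
```

On a GPU the same index arithmetic would be done in device code; the point is that random accesses are confined to bounded blocks instead of one near-full allocation.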

  9. Evaluation of synthetic linear motor-molecule actuation energetics

    PubMed Central

    Brough, Branden; Northrop, Brian H.; Schmidt, Jacob J.; Tseng, Hsian-Rong; Houk, Kendall N.; Stoddart, J. Fraser; Ho, Chih-Ming

    2006-01-01

    By applying atomic force microscope (AFM)-based force spectroscopy together with computational modeling in the form of molecular force-field simulations, we have determined quantitatively the actuation energetics of a synthetic motor-molecule. This multidisciplinary approach was performed on specifically designed, bistable, redox-controllable [2]rotaxanes to probe the steric and electrostatic interactions that dictate their mechanical switching at the single-molecule level. The fusion of experimental force spectroscopy and theoretical computational modeling has revealed that the repulsive electrostatic interaction, which is responsible for the molecular actuation, is as high as 65 kcal·mol−1, a result that is supported by ab initio calculations. PMID:16735470

  10. The effectiveness of using multimedia computer simulations coupled with social constructivist pedagogy in a college introductory physics classroom

    NASA Astrophysics Data System (ADS)

    Chou, Chiu-Hsiang

    Electricity and magnetism (E & M) is legendarily considered incomprehensible to students at the introductory college level. From a social constructivist perspective, learners are encouraged to assess the quantity and quality of their prior knowledge in a subject domain and to co-construct shared knowledge and understanding by implementing and building on each other's ideas. They are challenged by new data and perspectives that stimulate a reconceptualization of knowledge, and they become actively engaged in discovering new meanings based on experiences grounded in the real-world phenomena they are expected to learn. This process constitutes a conceptual change learning environment and can facilitate the learning of E & M. Computer simulations are an excellent tool to assist the teacher and learner in achieving these goals and were used in this study. This study examined the effectiveness of computer simulations within a conceptual change learning environment and compared it to more lecture-centered, traditional ways of teaching E & M. An experimental and a control group were compared, and the following differences were observed. Statistical analyses were performed with ANOVA (F-tests). The results indicated that the treatment group significantly outperformed the control group on the achievement test, F(1,54) = 12.34, p < .05, and the treatment group had a higher rate of improvement than the control group on two subscales: Isolation of Variables and Abstract Transformation. The results from the Maryland Physics Expectations Survey (MPEX) showed that the treatment students became more field independent and were aware of the more fundamental role played by physics concepts in complex problem solving. The protocol analysis of structured interviews revealed that students in the treatment group tended to visualize a problem from different aspects and articulated their thinking in a more scientific manner. Responses to the instructional evaluation questionnaire indicated overwhelmingly positive ratings of the appropriateness and instructional effectiveness of computer simulation instruction. In conclusion, the computer simulation instruction (CSI) developed and evaluated in this study provided opportunities for students to refine their preconceptions and practice using new understandings, and it suggests substantial promise for computer simulation in the classroom environment.

  11. Mining data from CFD simulation for aneurysm and carotid bifurcation models.

    PubMed

    Miloš, Radović; Dejan, Petrović; Nenad, Filipović

    2011-01-01

    Arterial geometry variability is present both within and across individuals. To analyze the influence of geometric parameters, blood density, dynamic viscosity, and blood velocity on the wall shear stress (WSS) distribution in the human carotid artery bifurcation and in an aneurysm, computer simulations were run to generate data pertaining to this phenomenon. In our work we evaluate two models for predicting these relationships: a neural network model and a k-nearest neighbor model. The results revealed that both models have high predictive ability for this task. The achieved results represent progress in the real-time assessment of stroke risk from given patient data.
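Of the two models evaluated, the k-nearest neighbor predictor is simple enough to sketch in a few lines (the feature tuples and targets are illustrative placeholders, not the hemodynamic dataset used in the paper):

```python
import math

def knn_predict(X_train, y_train, x, k=3):
    """Predict by averaging the targets of the k nearest training points.

    X_train: list of feature tuples (e.g. geometry and flow parameters).
    y_train: list of target values (e.g. a WSS summary statistic).
    x: query feature tuple; Euclidean distance is used throughout.
    """
    neighbors = sorted(zip(X_train, y_train), key=lambda p: math.dist(p[0], x))
    return sum(yi for _, yi in neighbors[:k]) / k
```

In practice features would be normalized first so that no single parameter dominates the distance.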

  12. Neuronal excitability level transition induced by electrical stimulation

    NASA Astrophysics Data System (ADS)

    Florence, G.; Kurths, J.; Machado, B. S.; Fonoff, E. T.; Cerdeira, H. A.; Teixeira, M. J.; Sameshima, K.

    2014-12-01

    In experimental studies, electrical stimulation (ES) has been applied to induce neuronal activity or to disrupt pathological patterns. Nevertheless, the underlying mechanisms of these activity-pattern transitions are not clear. To study these phenomena, we simulated a model of the hippocampal region CA1. The computational simulations using different amplitude levels and durations of ES revealed three states of neuronal excitability: burst-firing mode, depolarization block and spreading depression wave. We used bifurcation theory to analyse the interference of ES with cellular excitability and neuronal dynamics. Understanding this process would help to improve ES techniques to control some neurological disorders.

  13. Quantum analogue computing.

    PubMed

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
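The scaling argument above can be made concrete with two toy functions (the names are ours, for illustration): a classical machine simulating n qubits must track 2^n amplitudes, and in an analogue encoding one extra bit of precision doubles the number of distinguishable levels, i.e. the size of the machine:

```python
def classical_amplitudes(n_qubits):
    """Complex amplitudes needed to store an n-qubit state classically."""
    return 2 ** n_qubits

def analogue_levels(precision_bits):
    """Distinguishable levels an analogue device needs for b bits of precision."""
    return 2 ** precision_bits
```

Both quantities grow exponentially, but on opposite sides of the comparison: the first is the classical cost of direct Hilbert-space simulation, the second the analogue cost of precision.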

  14. Computational Modeling of Inflammation and Wound Healing

    PubMed Central

    Ziraldo, Cordelia; Mi, Qi; An, Gary; Vodovotz, Yoram

    2013-01-01

    Objective Inflammation is both central to proper wound healing and a key driver of chronic tissue injury via a positive-feedback loop incited by incidental cell damage. We seek to derive actionable insights into the role of inflammation in wound healing in order to improve outcomes for individual patients. Approach To date, dynamic computational models have been used to study the time evolution of inflammation in wound healing. Emerging clinical data on histo-pathological and macroscopic images of evolving wounds, as well as noninvasive measures of blood flow, suggested the need for tissue-realistic, agent-based, and hybrid mechanistic computational simulations of inflammation and wound healing. Innovation We developed a computational modeling system, Simple Platform for Agent-based Representation of Knowledge, to facilitate the construction of tissue-realistic models. Results A hybrid equation–agent-based model (ABM) of pressure ulcer formation in both spinal cord-injured and -uninjured patients was used to identify control points that reduce stress caused by tissue ischemia/reperfusion. An ABM of arterial restenosis revealed new dynamics of cell migration during neointimal hyperplasia that match histological features, but contradict the currently prevailing mechanistic hypothesis. ABMs of vocal fold inflammation were used to predict inflammatory trajectories in individuals, possibly allowing for personalized treatment. Conclusions The intertwined inflammatory and wound healing responses can be modeled computationally to make predictions in individuals, simulate therapies, and gain mechanistic insights. PMID:24527362

  15. Site Identification by Ligand Competitive Saturation (SILCS) Simulations for Fragment-Based Drug Design

    PubMed Central

    Faller, Christina E.; Raman, E. Prabhu; MacKerell, Alexander D.; Guvench, Olgun

    2015-01-01

    Fragment-based drug design (FBDD) involves screening low molecular weight molecules (“fragments”) that correspond to functional groups found in larger drug-like molecules to determine their binding to target proteins or nucleic acids. Based on the principle of thermodynamic additivity, two fragments that bind non-overlapping nearby sites on the target can be combined to yield a new molecule whose binding free energy is the sum of those of the fragments. Experimental FBDD approaches, like NMR and X-ray crystallography, have proven very useful but can be expensive in terms of time, materials, and labor. Accordingly, a variety of computational FBDD approaches have been developed that provide different levels of detail and accuracy. The Site Identification by Ligand Competitive Saturation (SILCS) method of computational FBDD uses all-atom explicit-solvent molecular dynamics (MD) simulations to identify fragment binding. The target is “soaked” in an aqueous solution with multiple fragments having different identities. The resulting computational competition assay reveals what small molecule types are most likely to bind which regions of the target. From SILCS simulations, 3D probability maps of fragment binding called “FragMaps” can be produced. Based on the probabilities relative to bulk, SILCS FragMaps can be used to determine “Grid Free Energies (GFEs),” which provide per-atom contributions to fragment binding affinities. For essentially no additional computational overhead relative to the production of the FragMaps, GFEs can be used to compute Ligand Grid Free Energies (LGFEs) for arbitrarily complex molecules, and these LGFEs can be used to rank-order the molecules in accordance with binding affinities. PMID:25709034
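The conversion from FragMap probabilities to grid free energies described above is a Boltzmann inversion, GFE = -kT ln(p/p_bulk), with the LGFE a sum of per-atom contributions. A minimal sketch (kT and the occupancy values are illustrative assumptions, not SILCS parameters):

```python
import math

KT = 0.593  # kcal/mol at ~298 K (illustrative)

def grid_free_energy(p_voxel, p_bulk):
    """GFE = -kT ln(p / p_bulk); negative where fragments bind preferentially."""
    return -KT * math.log(p_voxel / p_bulk)

def ligand_gfe(atom_voxel_probs, p_bulk):
    """LGFE: sum of per-atom GFE contributions for a posed molecule."""
    return sum(grid_free_energy(p, p_bulk) for p in atom_voxel_probs)
```

Ranking candidate molecules by LGFE then approximates rank-ordering by binding affinity, as the abstract describes.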

  16. Student Ability, Confidence, and Attitudes Toward Incorporating a Computer into a Patient Interview.

    PubMed

    Ray, Sarah; Valdovinos, Katie

    2015-05-25

    To improve pharmacy students' ability to effectively incorporate a computer into a simulated patient encounter and to improve their awareness of barriers and attitudes towards and their confidence in using a computer during simulated patient encounters. Students completed a survey that assessed their awareness of, confidence in, and attitudes towards computer use during simulated patient encounters. Students were evaluated with a rubric on their ability to incorporate a computer into a simulated patient encounter. Students were resurveyed and reevaluated after instruction. Students improved in their ability to effectively incorporate computer usage into a simulated patient encounter. They also became more aware of and improved their attitudes toward barriers regarding such usage and gained more confidence in their ability to use a computer during simulated patient encounters. Instruction can improve pharmacy students' ability to incorporate a computer into simulated patient encounters. This skill is critical to developing efficiency while maintaining rapport with patients.

  17. A computational microscopy study of nanostructural evolution in irradiated pressure vessel steels

    NASA Astrophysics Data System (ADS)

    Odette, G. R.; Wirth, B. D.

    1997-11-01

    Nanostructural features that form in reactor pressure vessel steels under neutron irradiation at around 300°C lead to significant hardening and embrittlement. Continuum thermodynamic-kinetic based rate theories have been very successful in modeling the general characteristics of the copper and manganese nickel rich precipitate evolution, often the dominant source of embrittlement. However, a more detailed atomic scale understanding of these features is needed to interpret experimental measurements and better underpin predictive embrittlement models. Further, other embrittling features, believed to be subnanometer defect (vacancy)-solute complexes and small regions of modest enrichment of solutes are not well understood. A general approach to modeling embrittlement nanostructures, based on the concept of a computational microscope, is described. The objective of the computational microscope is to self-consistently integrate atomic scale simulations with other sources of information, including a wide range of experiments. In this work, lattice Monte Carlo (LMC) simulations are used to resolve the chemically and structurally complex nature of CuMnNiSi precipitates. The LMC simulations unify various nanoscale analytical characterization methods and basic thermodynamics. The LMC simulations also reveal that significant coupled vacancy and solute clustering takes place during cascade aging. The cascade clustering produces the metastable vacancy-cluster solute complexes that mediate flux effects. Cascade solute clustering may also play a role in the formation of dilute atmospheres of solute enrichment and enhance the nucleation of manganese-nickel rich precipitates at low Cu levels. Further, the simulations suggest that complex, highly correlated processes (e.g. cluster diffusion, formation of favored vacancy diffusion paths and solute scavenging vacancy cluster complexes) may lead to anomalous fast thermal aging kinetics at temperatures below about 450°C. 
The potential technical significance of these phenomena is described.
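The lattice Monte Carlo (LMC) machinery invoked above can be illustrated with a minimal Metropolis swap move on a periodic 1D lattice, where an assumed attractive solute-solute bond energy drives the kind of solute clustering the abstract describes (all parameters are illustrative, not fitted to pressure vessel steels):

```python
import math
import random

def lmc_sweep(lattice, eps=-0.4, kT=0.05, rng=random):
    """One Metropolis sweep of solute/matrix swaps on a periodic 1D lattice.

    lattice: list of 0 (matrix atom) and 1 (solute); eps < 0 makes
    solute-solute bonds attractive, driving clustering.
    """
    n = len(lattice)

    def bonds_at(i):
        # energy of solute-solute bonds touching site i (periodic neighbors)
        if lattice[i] != 1:
            return 0.0
        return sum(eps for j in ((i - 1) % n, (i + 1) % n) if lattice[j] == 1)

    for _ in range(n):
        i = rng.randrange(n)
        j = (i + 1) % n
        if lattice[i] == lattice[j]:
            continue                              # swap changes nothing
        before = bonds_at(i) + bonds_at(j)
        lattice[i], lattice[j] = lattice[j], lattice[i]
        dE = bonds_at(i) + bonds_at(j) - before
        if dE > 0 and rng.random() >= math.exp(-dE / kT):
            lattice[i], lattice[j] = lattice[j], lattice[i]  # reject move
    return lattice
```

Real LMC studies of CuMnNiSi precipitates use 3D lattices, several species, and vacancy-mediated moves; the accept/reject logic is the same.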

  18. Numerical Predictions of Mode Reflections in an Open Circular Duct: Comparison with Theory

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.; Hixon, Ray

    2015-01-01

    The NASA Broadband Aeroacoustic Stator Simulation code was used to compute the acoustic field for higher-order modes in a circular duct geometry. To test the accuracy of the results computed by the code, the duct was terminated by an open end with either an infinite flange or no flange. Both open-end conditions have a theoretical solution that was used for comparison with the computed results. Excellent agreement in reflection matrix values was achieved after suitable refinement of the grid at the open end. The study also revealed issues with the level of the mode amplitude introduced into the acoustic field from the source boundary and with the amount of reflection that occurred at the source boundary when a general nonreflecting boundary condition was applied.

  19. Monte Carlo simulations of quantum dot solar concentrators: ray tracing based on fluorescence mapping

    NASA Astrophysics Data System (ADS)

    Schuler, A.; Kostro, A.; Huriet, B.; Galande, C.; Scartezzini, J.-L.

    2008-08-01

    One promising application of semiconductor nanostructures in the field of photovoltaics might be quantum dot solar concentrators. Quantum-dot-containing nanocomposite thin films are synthesized at EPFL-LESO by a low-cost sol-gel process. In order to study the potential of the novel planar photoluminescent concentrators, reliable computer simulations are needed. A computer code for ray tracing simulations of quantum dot solar concentrators has been developed at EPFL-LESO on the basis of Monte Carlo methods that are applied to polarization-dependent reflection/transmission at interfaces, photon absorption by the semiconductor nanocrystals, and photoluminescent reemission. The software allows importing measured or theoretical absorption/reemission spectra describing the photoluminescent properties of the quantum dots. The properties of photoluminescent reemission are thereby described by a set of emission spectra that depend on the energy of the incoming photon, allowing the photoluminescent emission to be simulated using the inverse function method. Our simulations reveal the importance of two main factors: an emission spectrum matched to the spectral efficiency curve of the photovoltaic cell, and a large Stokes shift, which is advantageous for the lateral energy transport. No significant energy losses are incurred when the quantum dots are contained within a nanocomposite coating instead of being dispersed in the entire volume of the pane. Together with knowledge of the optoelectronic properties of suitable photovoltaic cells, the simulations allow prediction of the total efficiency of the envisaged concentrating PV systems and optimization of photoluminescent emission frequencies, optical densities, and pane dimensions.
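The inverse function method mentioned above draws a reemission wavelength by inverting the cumulative distribution of a tabulated emission spectrum. A minimal sketch (the wavelengths and intensities are illustrative, not measured spectra):

```python
import bisect
import random

def make_sampler(wavelengths, intensities):
    """Inverse-CDF sampler: draw wavelengths with probability ~ intensity."""
    total = float(sum(intensities))
    cdf, acc = [], 0.0
    for w in intensities:
        acc += w / total
        cdf.append(acc)                          # cumulative distribution
    def sample(u=None, rng=random):
        u = rng.random() if u is None else u     # uniform variate in [0, 1)
        return wavelengths[bisect.bisect_left(cdf, u)]
    return sample
```

In a ray tracer, each absorption event would call `sample()` with a spectrum table selected by the energy of the absorbed photon, matching the energy-dependent emission description above.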

  20. Cosolvent-Based Molecular Dynamics for Ensemble Docking: Practical Method for Generating Druggable Protein Conformations.

    PubMed

    Uehara, Shota; Tanaka, Shigenori

    2017-04-24

    Protein flexibility is a major hurdle in current structure-based virtual screening (VS). In spite of the recent advances in high-performance computing, protein-ligand docking methods still demand tremendous computational cost to take into account the full degree of protein flexibility. In this context, ensemble docking has proven its utility and efficiency for VS studies, but it still needs a rational and efficient method to select and/or generate multiple protein conformations. Molecular dynamics (MD) simulations are useful to produce distinct protein conformations without abundant experimental structures. In this study, we present a novel strategy that makes use of cosolvent-based molecular dynamics (CMD) simulations for ensemble docking. By mixing small organic molecules into a solvent, CMD can stimulate dynamic protein motions and induce partial conformational changes of binding pocket residues appropriate for the binding of diverse ligands. The present method has been applied to six diverse target proteins and assessed by VS experiments using many actives and decoys of DEKOIS 2.0. The simulation results have revealed that the CMD is beneficial for ensemble docking. Utilizing cosolvent simulation allows the generation of druggable protein conformations, improving the VS performance compared with the use of a single experimental structure or ensemble docking by standard MD with pure water as the solvent.

  1. Molecular Dynamics based on a Generalized Born solvation model: application to protein folding

    NASA Astrophysics Data System (ADS)

    Onufriev, Alexey

    2004-03-01

    An accurate description of the aqueous environment is essential for realistic biomolecular simulations, but it may become very expensive computationally. We have developed a version of the Generalized Born model suitable for describing large conformational changes in macromolecules. The model represents the solvent implicitly as a continuum with the dielectric properties of water, and includes the charge-screening effects of salt. The computational cost associated with the use of this model in Molecular Dynamics simulations is generally considerably smaller than the cost of representing water explicitly. Also, compared to traditional Molecular Dynamics simulations based on explicit water representation, conformational changes occur much faster in an implicit solvation environment due to the absence of viscosity. The combined speed-up allows one to probe conformational changes that occur on much longer effective time-scales. We apply the model to the folding of a 46-residue three-helix-bundle protein (residues 10-55 of protein A, PDB ID 1BDD). Starting from an unfolded structure at 450 K, the protein folds to the lowest-energy state in 6 ns of simulation time, which takes about a day on a 16-processor SGI machine. The predicted structure differs from the native one by 2.4 A (backbone RMSD). Analysis of the structures seen on the folding pathway reveals details of the folding process unavailable from experiment.
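The backbone RMSD figure quoted above (2.4 A from native) is the root-mean-square deviation over corresponding atoms. A minimal sketch for coordinate sets that are already superimposed (a full comparison would first perform a least-squares alignment, e.g. Kabsch):

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two equal-length lists of
    (x, y, z) coordinates, assumed already optimally superimposed."""
    n = len(coords_a)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / n)
```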

  2. Molecular dynamics simulations on the inhibition of cyclin-dependent kinases 2 and 5 in the presence of activators.

    PubMed

    Zhang, Bing; Tan, Vincent B C; Lim, Kian Meng; Tay, Tong Earn

    2006-06-01

    Interests in CDK2 and CDK5 have stemmed mainly from their association with cancer and neuronal migration or differentiation related diseases and the need to design selective inhibitors for these kinases. Molecular dynamics (MD) simulations have not only become a viable approach to drug design because of advances in computer technology but are increasingly an integral part of drug discovery processes. It is common in MD simulations of inhibitor/CDK complexes to exclude the activator of the CDKs in the structural models to keep computational time tractable. In this paper, we present simulation results of CDK2 and CDK5 with roscovitine using models with and without their activators (cyclinA and p25). While p25 was found to induce slight changes in CDK5, the calculations support that cyclinA leads to significant conformational changes near the active site of CDK2. This suggests that detailed and structure-based inhibitor design targeted at these CDKs should employ activator-included models of the kinases. Comparisons between P/CDK2/cyclinA/roscovitine and CDK5/p25/roscovitine complexes reveal differences in the conformations of the glutamine around the active sites, which may be exploited to find highly selective inhibitors with respect to CDK2 and CDK5.

  3. Influence of Thermal Cycling on Flexural Properties and Simulated Wear of Computer-aided Design/Computer-aided Manufacturing Resin Composites.

    PubMed

    Tsujimoto, A; Barkmeier, W W; Takamizawa, T; Latta, M A; Miyazaki, M

    The purpose of this study was to evaluate the influence of thermal cycling on the flexural properties and simulated wear of computer-aided design/computer-aided manufacturing (CAD/CAM) resin composites. The six CAD/CAM resin composites used in this study were 1) Lava Ultimate CAD/CAM Restorative (LU); 2) Paradigm MZ100 (PM); 3) CERASMART (CS); 4) Shofu Block HC (SB); 5) KATANA AVENCIA Block (KA); and 6) VITA ENAMIC (VE). Specimens were divided randomly into two groups, one of which was stored in distilled water for 24 hours, and the other of which was subjected to 10,000 thermal cycles. For each material, 15 specimens from each group were used to determine the flexural strength and modulus according to ISO 6872, and 20 specimens from each group were used to examine wear using a localized wear simulation model. The test materials were subjected to a wear challenge of 400,000 cycles in a Leinfelder-Suzuki device (Alabama machine). The materials were placed in custom-cylinder stainless steel fixtures, and simulated localized wear was generated using a stainless steel ball bearing (r=2.387 mm) antagonist in a water slurry of polymethyl methacrylate beads. Simulated wear was determined using a noncontact profilometer (Proscan 2100) with Proscan and AnSur 3D software. The two-way analysis of variance of the flexural properties and simulated wear of the CAD/CAM resin composites revealed that material type and thermal cycling had a significant influence (p<0.05), but there was no significant interaction (p>0.05) between the two factors. The flexural properties and maximum depth of wear facets of the CAD/CAM resin composites differed (p<0.05) depending on the material, and their values were influenced (p<0.05) by thermal cycling, except in the case of VE. The volume losses in wear facets on LU, PM, and SB after 10,000 thermal cycles were significantly higher (p<0.05) than those after 24 hours of water storage, unlike CS, KA, and VE. The results of this study indicate that the flexural properties and simulated wear of CAD/CAM resin composites differ depending on the material, and that both are influenced by thermal cycling.

  4. Development of simulation computer complex specification

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks which included: (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.

  5. Simulations of Bluff Body Flow Interaction for Noise Source Modeling

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Lockard, David P.; Choudhari, Meelan M.; Jenkins, Luther N.; Neuhart, Dan H.; McGinley, Catherine B.

    2006-01-01

    The current study is a continuation of our effort to characterize the details of flow interaction between two cylinders in a tandem configuration. This configuration is viewed to possess many of the pertinent flow features of the highly interactive unsteady flow field associated with the main landing gear of large civil transports. The present effort extends our previous two-dimensional, unsteady, Reynolds Averaged Navier-Stokes computations to three dimensions using a quasilaminar, zonal approach, in conjunction with a two-equation turbulence model. Two distinct separation length-to-diameter ratios of L/D = 3.7 and 1.435, representing intermediate and short separation distances between the two cylinders, are simulated. The Mach 0.166 simulations are performed at a Reynolds number of Re = 1.66 × 10^5 to match the companion experiments at NASA Langley Research Center. Extensive comparisons with the measured steady and unsteady surface pressure and off-surface particle image velocimetry data show encouraging agreement. Both prominent and some of the more subtle trends in the mean and fluctuating flow fields are correctly predicted. Both computations and the measured data reveal a more robust and energetic shedding process at L/D = 3.7 in comparison with the weaker shedding in the shorter separation case of L/D = 1.435. The vortex shedding frequency based on the computed surface pressure spectra is in reasonable agreement with the measured Strouhal frequency.

  6. Elucidating Ligand-Modulated Conformational Landscape of GPCRs Using Cloud-Computing Approaches.

    PubMed

    Shukla, Diwakar; Lawrenz, Morgan; Pande, Vijay S

    2015-01-01

    G-protein-coupled receptors (GPCRs) are a versatile family of membrane-bound signaling proteins. Despite the recent successes in obtaining crystal structures of GPCRs, much needs to be learned about the conformational changes associated with their activation. Furthermore, the mechanism by which ligands modulate the activation of GPCRs has remained elusive. Molecular simulations provide a way of obtaining a detailed atomistic description of GPCR activation dynamics. However, simulating GPCR activation is challenging due to the long timescales involved and the associated challenge of gaining insights from the "Big" simulation datasets. Here, we demonstrate how cloud-computing approaches have been used to tackle these challenges and obtain insights into the activation mechanism of GPCRs. In particular, we review the use of Markov state model (MSM)-based sampling algorithms for sampling milliseconds of dynamics of a major drug target, the G-protein-coupled receptor β2-AR. MSMs of agonist- and inverse-agonist-bound β2-AR reveal multiple activation pathways and show how ligands function via modulation of the ensemble of activation pathways. We target this ensemble of conformations with computer-aided drug design approaches, with the goal of designing drugs that interact more closely with diverse receptor states, for overall increased efficacy and specificity. We conclude by discussing how cloud-based approaches present a powerful and broadly available tool for routinely studying complex biological systems. © 2015 Elsevier Inc. All rights reserved.
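The MSM machinery reviewed above rests on a row-stochastic transition matrix estimated from discretized trajectory counts; its stationary distribution gives the equilibrium state populations. A minimal sketch (the count matrix is an illustrative toy, not β2-AR data):

```python
import numpy as np

def msm_from_counts(counts):
    """Build a Markov state model from a matrix of observed transition counts.

    Returns the row-stochastic transition matrix T and its stationary
    distribution pi (the left eigenvector of T with eigenvalue 1).
    """
    C = np.asarray(counts, dtype=float)
    T = C / C.sum(axis=1, keepdims=True)        # row-normalize to probabilities
    evals, evecs = np.linalg.eig(T.T)           # left eigenvectors of T
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()                          # normalize to a distribution
    return T, pi
```

In practice MSM packages also symmetrize counts for detailed balance and pick the lag time from implied-timescale convergence; this sketch shows only the core estimate.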

  7. Dynamic simulations of many-body electrostatic self-assembly

    NASA Astrophysics Data System (ADS)

    Lindgren, Eric B.; Stamm, Benjamin; Maday, Yvon; Besley, Elena; Stace, A. J.

    2018-03-01

    Two experimental studies relating to electrostatic self-assembly have been the subject of dynamic computer simulations, where the consequences of changing the charge and the dielectric constant of the materials concerned have been explored. One series of calculations relates to experiments on the assembly of polymer particles that have been subjected to tribocharging and the simulations successfully reproduce many of the observed patterns of behaviour. A second study explores events observed following collisions between single particles and small clusters composed of charged particles derived from a metal oxide composite. As before, observations recorded during the course of the experiments are reproduced by the calculations. One study in particular reveals how particle polarizability can influence the assembly process. This article is part of the theme issue `Modern theoretical chemistry'.

  8. Large-Eddy Simulation of Turbulent Wall-Pressure Fluctuations

    NASA Technical Reports Server (NTRS)

    Singer, Bart A.

    1996-01-01

    Large-eddy simulations of a turbulent boundary layer with Reynolds number based on displacement thickness equal to 3500 were performed with two grid resolutions. The computations were continued for sufficient time to obtain frequency spectra with resolved frequencies that correspond to the most important structural frequencies on an aircraft fuselage. The turbulent stresses were adequately resolved with both resolutions. Detailed quantitative analysis of a variety of statistical quantities associated with the wall-pressure fluctuations revealed similar behavior for both simulations. The primary differences were associated with the lack of resolution of the high-frequency data in the coarse-grid calculation and the increased jitter (due to the lack of multiple realizations for averaging purposes) in the fine-grid calculation. A new curve fit was introduced to represent the spanwise coherence of the cross-spectral density.
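    The abstract does not give the new curve fit, but spanwise coherence of wall-pressure cross-spectra is classically represented by a Corcos-style exponential decay, which can be fitted as below. The parameter values are illustrative assumptions, not the paper's:

```python
import numpy as np

def corcos_coherence(eta, omega, alpha, Uc):
    """Corcos-style spanwise coherence: gamma = exp(-alpha * omega * eta / Uc)."""
    return np.exp(-alpha * omega * eta / Uc)

def fit_alpha(eta, omega, gamma, Uc):
    """Least-squares decay constant from ln(gamma) = -alpha * omega * eta / Uc."""
    x = -omega * eta / Uc
    y = np.log(gamma)
    return float(np.sum(x * y) / np.sum(x * x))

# Synthetic coherence data generated with alpha = 0.7 (a typical spanwise value)
Uc = 20.0                                   # convection velocity, m/s
omega = 2 * np.pi * 500.0                   # angular frequency, rad/s
eta = np.linspace(0.001, 0.02, 20)          # spanwise separations, m
gamma = corcos_coherence(eta, omega, 0.7, Uc)
alpha_hat = fit_alpha(eta, omega, gamma, Uc)
```

    Fitting in the linearized log domain keeps the estimate a one-line least-squares ratio; with measured (noisy) coherence data a weighted fit would be preferable.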

  9. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulations methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  10. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  11. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Astrophysics Data System (ADS)

    Chamis, C. C.; Singhal, S. N.

    1993-02-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  12. Atomic bonding effects in annular dark field scanning transmission electron microscopy. I. Computational predictions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odlyzko, Michael L.; Mkhoyan, K. Andre, E-mail: mkhoyan@umn.edu; Himmetoglu, Burak

    2016-07-15

    Annular dark field scanning transmission electron microscopy (ADF-STEM) image simulations were performed for zone-axis-oriented light-element single crystals, using a multislice method adapted to include charge redistribution due to chemical bonding. Examination of these image simulations alongside calculations of the propagation of the focused electron probe reveal that the evolution of the probe intensity with thickness exhibits significant sensitivity to interatomic charge transfer, accounting for observed thickness-dependent bonding sensitivity of contrast in all ADF-STEM imaging conditions. Because changes in image contrast relative to conventional neutral atom simulations scale directly with the net interatomic charge transfer, the strongest effects are seen in crystals with highly polar bonding, while no effects are seen for nonpolar bonding. Although the bonding dependence of ADF-STEM image contrast varies with detector geometry, imaging parameters, and material temperature, these simulations predict the bonding effects to be experimentally measurable.

  13. Retinal Image Simulation of Subjective Refraction Techniques

    PubMed Central

    Perches, Sara; Collados, M. Victoria; Ares, Jorge

    2016-01-01

    Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., patient’s response-guided refraction) is the most commonly used approach. In this context, this paper’s main goal is to present a simulation software that implements, in a virtual manner, various subjective refraction techniques—including Jackson’s Cross-Cylinder test (JCC)—all relying on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed in multifocal-contact-lens wearers. The results reveal this software’s usefulness for simulating the retinal image quality that a particular visual compensation provides. Moreover, it can help to gain deeper insight into and improve existing refraction techniques, and it can be used for simulated training. PMID:26938648

  14. A resolved two-way coupled CFD/6-DOF approach for predicting embolus transport and the embolus-trapping efficiency of IVC filters.

    PubMed

    Aycock, Kenneth I; Campbell, Robert L; Manning, Keefe B; Craven, Brent A

    2017-06-01

    Inferior vena cava (IVC) filters are medical devices designed to provide a mechanical barrier to the passage of emboli from the deep veins of the legs to the heart and lungs. Despite decades of development and clinical use, IVC filters still fail to prevent the passage of all hazardous emboli. The objective of this study is to (1) develop a resolved two-way computational model of embolus transport, (2) provide verification and validation evidence for the model, and (3) demonstrate the ability of the model to predict the embolus-trapping efficiency of an IVC filter. Our model couples computational fluid dynamics simulations of blood flow to six-degree-of-freedom simulations of embolus transport and resolves the interactions between rigid, spherical emboli and the blood flow using an immersed boundary method. Following model development and numerical verification and validation of the computational approach against benchmark data from the literature, embolus transport simulations are performed in an idealized IVC geometry. Centered and tilted filter orientations are considered using a nonlinear finite element-based virtual filter placement procedure. A total of 2048 coupled CFD/6-DOF simulations are performed to predict the embolus-trapping statistics of the filter. The simulations predict that the embolus-trapping efficiency of the IVC filter increases with increasing embolus diameter and increasing embolus-to-blood density ratio. Tilted filter placement is found to decrease the embolus-trapping efficiency compared with centered filter placement. Multiple embolus-trapping locations are predicted for the IVC filter, and the trapping locations are predicted to shift upstream and toward the vessel wall with increasing embolus diameter. Simulations of the injection of successive emboli into the IVC are also performed and reveal that the embolus-trapping efficiency decreases with increasing thrombus load in the IVC filter. 
In future work, the computational tool could be used to investigate IVC filter design improvements, the effect of patient anatomy on embolus transport and IVC filter embolus-trapping efficiency, and, with further development and validation, optimal filter selection and placement on a patient-specific basis.
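    Aggregating the 2048 binary trap/no-trap outcomes into an efficiency estimate is a binomial-statistics exercise. A sketch with a Wilson-score confidence interval, using hypothetical per-diameter counts rather than the paper's data:

```python
import math

def trapping_efficiency(trapped, total, z=1.96):
    """Point estimate and Wilson-score interval for trapping efficiency
    from binary trap/no-trap simulation outcomes."""
    p = trapped / total
    denom = 1 + z**2 / total
    center = (p + z**2 / (2 * total)) / denom
    half = z * math.sqrt(p * (1 - p) / total + z**2 / (4 * total**2)) / denom
    return p, center - half, center + half

# Hypothetical outcomes binned by embolus diameter (illustrative only)
stats = {d_mm: trapping_efficiency(k, n)
         for d_mm, k, n in [(2, 41, 256), (4, 180, 256), (6, 243, 256)]}
```

    The Wilson interval behaves better than the normal approximation near efficiencies of 0 or 1, which matters for the largest emboli where nearly all runs are trapped.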

  15. Exploratory benchtop study evaluating the use of surgical design and simulation in fibula free flap mandibular reconstruction

    PubMed Central

    2013-01-01

    Background Surgical design and simulation (SDS) is a useful tool to help surgeons visualize the anatomy of the patient and perform operative maneuvers on the computer before implementation in the operating room. While these technologies have many advantages, further evidence of their potential to improve outcomes is required. The present benchtop study was intended to identify whether there is a difference in surgical outcome between free-hand surgery completed without virtual surgical planning (VSP) software and preoperatively planned surgery completed with the use of VSP software. Methods Five surgeons participated in the study. In Session A, participants were asked to do a free-hand reconstruction of a 3D-printed mandible with a defect using a 3D-printed fibula. Four weeks later, in Session B, the participants were asked to do the same reconstruction, but in this case using a preoperatively digitally designed surgical plan. Digital registration computer software, hard tissue measures and duration of the task were used to compare the outcome of the benchtop reconstructions. Results The study revealed that: (1) superimposed images produced in computer-aided design (CAD) software were effective in comparing pre- and post-surgical outcomes, (2) there was a difference, based on hard tissue measures, in surgical outcome between the two scenarios, and (3) there was no difference in the time it took to complete the sessions. Conclusion The study revealed that the participants were more consistent in the preoperatively digitally planned surgery than they were in the free-hand surgery. PMID:23800209

  16. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    PubMed

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
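    Of the four approaches named above, discrete-event simulation is the workhorse for ED patient-flow questions. A minimal single-bed sketch (an M/M/1 queue) follows; the arrival and service rates are illustrative, not drawn from any real department:

```python
import random

def simulate_ed(n_patients, arrival_rate, service_rate, seed=0):
    """Minimal discrete-event simulation of a single-bed emergency department
    (an M/M/1 queue); returns the mean patient time-in-system."""
    rng = random.Random(seed)
    t = free_at = total_time = 0.0
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)      # next patient arrives
        start = max(t, free_at)                 # waits if the bed is busy
        free_at = start + rng.expovariate(service_rate)
        total_time += free_at - t               # waiting + treatment
    return total_time / n_patients

# Illustrative rates: 4 arrivals/hour against a 5 patients/hour service capacity;
# M/M/1 theory predicts a mean time-in-system of 1/(5 - 4) = 1 hour
mean_time = simulate_ed(n_patients=50_000, arrival_rate=4.0, service_rate=5.0)
```

    Dedicated platforms (and libraries such as SimPy) generalize this pattern to multiple beds, priorities, and resource schedules, but the event logic is the same.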

  17. The Simultaneous Production Model; A Model for the Construction, Testing, Implementation and Revision of Educational Computer Simulation Environments.

    ERIC Educational Resources Information Center

    Zillesen, Pieter G. van Schaick

    This paper introduces a hardware and software independent model for producing educational computer simulation environments. The model, which is based on the results of 32 studies of educational computer simulations program production, implies that educational computer simulation environments are specified, constructed, tested, implemented, and…

  18. The Learning Effects of Computer Simulations in Science Education

    ERIC Educational Resources Information Center

    Rutten, Nico; van Joolingen, Wouter R.; van der Veen, Jan T.

    2012-01-01

    This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to improve learning processes and outcomes. We report on…

  19. Soft-error tolerance and energy consumption evaluation of embedded computer with magnetic random access memory in practical systems using computer simulations

    NASA Astrophysics Data System (ADS)

    Nebashi, Ryusuke; Sakimura, Noboru; Sugibayashi, Tadahiko

    2017-08-01

    We evaluated the soft-error tolerance and energy consumption of an embedded computer with magnetic random access memory (MRAM) using two computer simulators. One is a central processing unit (CPU) simulator of a typical embedded computer system. We simulated the radiation-induced single-event-upset (SEU) probability in a spin-transfer-torque MRAM cell and also the failure rate of a typical embedded computer due to its main memory SEU error. The other is a delay tolerant network (DTN) system simulator. It simulates the power dissipation of wireless sensor network nodes of the system using a revised CPU simulator and a network simulator. We demonstrated that the SEU effect on the embedded computer with 1 Gbit MRAM-based working memory is less than 1 failure in time (FIT). We also demonstrated that the energy consumption of the DTN sensor node with MRAM-based working memory can be reduced to 1/11. These results indicate that MRAM-based working memory enhances the disaster tolerance of embedded computers.

  20. Simulations Meet Experiment to Reveal New Insights into DNA Intrinsic Mechanics

    PubMed Central

    Ben Imeddourene, Akli; Elbahnsi, Ahmad; Guéroult, Marc; Oguey, Christophe; Foloppe, Nicolas; Hartmann, Brigitte

    2015-01-01

    The accurate prediction of the structure and dynamics of DNA remains a major challenge in computational biology due to the dearth of precise experimental information on DNA free in solution and limitations in the DNA force-fields underpinning the simulations. A new generation of force-fields has been developed to better represent the sequence-dependent B-DNA intrinsic mechanics, in particular with respect to the BI ↔ BII backbone equilibrium, which is essential to understand the B-DNA properties. Here, the performance of MD simulations with the newly updated force-fields Parmbsc0εζOLI and CHARMM36 was tested against a large ensemble of recent NMR data collected on four DNA dodecamers involved in nucleosome positioning. We find impressive progress towards a coherent, realistic representation of B-DNA in solution, despite residual shortcomings. This improved representation allows new and deeper interpretation of the experimental observables, including regarding the behavior of facing phosphate groups in complementary dinucleotides, and their modulation by the sequence. It also provides the opportunity to extensively revisit and refine the coupling between backbone states and inter base pair parameters, which emerges as a common theme across all the complementary dinucleotides. In sum, the global agreement between simulations and experiment reveals new aspects of intrinsic DNA mechanics, a key component of DNA-protein recognition. PMID:26657165

  1. A comparative study on real lab and simulation lab in communication engineering from students' perspectives

    NASA Astrophysics Data System (ADS)

    Balakrishnan, B.; Woods, P. C.

    2013-05-01

    Over the years, rapid development in computer technology has engendered simulation-based laboratory (lab) in addition to the traditional hands-on (physical) lab. Many higher education institutions adopt simulation labs, replacing some existing physical lab experiments. The creation of new systems for conducting engineering lab activities has raised concerns among educators on the merits and shortcomings of both physical and simulation labs; at the same time, many arguments have been raised on the differences between the two. Investigating the effectiveness of both labs is complicated, as there are multiple factors that should be considered. In view of this challenge, a study on students' perspectives on their experience related to key aspects of engineering laboratory exercises was conducted. In this study, the Visual, Auditory, Read/write and Kinesthetic (VARK) model was utilised to measure the students' cognitive styles. The investigation was done through a survey among participants from Multimedia University, Malaysia. The findings revealed that there are significant differences for most of the aspects in physical and simulation labs.

  2. Survey of computer programs for prediction of crash response and of its experimental validation

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.

    1976-01-01

    The author critically assesses the potential of mathematical and hybrid simulators that predict the post-impact response of transportation vehicles. A strictly rigorous numerical analysis of a complex phenomenon like crash may leave a lot to be desired with regard to the fidelity of mathematical simulation. Hybrid simulations, on the other hand, which exploit experimentally observed features of deformations, appear to hold considerable promise. MARC, ANSYS, NONSAP, DYCAST, ACTION, WHAM II and KRASH are among the simulators examined for their capabilities with regard to prediction of post-impact response of vehicles. A review of these simulators reveals that considerably more analysis capability is desirable than is currently available. NASA's crashworthiness testing program, in conjunction with similar programs of various other agencies, besides generating a large database, will be equally useful in the validation of new mathematical concepts of nonlinear analysis and in the successful extension of other techniques in crashworthiness.

  3. A 2D Array of 100's of Ions for Quantum Simulation and Many-Body Physics in a Penning Trap

    NASA Astrophysics Data System (ADS)

    Bohnet, Justin; Sawyer, Brian; Britton, Joseph; Bollinger, John

    2015-05-01

    Quantum simulations promise to reveal new materials and phenomena for experimental study, but few systems have demonstrated the capability to control ensembles in which quantum effects cannot be directly computed. One possible platform for intractable quantum simulations may be a system of hundreds of 9Be+ ions in a Penning trap, where the valence electron spins are coupled with an effective Ising interaction in a 2D geometry. Here we report on results from a new Penning trap designed for 2D quantum simulations. We characterize the ion crystal stability and describe progress towards benchmarking quantum effects of the spin-spin coupling using a spin-squeezing witness. We also report on the successful photodissociation of BeH+ contaminant molecular ions that impede the use of such crystals for quantum simulation. This work lays the foundation for future experiments such as the observation of spin dynamics under the quantum Ising Hamiltonian with a transverse field. Supported by a NIST-NRC Research Associateship.

  4. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    PubMed

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). 
Computer screen-based simulation appears to be effective in preparing learners to use high-fidelity patient simulators, which present simulations that are closer to real-life situations.

  5. Computational Models Reveal a Passive Mechanism for Cell Migration in the Crypt

    PubMed Central

    Dunn, Sara-Jane; Näthke, Inke S.; Osborne, James M.

    2013-01-01

    Cell migration in the intestinal crypt is essential for the regular renewal of the epithelium, and the continued upward movement of cells is a key characteristic of healthy crypt dynamics. However, the driving force behind this migration is unknown. Possibilities include mitotic pressure, active movement driven by motility cues, or negative pressure arising from cell loss at the crypt collar. It is possible that a combination of factors together coordinate migration. Here, three different computational models are used to provide insight into the mechanisms that underpin cell movement in the crypt, by examining the consequence of eliminating cell division on cell movement. Computational simulations agree with existing experimental results, confirming that migration can continue in the absence of mitosis. Importantly, however, simulations allow us to infer mechanisms that are sufficient to generate cell movement, which is not possible through experimental observation alone. The results produced by the three models agree and suggest that cell loss due to apoptosis and extrusion at the crypt collar relieves cell compression below, allowing cells to expand and move upwards. This finding suggests that future experiments should focus on the role of apoptosis and cell extrusion in controlling cell migration in the crypt. PMID:24260407

  6. Simulation of MST tokamak discharges with resonant magnetic perturbations

    NASA Astrophysics Data System (ADS)

    Cornille, B. S.; Sovinec, C. R.; Chapman, B. E.; Dubois, A.; McCollam, K. J.; Munaretto, S.

    2016-10-01

    Nonlinear MHD modeling of MST tokamak plasmas with an applied resonant magnetic perturbation (RMP) reveals degradation of flux surfaces that may account for the experimentally observed suppression of runaway electrons with the RMP. Runaway electrons are routinely generated in MST tokamak discharges with low plasma density. When an m = 3 RMP is applied, these electrons are strongly suppressed, while an m = 1 RMP of comparable amplitude has little effect. The computations are performed using the NIMROD code and use reconstructed equilibrium states of MST tokamak plasmas with q(0) < 1 and q(a) = 2.2. Linear computations show that the (1,1)-kink and (2,2)-tearing modes are unstable, and nonlinear simulations produce sawtoothing with a period of approximately 0.5 ms, which is comparable to the period of MHD activity observed experimentally. Adding an m = 3 RMP in the computation degrades flux surfaces in the outer region of the plasma, while no degradation occurs with an m = 1 RMP. The outer flux surface degradation with the m = 3 RMP, combined with the sawtooth-induced distortion of flux surfaces in the core, may account for the observed suppression of runaway electrons. Work supported by DOE Grant DE-FC02-08ER54975.

  7. Exact and efficient simulation of concordant computation

    NASA Astrophysics Data System (ADS)

    Cable, Hugo; Browne, Daniel E.

    2015-11-01

    Concordant computation is a circuit-based model of quantum computation for mixed states that assumes that all correlations within the register are discord-free (i.e. the correlations are essentially classical) at every step of the computation. The question of whether concordant computation always admits efficient simulation by a classical computer was first considered by Eastin in arXiv:1006.4402, where an answer in the affirmative was given for circuits consisting only of one- and two-qubit gates. Building on this work, we develop the theory of classical simulation of concordant computation. We present a new framework for understanding such computations, argue that a larger class of concordant computations admits efficient simulation, and provide alternative proofs for the main results of arXiv:1006.4402 with an emphasis on the exactness of simulation, which is crucial for this model. We include a detailed analysis of the arithmetic complexity of solving equations in the simulation, as well as extensions to larger gates and qudits. We explore the limitations of our approach, and discuss the challenges faced in developing efficient classical simulation algorithms for all concordant computations.

  8. Quantum annealing versus classical machine learning applied to a simplified computational biology problem

    NASA Astrophysics Data System (ADS)

    Li, Richard Y.; Di Felice, Rosa; Rohs, Remo; Lidar, Daniel A.

    2018-03-01

    Transcription factors regulate gene expression, but how these proteins recognize and specifically bind to their DNA targets is still debated. Machine learning models are effective means to reveal interaction mechanisms. Here we studied the ability of a quantum machine learning approach to classify and rank binding affinities. Using simplified data sets of a small number of DNA sequences derived from actual binding affinity experiments, we trained a commercially available quantum annealer to classify and rank transcription factor binding. The results were compared to state-of-the-art classical approaches for the same simplified data sets, including simulated annealing, simulated quantum annealing, multiple linear regression, LASSO, and extreme gradient boosting. Despite technological limitations, we find a slight advantage in classification performance and nearly equal ranking performance using the quantum annealer for these fairly small training data sets. Thus, we propose that quantum annealing might be an effective method to implement machine learning for certain computational biology problems.
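    Simulated annealing, one of the classical baselines compared here, can be sketched generically on a tiny Ising problem (not the paper's transcription-factor data; the five-spin ferromagnetic chain is purely illustrative):

```python
import math, random

def ising_energy(s, h, J):
    """E(s) = sum_i h_i s_i + sum_(i,j) J_ij s_i s_j for spins s_i = ±1."""
    e = sum(h[i] * s[i] for i in range(len(s)))
    return e + sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

def simulated_annealing(h, J, steps=20_000, t_hot=2.0, t_cold=0.01, seed=0):
    """Single-spin-flip Metropolis annealing with a geometric cooling schedule."""
    rng = random.Random(seed)
    s = [rng.choice((-1, 1)) for _ in range(len(h))]
    e = ising_energy(s, h, J)
    for k in range(steps):
        temp = t_hot * (t_cold / t_hot) ** (k / steps)
        i = rng.randrange(len(s))
        s[i] = -s[i]                            # propose a spin flip
        e_new = ising_energy(s, h, J)
        if e_new <= e or rng.random() < math.exp((e - e_new) / temp):
            e = e_new                           # accept
        else:
            s[i] = -s[i]                        # reject and revert
    return s, e

# Five-spin ferromagnetic chain: ground states are all-up/all-down with E = -4
h = [0.0] * 5
J = {(i, i + 1): -1.0 for i in range(4)}
state, energy = simulated_annealing(h, J)
```

    A quantum annealer minimizes the same Ising objective in hardware; the comparison in the paper amounts to benchmarking different minimizers of this energy on a machine-learning loss.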

  9. Large eddy simulation of fine water sprays: comparative analysis of two models and computer codes

    NASA Astrophysics Data System (ADS)

    Tsoy, A. S.; Snegirev, A. Yu.

    2015-09-01

    Although the FDS model and computer code are widely used in engineering practice to predict fire development, they are not sufficiently validated for fire suppression by fine water sprays. In this work, the effect of numerical resolution of the large scale turbulent pulsations on the accuracy of predicted time-averaged spray parameters is evaluated. Comparison of the simulation results obtained with the two versions of the model and code, as well as of the predicted and measured radial distributions of the liquid flow rate, revealed the need to apply monotonic and yet sufficiently accurate discrete approximations of the convective terms. Failure to do so delays jet break-up, otherwise induced by large turbulent eddies, thereby excessively focusing the predicted flow around its axis. The effect of the pressure drop in the spray nozzle is also examined; its increase is shown to cause only a weak increase of the evaporated fraction and vapor concentration despite the significant increase of flow velocity.

  10. Delayed fission and multifragmentation in sub-keV C60 - Au(0 0 1) collisions via molecular dynamics simulations: Mass distributions and activated statistical decay

    NASA Astrophysics Data System (ADS)

    Bernstein, V.; Kolodney, E.

    2017-10-01

    We have recently observed, both experimentally and computationally, the phenomenon of postcollision multifragmentation in sub-keV surface collisions of a C60 projectile. Namely, delayed multiparticle breakup of a strongly impact-deformed and vibrationally excited large cluster collider into several large fragments, after leaving the surface. Molecular dynamics simulations with extensive statistics revealed a nearly simultaneous event, within a sub-psec time window. Here we study, computationally, additional essential aspects of this new delayed collisional fragmentation that were not addressed before. Specifically, we study here the delayed (binary) fission channel for different impact energies, both by calculating mass distributions over all fission events and by calculating and analyzing lifetime distributions of the scattered projectile. We observe an asymmetric fission resulting in a most probable fission channel, and we find an activated exponential (statistical) decay. Finally, we also calculate and discuss the fragment mass distribution in (triple) multifragmentation over different time windows, in terms of the most abundant fragments.

  11. Analysis of audiometric notch as a noise-induced hearing loss phenotype in US youth: data from the National Health And Nutrition Examination Survey, 2005-2010.

    PubMed

    Bhatt, Ishan S; Guthrie, O'neil

    2017-06-01

Bilateral audiometric notch (BN) at 4000-6000 Hz was identified as a noise-induced hearing loss (NIHL) phenotype for genetic association analysis in college-aged musicians. This study analysed BN in a sample of US youth. The prevalence of BN within the study sample was determined, and logistic-regression analyses were performed to identify audiologic and other demographic factors associated with BN. Computer-simulated "flat" audiograms were used to estimate the potential influence of false-positive rates on the estimated prevalence of BN. A total of 2348 participants (12-19 years) meeting the inclusion criteria were selected from the National Health and Nutrition Examination Survey data (2005-2010). The prevalence of BN was 16.6%, and 55.6% of participants showed a notch in at least one ear. Noise exposure, gender, ethnicity and age showed significant relationships with BN. Computer simulation revealed that 5.5% of simulated participants with "flat" audiograms showed BN. The association of noise exposure with BN suggests that BN is a useful NIHL phenotype for genetic association analyses. However, further research is necessary to reduce false-positive rates in notch identification.
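The flat-audiogram false-positive estimate can be sketched as follows; the notch criterion, test-retest noise level, and sample size below are illustrative assumptions, not the definitions used in the NHANES analysis.

```python
import random

# Illustrative sketch only: the notch criterion and noise level below are
# assumptions, not the definitions used in the NHANES study. Truly flat
# audiograms plus measurement noise can still satisfy a notch criterion,
# which is the false-positive mechanism the abstract quantifies.
random.seed(0)
FREQS = [500, 1000, 2000, 3000, 4000, 6000, 8000]

def flat_audiogram(mean=5.0, sd=5.0):
    """Thresholds (dB HL) of a truly flat audiogram plus test noise."""
    return {f: random.gauss(mean, sd) for f in FREQS}

def has_notch(a, depth=10.0):
    """Assumed criterion: 4000/6000 Hz at least `depth` dB worse than the
    500-2000 Hz average, with partial recovery at 8000 Hz."""
    low = sum(a[f] for f in (500, 1000, 2000)) / 3.0
    worst = max(a[4000], a[6000])
    return worst - low >= depth and worst - a[8000] >= depth / 2.0

n = 10_000
fp = sum(has_notch(flat_audiogram()) for _ in range(n)) / n
print(f"false-positive notch rate: {fp:.1%}")
```

Tightening the criterion (larger `depth`, or requiring the notch bilaterally) lowers this false-positive rate, which is the trade-off the abstract's final sentence points to.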

  12. Multiscale computational models in physical systems biology of intracellular trafficking.

    PubMed

    Tourdot, Richard W; Bradley, Ryan P; Ramakrishnan, Natesan; Radhakrishnan, Ravi

    2014-10-01

In intracellular trafficking, a definitive understanding of the interplay between protein binding and membrane morphology remains incomplete. The authors describe a computational approach that integrates coarse-grained molecular dynamics (CGMD) simulations with continuum Monte Carlo (CM) simulations of the membrane to study protein-membrane interactions and the ensuing membrane curvature. They relate the curvature field strength discerned at the molecular level to its effect at the cellular length-scale. They perform thermodynamic integration on the CM model to describe the free energy landscape of vesiculation in clathrin-mediated endocytosis. The method presented here delineates membrane morphologies and maps out the free energy changes associated with membrane remodeling due to varying coat sizes, coat curvature strengths, membrane bending rigidities, and tensions; furthermore, several constraints on mechanisms underlying clathrin-mediated endocytosis have been identified. Their CGMD simulations have revealed the importance of PIP2 for stable binding of proteins essential for curvature induction in the bilayer and have provided a molecular basis for the positive curvature induction by the epsin N-terminal homology (ENTH) domain. Calculation of the free energy landscape for vesicle budding has identified the critical size and curvature strength of a clathrin coat required for nucleation and stabilisation of a mature vesicle.

  13. The effect of oxaloacetic acid on tyrosinase activity and structure: Integration of inhibition kinetics with docking simulation.

    PubMed

    Gou, Lin; Lee, Jinhyuk; Hao, Hao; Park, Yong-Doo; Zhan, Yi; Lü, Zhi-Rong

    2017-08-01

Oxaloacetic acid (OA) is naturally found in organisms and is well known as an intermediate of the ATP-producing citric acid cycle. We evaluated the effects of OA on tyrosinase activity and structure by integrating enzyme kinetics with computational simulations. OA was found to be a reversible inhibitor of tyrosinase acting through parabolic non-competitive inhibition (IC50 = 17.5 ± 0.5 mM; Ki = 6.03 ± 1.36 mM). Kinetic measurements by real-time interval assay showed that OA induced a multi-phasic inactivation process comprising fast (k1) and slow (k2) phases. Spectrofluorimetry studies showed that OA mainly induced regional changes in the active site of tyrosinase, accompanied by hydrophobic disruption at high dose. Computational docking simulations further revealed that OA could interact with several residues near the tyrosinase active-site pocket, such as HIS61, HIS259, HIS263, and VAL283. Our study provides insight into the mechanism by which energy-producing intermediates such as OA inhibit tyrosinase and suggests that OA is a potential natural anti-pigmentation agent. Copyright © 2017 Elsevier B.V. All rights reserved.
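Biphasic (fast/slow) inactivation kinetics of the kind described can be illustrated with a classic curve-peeling analysis of a double-exponential time course; the rate constants and amplitudes below are assumed for illustration, not the paper's fitted values.

```python
import math

# Sketch with assumed parameters (k1, k2, a1, a2 are illustrative, not the
# paper's fitted values): a biphasic inactivation time course
#   A(t) = a1*exp(-k1*t) + a2*exp(-k2*t),  k1 >> k2,
# analyzed by classic curve peeling: fit the slow phase on the tail in
# log space, subtract it, then fit the fast phase on the early residual.
k1, k2 = 2.0, 0.1          # fast and slow rate constants (1/min), assumed
a1, a2 = 0.6, 0.4          # fractional amplitudes of the two phases
ts = [0.1 * i for i in range(301)]   # 0..30 min
A = [a1 * math.exp(-k1 * t) + a2 * math.exp(-k2 * t) for t in ts]

def logline_slope_intercept(pts):
    """Least-squares fit of y = c + m*t; returns (m, c)."""
    n = len(pts)
    st = sum(t for t, _ in pts); sy = sum(y for _, y in pts)
    stt = sum(t * t for t, _ in pts); sty = sum(t * y for t, y in pts)
    m = (n * sty - st * sy) / (n * stt - st * st)
    return m, (sy - m * st) / n

# Slow phase: the fast term has fully decayed for t > 10 min.
m, c = logline_slope_intercept([(t, math.log(a))
                                for t, a in zip(ts, A) if t > 10.0])
k2_est, b2 = -m, math.exp(c)

# Fast phase: peel off the slow component, fit the early residual.
m, _ = logline_slope_intercept(
    [(t, math.log(a - b2 * math.exp(-k2_est * t)))
     for t, a in zip(ts, A) if t < 1.0])
k1_est = -m
print(k1_est, k2_est)  # recovered fast and slow rate constants
```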

  14. Computational hydrodynamics and optical performance of inductively-coupled plasma adaptive lenses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mortazavi, M.; Urzay, J., E-mail: jurzay@stanford.edu; Mani, A.

    2015-06-15

This study addresses the optical performance of a plasma adaptive lens for aero-optical applications by using both axisymmetric and three-dimensional numerical simulations. Plasma adaptive lenses are based on the effects of free electrons on the phase velocity of incident light, which, in theory, can be used as a phase-conjugation mechanism. A closed cylindrical chamber filled with Argon plasma is used as a model lens into which a beam of light is launched. The plasma is sustained by applying a radio-frequency electric current through a coil that envelops the chamber. Four different operating conditions, ranging from low to high powers and induction frequencies, are employed in the simulations. The numerical simulations reveal complex hydrodynamic phenomena related to buoyant and electromagnetic laminar transport, which generate, respectively, large recirculating cells and wall-normal compression stresses in the form of local stagnation-point flows. In the axisymmetric simulations, the plasma motion is coupled with near-wall axial striations in the electron-density field, some of which propagate in the form of low-frequency traveling disturbances adjacent to vortical quadrupoles that are reminiscent of Taylor-Görtler flow structures in centrifugally unstable flows. Although the refractive-index fields obtained from axisymmetric simulations lead to smooth beam wavefronts, they are found to be unstable to azimuthal disturbances in three of the four three-dimensional cases considered. The azimuthal striations are optically detrimental, since they produce high-order angular aberrations that account for most of the beam wavefront error. A fourth case is computed at high input power and high induction frequency, which displays the best optical properties among all the three-dimensional simulations considered.
In particular, the increase in induction frequency prevents local thermalization and leads to an axisymmetric distribution of electrons even after the introduction of spatial disturbances. The results highlight the importance of accounting for spatial effects in the numerical computations when optical analyses of plasma lenses are pursued in this range of operating conditions.

  15. Matching the reaction-diffusion simulation to dynamic [18F]FMISO PET measurements in tumors: extension to a flow-limited oxygen-dependent model.

    PubMed

    Shi, Kuangyu; Bayer, Christine; Gaertner, Florian C; Astner, Sabrina T; Wilkens, Jan J; Nüsslin, Fridtjof; Vaupel, Peter; Ziegler, Sibylle I

    2017-02-01

Positron-emission tomography (PET) with hypoxia-specific tracers provides a noninvasive method to assess the tumor oxygenation status. Reaction-diffusion models have advantages in revealing the quantitative relation between in vivo imaging and the tumor microenvironment. However, there has been no quantitative comparison of simulation results with real PET measurements. The lack of experimental support hampers further applications of computational simulation models. This study aims to compare simulation results with a preclinical [18F]FMISO PET study and to optimize the reaction-diffusion model accordingly. Nude mice with xenografted human squamous cell carcinomas (CAL33) were investigated with a 2 h dynamic [18F]FMISO PET scan followed by immunofluorescence staining using the hypoxia marker pimonidazole and the endothelium marker CD31. A large data pool of tumor time-activity curves (TAC) was simulated for each mouse by feeding the arterial input function (AIF) extracted from experiments into the model with different configurations of the tumor microenvironment. A measured TAC was considered to match a simulated TAC when the difference metric was below a certain, noise-dependent threshold. As an extension of the well-established Kelly model, a flow-limited oxygen-dependent (FLOD) model was developed to improve the matching between measurements and simulations. The matching rate between the simulated TACs of the Kelly model and the mouse PET data ranged from 0 to 28.1% (on average 9.8%). By modifying the Kelly model to an FLOD model, the matching rate between the simulation and the PET measurements could be improved to 41.2-84.8% (on average 64.4%). Using a simulation data pool and a matching strategy, we were able to compare the simulated temporal course of dynamic PET with in vivo measurements. By modifying the Kelly model to an FLOD model, the computational simulation was able to approach the dynamic [18F]FMISO measurements in the investigated tumors.
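The pool-and-threshold matching strategy can be sketched generically; the toy curve model, noise level, and threshold rule below are assumptions for illustration, not the study's actual reaction-diffusion simulation.

```python
import math
import random

# Generic sketch of the matching strategy (synthetic curves, assumed
# threshold rule): a measured time-activity curve (TAC) is compared
# against a pool of simulated TACs, and a simulated TAC "matches" when
# the root-mean-square difference falls below a noise-dependent threshold.
random.seed(1)
ts = [2.0 * i for i in range(60)]            # 2 h scan, 2-min frames

def tac(k_in, k_out):
    """Toy two-rate uptake/washout curve standing in for a simulated TAC."""
    return [k_in * (math.exp(-k_out * t / 60.0) - math.exp(-k_in * t / 60.0))
            for t in ts]

# Pool of simulated TACs over different microenvironment configurations.
pool = [tac(random.uniform(0.5, 2.0), random.uniform(0.05, 0.5))
        for _ in range(500)]

noise_sd = 0.02
measured = [v + random.gauss(0.0, noise_sd) for v in tac(1.2, 0.2)]

def rmse(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

threshold = 2.0 * noise_sd                   # noise-dependent acceptance
matches = [c for c in pool if rmse(measured, c) < threshold]
rate = len(matches) / len(pool)
print(f"matching rate: {rate:.1%}")
```

The matching rate reported in the abstract plays the same role: the fraction of pool configurations whose simulated TAC is statistically indistinguishable from the measurement.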

  16. Conjugate Heat Transfer Analyses on the Manifold for Ramjet Fuel Injectors

    NASA Technical Reports Server (NTRS)

    Wang, Xiao-Yen J.

    2006-01-01

Three-dimensional conjugate heat transfer analyses of the manifold located upstream of the ramjet fuel injector are performed using CFdesign, a finite-element computational fluid dynamics (CFD) software package. The flow field of the hot fuel (JP-7) flowing through the manifold is simulated and the wall temperature of the manifold is computed. The three-dimensional numerical results for the fuel temperature are compared with those obtained from a one-dimensional analysis based on empirical equations, and they show good agreement. The numerical results reveal that it takes around 30 to 40 sec to reach equilibrium, at which point the fuel temperature has dropped about 3 °F from the inlet to the exit of the manifold.

  17. Computational Acoustic Beamforming for Noise Source Identification for Small Wind Turbines.

    PubMed

    Ma, Ping; Lien, Fue-Sang; Yee, Eugene

    2017-01-01

This paper develops a computational acoustic beamforming (CAB) methodology for identifying the noise sources of small wind turbines. The methodology is validated using the case of NACA 0012 airfoil trailing-edge noise. For this validation case, the predicted acoustic maps were in excellent agreement with the measurements obtained from the acoustic beamforming experiment. Following this validation study, the CAB methodology was applied to the identification of noise sources generated by a commercial small wind turbine. The simulated acoustic maps revealed that blade-tower interaction and the wind turbine nacelle were the two primary mechanisms of sound generation for this small wind turbine at frequencies between 100 and 630 Hz.
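The core of such acoustic mapping, conventional (delay-and-sum) beamforming, can be sketched in the frequency domain; the array geometry, frequency, and source position below are illustrative, not the paper's setup.

```python
import cmath
import math

# Toy frequency-domain delay-and-sum sketch (geometry and frequency are
# illustrative assumptions): a line array of microphones records a
# monopole source; steering the array over candidate grid points yields
# an acoustic map whose peak localizes the source.
c, f = 343.0, 1000.0                       # sound speed (m/s), frequency (Hz)
k = 2.0 * math.pi * f / c                  # wavenumber
mics = [(x * 0.05, 0.0) for x in range(-8, 9)]   # 17-mic line array
src = (0.30, 1.0)                          # true source position (m)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# "Measured" complex pressures: monopole with 1/r spreading, phase k*r.
p = [cmath.exp(1j * k * dist(m, src)) / dist(m, src) for m in mics]

# Delay-and-sum: back-propagate each mic signal to every grid point and
# sum; at the true source the phases align and the sum is maximal.
grid = [(gx * 0.05, 1.0) for gx in range(-10, 11)]   # scan line at y = 1 m
def power(g):
    s = sum(pm * dist(m, g) * cmath.exp(-1j * k * dist(m, g))
            for pm, m in zip(p, mics))
    return abs(s) ** 2

amap = [power(g) for g in grid]
peak = grid[max(range(len(grid)), key=amap.__getitem__)]
print(peak)   # coincides with the true source position
```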

  18. Real-time simulation of the TF30-P-3 turbofan engine using a hybrid computer

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Bruton, W. M.

    1974-01-01

    A real-time, hybrid-computer simulation of the TF30-P-3 turbofan engine was developed. The simulation was primarily analog in nature but used the digital portion of the hybrid computer to perform bivariate function generation associated with the performance of the engine's rotating components. FORTRAN listings and analog patching diagrams are provided. The hybrid simulation was controlled by a digital computer programmed to simulate the engine's standard hydromechanical control. Both steady-state and dynamic data obtained from the digitally controlled engine simulation are presented. Hybrid simulation data are compared with data obtained from a digital simulation provided by the engine manufacturer. The comparisons indicate that the real-time hybrid simulation adequately matches the baseline digital simulation.

  19. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

    Paper discusses computer simulation as means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends classrooms in universities, government, and industry be linked to advanced computing centers so computer simulations integrated into education process.

  20. Double-negative metamaterial for mobile phone application

    NASA Astrophysics Data System (ADS)

    Hossain, M. I.; Faruque, M. R. I.; Islam, M. T.

    2017-01-01

In this paper, a new design and analysis of a metamaterial and its application to modern handsets are presented. The proposed metamaterial unit-cell design consists of two connected square spiral structures, which increases the effective media ratio. The finite integration technique, as implemented in Computer Simulation Technology Microwave Studio, is utilized in this investigation, and measurements are taken in an anechoic chamber. Good agreement is observed between simulated and measured results. The results indicate that the proposed metamaterial can successfully cover cellular phone frequency bands. Moreover, the use of the proposed metamaterial in modern handset antennas is also analyzed. The results reveal that the proposed metamaterial attachment significantly reduces specific absorption rate values without degrading antenna performance.

  1. Simulated annealing in networks for computing possible arrangements for red and green cones

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.

    1987-01-01

Attention is given to network models in which each of the cones of the retina is given a provisional color at random, and the cones are then allowed to determine the colors of their neighbors through an iterative process. A symmetric-structure spin-glass model has allowed arrays to be generated from completely random arrangements of red and green to arrays with approximately as much disorder as the parafoveal cones. Simulated annealing has also been added to the process in an attempt to generate color arrangements with greater regularity, and hence more revealing moiré patterns, than the arrangements yielded by quenched spin-glass processes. Attention is given to the perceptual implications of these results.
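The quenched-versus-annealed contrast can be sketched with a toy antiferromagnetic Ising model (an assumption standing in for the paper's spin-glass formulation): Metropolis annealing of a randomly two-colored lattice drives it toward a regular, checkerboard-like arrangement with far fewer same-color neighbors.

```python
import math
import random

# Toy sketch (an antiferromagnetic Ising stand-in, not the paper's exact
# model): cones on a square lattice are colored red (+1) or green (-1) at
# random, the energy counts same-color neighbor pairs, and Metropolis
# annealing drives the array toward a regular, checkerboard-like mosaic.
random.seed(2)
N = 20
spins = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(N)]

def same_color_neighbors(s, i, j):
    """Number of the 4 neighbors (periodic) sharing (i, j)'s color."""
    nbrs = (s[(i + 1) % N][j], s[(i - 1) % N][j],
            s[i][(j + 1) % N], s[i][(j - 1) % N])
    return sum(1 for n in nbrs if n == s[i][j])

def total_energy(s):
    """Total number of same-color neighbor pairs (each counted once)."""
    return sum(same_color_neighbors(s, i, j)
               for i in range(N) for j in range(N)) // 2

e0 = total_energy(spins)
T = 2.0
for _ in range(200):                      # slow cooling schedule
    for _ in range(N * N):
        i, j = random.randrange(N), random.randrange(N)
        dE = 4 - 2 * same_color_neighbors(spins, i, j)  # cost of a flip
        if dE <= 0 or random.random() < math.exp(-dE / T):
            spins[i][j] *= -1
    T *= 0.97
print(e0, "->", total_energy(spins))      # disorder decreases markedly
```

Quenching (dropping T to zero immediately) instead freezes in more disorder, which is the contrast the abstract draws.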

  2. Structure of turbulent non-premixed flames modeled with two-step chemistry

    NASA Technical Reports Server (NTRS)

    Chen, J. H.; Mahalingam, S.; Puri, I. K.; Vervisch, L.

    1992-01-01

Direct numerical simulations of turbulent diffusion flames modeled with finite-rate, two-step chemistry, A + B yields I, A + I yields P, were carried out. A detailed analysis of the turbulent flame structure reveals the complex nature of the penetration of various reactive species across two reaction zones in mixture fraction space. Due to this two-zone structure, these flames were found to be robust, resisting extinction over the parameter ranges investigated. As in single-step computations, the mixture fraction dissipation rate and the mixture fraction were found to be statistically correlated. Simulations involving unequal molecular diffusivities suggest that the small-scale mixing process and, hence, the turbulent flame structure are sensitive to the Schmidt number.

  3. Tuning the critical solution temperature of polymers by copolymerization

    NASA Astrophysics Data System (ADS)

    Schulz, Bernhard; Chudoba, Richard; Heyda, Jan; Dzubiella, Joachim

    2015-12-01

    We study statistical copolymerization effects on the upper critical solution temperature (CST) of generic homopolymers by means of coarse-grained Langevin dynamics computer simulations and mean-field theory. Our systematic investigation reveals that the CST can change monotonically or non-monotonically with copolymerization, as observed in experimental studies, depending on the degree of non-additivity of the monomer (A-B) cross-interactions. The simulation findings are confirmed and qualitatively explained by a combination of a two-component Flory-de Gennes model for polymer collapse and a simple thermodynamic expansion approach. Our findings provide some rationale behind the effects of copolymerization and may be helpful for tuning CST behavior of polymers in soft material design.

  4. A novel pH-responsive interpolyelectrolyte hydrogel complex for the oral delivery of levodopa. Part I. IPEC modeling and synthesis.

    PubMed

    Ngwuluka, Ndidi C; Choonara, Yahya E; Kumar, Pradeep; du Toit, Lisa C; Khan, Riaz A; Pillay, Viness

    2015-03-01

This study was undertaken to synthesize an interpolyelectrolyte complex (IPEC) of polymethacrylate (E100) and sodium carboxymethylcellulose (NaCMC) to form a polymeric hydrogel material for application in specialized oral delivery of sensitive levodopa. Computational modeling was employed to provide insight into the interactions between the polymers. In addition, the reaction profile of NaCMC and polymethacrylate was elucidated using molecular mechanics energy relationships (MMER) and molecular dynamics simulations (MDS) by exploring the spatial disposition of NaCMC and E100 with respect to each other. Computational modeling revealed that the formation of the IPEC was due to strong ionic associations, hydrogen bonding, and hydrophilic interactions. The computational results corroborated the experimental and analytical data. © 2014 Wiley Periodicals, Inc.

  5. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

    ERIC Educational Resources Information Center

    Bronson, Richard

    1986-01-01

    Suggests the use of computer simulation of continuous systems as a problem solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)

  6. Launch Site Computer Simulation and its Application to Processes

    NASA Technical Reports Server (NTRS)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  7. Protocols for Handling Messages Between Simulation Computers

    NASA Technical Reports Server (NTRS)

    Balcerowski, John P.; Dunnam, Milton

    2006-01-01

    Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.

  8. Reversible simulation of irreversible computation

    NASA Astrophysics Data System (ADS)

    Li, Ming; Tromp, John; Vitányi, Paul

    1998-09-01

Computer computations are generally irreversible while the laws of physics are reversible. This mismatch is penalized by, among other things, the generation of excess thermal entropy in the computation. Computing performance has improved to the extent that efficiency degrades unless all algorithms are executed reversibly, for example by a universal reversible simulation of irreversible computations. All known reversible simulations are either space hungry or time hungry. The leanest method was proposed by Bennett and can be analyzed using a simple ‘reversible’ pebble game. The reachable reversible simulation instantaneous descriptions (pebble configurations) of such pebble games are characterized completely. As a corollary we obtain the reversible simulation by Bennett and, moreover, show that it is a space-optimal pebble game. We also introduce irreversible steps and give a theorem on the tradeoff between the number of allowed irreversible steps and the memory gain in the pebble game. In this resource-bounded setting the limited erasing needs to be performed at precise instants during the simulation. The reversible simulation can be modified so that it is applicable also when the simulated computation time is unknown.
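Bennett's strategy in the reversible pebble game can be made concrete: a pebble may be placed on or removed from node i only when node i-1 carries a pebble (node 0, the input, is always available), and the recursive schedule below reaches node 2^k using k+1 pebbles in 3^k moves.

```python
# Sketch of Bennett's reversible-simulation strategy as the 'reversible'
# pebble game: a pebble may be placed on or removed from node i only when
# node i-1 carries a pebble (node 0 is the always-available input).
# Bennett's recursive schedule pebbles node 2**k with k+1 pebbles,
# trading a 3**k move count for logarithmic space.

def make_game():
    pebbles, trace = set(), {"max": 0, "moves": 0}
    def toggle(i):
        if i != 1 and (i - 1) not in pebbles:
            raise ValueError(f"illegal move at node {i}")
        pebbles.symmetric_difference_update({i})  # place or remove
        trace["moves"] += 1
        trace["max"] = max(trace["max"], len(pebbles))
    return pebbles, trace, toggle

def pebble(toggle, s, n):          # place a pebble on node s + n
    if n == 1:
        toggle(s + 1)
    else:
        pebble(toggle, s, n // 2)
        pebble(toggle, s + n // 2, n // 2)
        unpebble(toggle, s, n // 2)

def unpebble(toggle, s, n):        # remove the pebble from node s + n
    if n == 1:
        toggle(s + 1)
    else:
        pebble(toggle, s, n // 2)
        unpebble(toggle, s + n // 2, n // 2)
        unpebble(toggle, s, n // 2)

for k in range(1, 7):
    pebbles, trace, toggle = make_game()
    pebble(toggle, 0, 2 ** k)
    print(f"n=2^{k}: {trace['max']} pebbles, {trace['moves']} moves")
```

Every move is validated against the pebbling rule, so the run itself certifies that the schedule is legal; the move count 3^k versus space k+1 is exactly the space-time tradeoff discussed above.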

  9. Massively parallel quantum computer simulator

    NASA Astrophysics Data System (ADS)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray X1E, an SGI Altix 3700, and clusters of PCs running Windows XP. We study the performance of the software by simulating quantum computers containing up to 36 qubits, using up to 4096 processors and up to 1 TB of memory. Our results demonstrate that the simulator exhibits nearly ideal scaling as a function of the number of processors and suggest that the simulation software described in this paper may also serve as a benchmark for testing high-end parallel computers.
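At small scale, the core of such a simulator is a dense state-vector update; a minimal serial sketch follows (the paper's software distributes the 2^n amplitudes over many processors, which this sketch omits).

```python
# Minimal state-vector sketch of what such a simulator does at small
# scale: store all 2**n complex amplitudes and apply gates by pairwise
# amplitude updates. (The paper's code distributes this vector across
# thousands of processors; this illustrative sketch is serial.)

def apply_1q(state, target, g):
    """Apply a 2x2 gate g = [[a, b], [c, d]] to qubit `target`."""
    step = 1 << target
    for i in range(len(state)):
        if not i & step:                      # i has target bit = 0
            x, y = state[i], state[i | step]
            state[i] = g[0][0] * x + g[0][1] * y
            state[i | step] = g[1][0] * x + g[1][1] * y

def apply_cnot(state, control, target):
    """Swap amplitude pairs wherever the control bit is set."""
    for i in range(len(state)):
        if i & (1 << control) and not i & (1 << target):
            j = i | (1 << target)
            state[i], state[j] = state[j], state[i]

n = 2
state = [0j] * (1 << n)
state[0] = 1 + 0j                             # start in |00>
h = [[1 / 2 ** 0.5, 1 / 2 ** 0.5],
     [1 / 2 ** 0.5, -1 / 2 ** 0.5]]
apply_1q(state, 0, h)                         # Hadamard on qubit 0
apply_cnot(state, 0, 1)                       # entangle -> Bell state
print([abs(a) ** 2 for a in state])           # measurement probabilities
```

The memory wall is immediate from this layout: each added qubit doubles the amplitude vector, which is why 36 qubits already demand on the order of 1 TB.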

  10. Molecular dynamics simulation reveals how phosphorylation of tyrosine 26 of phosphoglycerate mutase 1 upregulates glycolysis and promotes tumor growth.

    PubMed

    Wang, Yan; Cai, Wen-Sheng; Chen, Luonan; Wang, Guanyu

    2017-02-14

Phosphoglycerate mutase 1 (PGAM1) catalyzes the eighth step of glycolysis and is often found upregulated in cancer cells. To test the hypothesis that phosphorylation of the tyrosine 26 residue of PGAM1 greatly enhances its activity, we performed both conventional and steered molecular dynamics simulations of the binding and unbinding of PGAM1 to its substrates, with tyrosine 26 either phosphorylated or not. We analyzed the simulated data in terms of structural stability, hydrogen bond formation, binding free energy, etc. We found that tyrosine 26 phosphorylation enhances the binding of PGAM1 to its substrates by generating an electrostatic environment and structural features that are advantageous to binding. Our results may provide valuable insights for the computer-aided design of drugs that specifically target cancer cells with PGAM1 tyrosine 26 phosphorylated.

  11. Mechanosensitive Channels: Insights from Continuum-Based Simulations

    PubMed Central

    Tang, Yuye; Yoo, Jejoong; Yethiraj, Arun; Cui, Qiang; Chen, Xi

    2009-01-01

Mechanotransduction plays an important role in regulating cell functions and is an active topic of research in biophysics. Despite recent advances in experimental and numerical techniques, the intrinsic multiscale nature of mechanotransduction imposes tremendous challenges for revealing the working mechanisms of mechanosensitive channels. Recently, a continuum-mechanics-based hierarchical modeling and simulation framework has been established and applied to study the mechanical responses and gating behaviors of a prototypical mechanosensitive channel, the mechanosensitive channel of large conductance (MscL) in the bacterium Escherichia coli (E. coli), from which several putative gating mechanisms have been tested and new insights deduced. This article reviews these latest findings using the continuum mechanics framework and suggests possible improvements for future simulation studies. This computationally efficient and versatile continuum-mechanics-based protocol is poised to make contributions to the study of a variety of mechanobiology problems. PMID:18787764

  12. Advancements in Electromagnetic Wave Backscattering Simulations: Applications in Active Lidar Remote Sensing Involving Aerosols

    NASA Astrophysics Data System (ADS)

    Bi, L.

    2016-12-01

Atmospheric remote sensing based on the lidar technique fundamentally relies on knowledge of the backscattering of light by particulate matter in the atmosphere. This talk starts with a review of the current capabilities of electromagnetic wave scattering simulations to determine the backscattering optical properties of irregular particles, such as the backscatter coefficient and the depolarization ratio. This will be followed by a discussion of possible pitfalls in the relevant simulations. The talk will then be concluded with reports on the latest advancements in computational techniques. In addition, we summarize how the backscattering optical properties of aerosols depend on particle geometry, particle size, and mixing rules. These advancements will be applied to the analysis of lidar observation data to reveal the state and possible microphysical processes of various aerosols.

  13. Study of Nanocomposites of Amino Acids and Organic Polyethers by Means of Mass Spectrometry and Molecular Dynamics Simulation

    NASA Astrophysics Data System (ADS)

    Zobnina, V. G.; Kosevich, M. V.; Chagovets, V. V.; Boryak, O. A.

The problem of elucidating the structure of nanomaterials based on combinations of proteins and polyether polymers is addressed at the monomeric level of single amino acids and oligomers of the polyethers PEG-400 and OEG-5. The efficiency of a combined approach involving experimental electrospray mass spectrometry and computer modeling by molecular dynamics simulation is demonstrated. It is shown that oligomers of the polyethers form stable complexes with the amino acids valine, proline, histidine, glutamic acid, and aspartic acid. Molecular dynamics simulation has shown that stabilization of the amino acid-polyether complexes is achieved through winding of the polymeric chain around the charged groups of the amino acids. The structural motifs revealed for complexes of single amino acids with polyethers may also be realized in the structures of protein-polyether nanoparticles currently being designed for drug delivery.

  14. A review of the analytical simulation of aircraft crash dynamics

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Carden, Huey D.; Boitnott, Richard L.; Hayduk, Robert J.

    1990-01-01

A large number of full-scale tests of general aviation aircraft, helicopters, and one unique air-to-ground controlled impact of a transport aircraft were performed. Additionally, research was conducted on seat dynamic performance, load-limiting seats, load-limiting subfloor designs, and emergency locator transmitters (ELTs). Computer programs were developed to provide designers with methods for predicting accelerations, velocities, and displacements of collapsing structure and for estimating the human response to crash loads. The results of full-scale aircraft and component tests were used to verify and guide the development of analytical simulation tools and to demonstrate impact-load-attenuating concepts. Analytical simulation of metal and composite aircraft crash dynamics is addressed. Finite element models are examined to determine their degree of corroboration by experimental data and to reveal deficiencies requiring further development.

  15. Composition and Manufacturing Effects on Electrical Conductivity of Li/FeS 2 Thermal Battery Cathodes

    DOE PAGES

    Reinholz, Emilee L.; Roberts, Scott A.; Apblett, Christopher A.; ...

    2016-06-11

The electrical conductivity is key to the performance of thermal battery cathodes. In this work we present the effects of manufacturing and processing conditions on the electrical conductivity of Li/FeS2 thermal battery cathodes. Finite element simulations were used to compute the conductivity of three-dimensional microcomputed-tomography cathode microstructures and to compare the results to experimental impedance spectroscopy measurements. A regression analysis reveals a predictive relationship between composition, processing conditions, and electrical conductivity, a trend which is largely erased after thermally induced deformation. Moreover, the trend applies to both experimental and simulation results, although it is less apparent in the simulations. This research is a step toward a more fundamental understanding of the effects of processing and composition on thermal battery component microstructure, properties, and performance.

  16. Computational analysis of nonlinearities within dynamics of cable-based driving systems

    NASA Astrophysics Data System (ADS)

    Anghelache, G. D.; Nastac, S.

    2017-08-01

This paper deals with the computational nonlinear dynamics of mechanical systems containing flexural parts within the actuating scheme, with particular attention to cable-based driving systems. Both functional nonlinearities and the real characteristic of the power supply were assumed, in order to obtain a realistic computer simulation model able to provide reliable results regarding the system dynamics. Transitory and steady regimes during a regular operating cycle were taken into account. The authors present a particular case of a lift system, considered representative for the objective of this study. The simulations were based on values of the essential parameters acquired from experimental tests and/or regular practice in the field. The analysis of the results and the final discussion reveal correlated dynamic aspects of the mechanical parts, the driving system, and the power supply, all of which are potential sources of particular resonances within some transitory phases of the working cycle that can affect structural and functional dynamics. In addition, the influence of the computational hypotheses on both the quantitative and qualitative behaviour of the system is underlined. The most significant outcome of this theoretical and computational research is the development of a unified, feasible model, useful for characterizing nonlinear dynamic effects in systems with cable-based driving schemes and thereby supporting optimization of the operating regime, including dynamic control measures.

  17. Composite self-expanding bioresorbable prototype stents with reinforced compression performance for congenital heart disease application: Computational and experimental investigation.

    PubMed

    Zhao, Fan; Xue, Wen; Wang, Fujun; Liu, Laijun; Shi, Haoqin; Wang, Lu

    2018-08-01

    Stents are vital devices to treat vascular stenosis in pediatric patients with congenital heart disease. Bioresorbable stents (BRSs) have been applied to reduce the challenging complications caused by permanent metal stents. However, BRSs with satisfactory compression performance designed specifically for children with congenital heart disease are still almost entirely lacking, leading to markedly suboptimal outcomes. In this work, composite bioresorbable prototype stents with superior compression resistance were designed by braiding and annealing technology, incorporating poly(p-dioxanone) (PPDO) monofilaments and polycaprolactone (PCL) multifilament. The compression properties of the stent prototypes were investigated. The results revealed that the novel composite prototype stents showed superior compression force compared to the control ones, as well as better recovery ability. Furthermore, the deformation mechanisms were analyzed by computational simulation, which revealed that bonded interlacing points among the yarns play an important role. This research has important clinical implications for bioresorbable stent manufacture and provides an innovative stent design for further study. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Relativistic Channeling of a Picosecond Laser Pulse in a Near-Critical Preformed Plasma

    NASA Astrophysics Data System (ADS)

    Borghesi, M.; MacKinnon, A. J.; Barringer, L.; Gaillard, R.; Gizzi, L. A.; Meyer, C.; Willi, O.; Pukhov, A.; Meyer-Ter-Vehn, J.

    1997-02-01

    Relativistic self-channeling of a picosecond laser pulse in a preformed plasma near critical density has been observed both experimentally and in 3D particle-in-cell simulations. Optical probing measurements indicate the formation of a single pulsating propagation channel, typically of about 5 μm in diameter. The computational results reveal the importance in the channel formation of relativistic electrons traveling with the light pulse and of the corresponding self-generated magnetic field.

  19. Remote Sensing of Aquatic Plants.

    DTIC Science & Technology

    1979-10-01

    remote sensing methods for identification and assessment of expanses of aquatic plants. Both materials and techniques are examined for cost effectiveness and capability to sense aquatic plants on both the local and regional scales. Computer simulation of photographic responses was employed; Landsat, high-altitude photography, side-looking airborne radar, and low-altitude photography were examined to determine the capabilities of each for identifying and assessing aquatic plants. Results of the study revealed Landsat to be the most cost effective for regional surveys,

  20. A salt bridge turns off the foot-pocket in class-II HDACs.

    PubMed

    Zhou, Jingwei; Yang, Zuolong; Zhang, Fan; Luo, Hai-Bin; Li, Min; Wu, Ruibo

    2016-08-21

    Histone deacetylases (HDACs) are promising anticancer targets, and several selective inhibitors have been created based on the architectural differences of foot-pockets among HDACs. However, the "gate-keeper" of the foot-pocket is still controversial. Herein, computational simulations reveal for the first time that a conserved R-E salt bridge plays a critical role in keeping the foot-pocket closed in class-II HDACs. This finding is further substantiated by our mutagenesis experiments.

  1. Space-filling designs for computer experiments: A review

    DOE PAGES

    Joseph, V. Roshan

    2016-01-29

    Improving the quality of a product/process using a computer simulator is a much less expensive option than real physical testing. However, simulation using computationally intensive computer models can be time consuming and, therefore, directly optimizing on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used to overcome this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, special emphasis is given to a recently developed space-filling design called the maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
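    As a hedged illustration (not taken from the article), a basic space-filling design can be generated with SciPy's quasi-Monte Carlo module; the input ranges below are hypothetical milling-process parameters:

```python
import numpy as np
from scipy.stats import qmc

# Space-filling design: 20 simulator runs over 3 input dimensions
sampler = qmc.LatinHypercube(d=3, seed=0)
unit_design = sampler.random(n=20)          # points in [0, 1)^3

# Scale to hypothetical physical ranges (e.g. feed rate, depth of cut, speed)
lower, upper = [0.1, 0.5, 1000.0], [0.5, 2.0, 3000.0]
design = qmc.scale(unit_design, lower, upper)

# Lower centered discrepancy indicates a more uniform space-filling design
print("discrepancy:", qmc.discrepancy(unit_design))
```

Each simulator run is then executed at one design row, and a statistical surrogate (e.g. a Gaussian process) is fitted to the outputs for optimization.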
  3. Solution-limited time stepping method and numerical simulation of single-element rocket engine combustor

    NASA Astrophysics Data System (ADS)

    Lian, Chenzhou

    The focus of this research is to gain a better understanding of the mixing and combustion of propellants in a confined single-element rocket engine combustor. The approach taken is to use unsteady computational simulations of both liquid and gaseous oxygen reacting with gaseous hydrogen to study the effects of transient processes, recirculation regions, and density variations under supercritical conditions. The physics of combustion involve intimate coupling between fluid dynamics, chemical kinetics, and intense energy release, and take place over an exceptionally wide range of scales. In the face of these monumental challenges, it remains the engineer's task to find an acceptable simulation approach and a reliable CFD algorithm for combustion simulations. To provide the computational robustness to allow detailed analyses of such complex problems, we start by investigating a method for enhancing the reliability of implicit computational algorithms and decreasing their sensitivity to initial conditions without adversely impacting their efficiency. Efficient convergence is maintained by specifying a large global CFL number, while reliability is improved by limiting the local CFL number such that the solution change in any cell is less than a specified tolerance. The magnitude of the solution change is estimated from the calculated residual in a manner that requires negligible computational time. The method precludes unphysical excursions in Newton-like iterations in highly non-linear regions where Jacobians are changing rapidly, as well as non-physical results during the computation. The method is tested against a series of problems to identify its characteristics and to verify the approach. The results reveal a substantial improvement in convergence reliability of implicit CFD applications that enables computations starting from simple initial conditions.
The method is applied in the unsteady combustion simulations and allows long time running of the code without user intervention. The initial transient leading to stationary conditions in unsteady combustion simulations is investigated by considering flow establishment in model combustors. The duration of the transient is shown to be dependent on the characteristic turn-over time for recirculation zones and the time for the chamber pressure to reach steady conditions. Representative comparisons of the time-averaged, stationary results with experiment are presented to document the computations. The flow dynamics and combustion for two sizes of chamber diameters and two different wall thermal boundary conditions are investigated to assess the role of the recirculation regions on the mixing/combustion process in rocket engine combustors. Results are presented in terms of both instantaneous and time-averaged solutions. As a precursor to liquid oxygen/gaseous hydrogen (LO2/GH 2) combustion simulations, the evolution of a liquid nitrogen (LN 2) jet initially at a subcritical temperature and injected into a supercritical environment is first investigated and the results are validated against experimental data. Unsteady simulations of non-reacting LO2/GH 2 are then performed for a single element shear coaxial injector. These cold flow calculations are then extended to reacting LO2/GH 2 flows to demonstrate the capability of the numerical procedure for high-density-gradient supercritical reacting flows.
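    The core idea of limiting the per-cell solution change can be sketched in a few lines; this is a schematic scalar analogue under assumed parameters, not the dissertation's algorithm:

```python
import numpy as np

def solution_limited_step(u, residual, dt_global, tol=0.1, eps=1e-12):
    """Scale the local step so the estimated solution change per cell stays
    below a fraction `tol` of the local solution magnitude. The change is
    estimated from the residual, which costs essentially nothing extra."""
    du_est = dt_global * np.abs(residual)            # predicted change
    limit = tol * (np.abs(u) + eps)
    scale = np.minimum(1.0, limit / (du_est + eps))  # per-cell limiter
    return u + scale * dt_global * residual

# Stiff relaxation toward u = 1 from a crude initial guess: an unlimited
# explicit step with dt = 1 would diverge (|1 - 50| > 1), the limited one
# marches steadily toward the fixed point instead.
u = np.full(8, 100.0)
for _ in range(200):
    r = -50.0 * (u - 1.0)          # residual of du/dt = -50 (u - 1)
    u = solution_limited_step(u, r, dt_global=1.0)
print(u[:3])
```

With `tol = 0.1` the iterate approaches the fixed point to within roughly 10 percent and then hovers there; in a real solver the limiter only guards the strongly non-linear transient, after which the full implicit step takes over.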

  4. Computer Support of Operator Training: Constructing and Testing a Prototype of a CAL (Computer Aided Learning) Supported Simulation Environment.

    ERIC Educational Resources Information Center

    Zillesen, P. G. van Schaick; And Others

    Instructional feedback given to the learners during computer simulation sessions may be greatly improved by integrating educational computer simulation programs with hypermedia-based computer-assisted learning (CAL) materials. A prototype of a learning environment of this type called BRINE PURIFICATION was developed for use in corporate training…

  5. A Catalytic Mechanism for Cysteine N-Terminal Nucleophile Hydrolases, as Revealed by Free Energy Simulations

    PubMed Central

    Lodola, Alessio; Branduardi, Davide; De Vivo, Marco; Capoferri, Luigi; Mor, Marco; Piomelli, Daniele; Cavalli, Andrea

    2012-01-01

    The N-terminal nucleophile (Ntn) hydrolases are a superfamily of enzymes specialized in the hydrolytic cleavage of amide bonds. Even though several members of this family are emerging as innovative drug targets for cancer, inflammation, and pain, the processes through which they catalyze amide hydrolysis remains poorly understood. In particular, the catalytic reactions of cysteine Ntn-hydrolases have never been investigated from a mechanistic point of view. In the present study, we used free energy simulations in the quantum mechanics/molecular mechanics framework to determine the reaction mechanism of amide hydrolysis catalyzed by the prototypical cysteine Ntn-hydrolase, conjugated bile acid hydrolase (CBAH). The computational analyses, which were confirmed in water and using different CBAH mutants, revealed the existence of a chair-like transition state, which might be one of the specific features of the catalytic cycle of Ntn-hydrolases. Our results offer new insights on Ntn-mediated hydrolysis and suggest possible strategies for the creation of therapeutically useful inhibitors. PMID:22389698

  6. A computationally efficient Bayesian sequential simulation approach for the assimilation of vast and diverse hydrogeophysical datasets

    NASA Astrophysics Data System (ADS)

    Nussbaumer, Raphaël; Gloaguen, Erwan; Mariéthoz, Grégoire; Holliger, Klaus

    2016-04-01

    Bayesian sequential simulation (BSS) is a powerful geostatistical technique, which notably has shown significant potential for the assimilation of datasets that are diverse with regard to the spatial resolution and their relationship. However, these types of applications of BSS require a large number of realizations to adequately explore the solution space and to assess the corresponding uncertainties. Moreover, such simulations generally need to be performed on very fine grids in order to adequately exploit the technique's potential for characterizing heterogeneous environments. Correspondingly, the computational cost of BSS algorithms in their classical form is very high, which so far has limited an effective application of this method to large models and/or vast datasets. In this context, it is also important to note that the inherent assumption regarding the independence of the considered datasets is generally regarded as being too strong in the context of sequential simulation. To alleviate these problems, we have revisited the classical implementation of BSS and incorporated two key features to increase the computational efficiency. The first feature is a combined quadrant spiral - superblock search, which targets run-time savings on large grids and adds flexibility with regard to the selection of neighboring points using equal directional sampling and treating hard data and previously simulated points separately. The second feature is a constant path of simulation, which enhances the efficiency for multiple realizations. We have also modified the aggregation operator to be more flexible with regard to the assumption of independence of the considered datasets. This is achieved through log-linear pooling, which essentially allows for attributing weights to the various data components. 
Finally, a multi-grid simulation path was created to enforce large-scale variance and to allow for adapting parameters, such as the log-linear weights or the type of simulation path, at various scales. The newly implemented search method for kriging reduces the computational cost from an exponential dependence on the grid size in the original algorithm to a linear relationship, as each neighborhood search becomes independent of the grid size. For the considered examples, our results show a sevenfold reduction in run time for each additional realization when a constant simulation path is used. The traditional criticism that constant-path techniques introduce a bias into the simulations was explored, and our findings do indeed reveal a minor reduction in the diversity of the simulations. This bias can, however, be largely eliminated by changing the path type at different scales through the use of the multi-grid approach. Finally, we show that adapting the aggregation weight at each scale considered in our multi-grid approach allows for reproducing the variogram, the histogram, and the spatial trend of the underlying data.
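    The log-linear pooling operator mentioned above can be sketched as follows; the probabilities and weights are hypothetical, not taken from the study:

```python
import numpy as np

def log_linear_pool(p_list, weights):
    """Log-linear (geometric) pooling of categorical distributions:
    pooled(x) is proportional to the product of p_i(x)**w_i, so the
    weights tune how much influence each data source receives."""
    logp = sum(w * np.log(np.asarray(p) + 1e-300)
               for p, w in zip(p_list, weights))
    pooled = np.exp(logp - logp.max())     # shift for numerical stability
    return pooled / pooled.sum()

# Two sources for three facies: borehole-based kriging probabilities and a
# geophysics-based likelihood (both hypothetical)
p_kriging = [0.7, 0.2, 0.1]
p_geophys = [0.3, 0.4, 0.3]
print(log_linear_pool([p_kriging, p_geophys], weights=[0.8, 0.2]))
```

Setting both weights to 1 recovers the classical independence assumption of BSS; unequal weights relax it, which is the flexibility the abstract describes.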

  7. Computer Simulation in Mass Emergency and Disaster Response: An Evaluation of Its Effectiveness as a Tool for Demonstrating Strategic Competency in Emergency Department Medical Responders

    ERIC Educational Resources Information Center

    O'Reilly, Daniel J.

    2011-01-01

    This study examined the capability of computer simulation as a tool for assessing the strategic competency of emergency department nurses as they responded to authentically computer simulated biohazard-exposed patient case studies. Thirty registered nurses from a large, urban hospital completed a series of computer-simulated case studies of…

  8. Space Ultrareliable Modular Computer (SUMC) instruction simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1972-01-01

    The design principles, description, functional operation, and recommended expansion and enhancements are presented for the Space Ultrareliable Modular Computer interpretive simulator. Included as appendices are the user's manual, program module descriptions, target instruction descriptions, simulator source program listing, and a sample program printout. In discussing the design and operation of the simulator, the key problems involving host computer independence and target computer architectural scope are brought into focus.

  9. Density Functional Computations and Molecular Dynamics Simulations of the Triethylammonium Triflate Protic Ionic Liquid.

    PubMed

    Mora Cardozo, Juan F; Burankova, T; Embs, J P; Benedetto, A; Ballone, P

    2017-12-21

    Systematic molecular dynamics simulations based on an empirical force field have been carried out for samples of triethylammonium trifluoromethanesulfonate (triethylammonium triflate, [TEA][Tf]), covering a wide temperature range 200 K ≤ T ≤ 400 K and analyzing a broad set of properties, from self-diffusion and electrical conductivity to rotational relaxation and hydrogen-bond dynamics. The study is motivated by recent quasi-elastic neutron scattering and differential scanning calorimetry measurements on the same system, revealing two successive first order transitions at T ≈ 230 and 310 K (on heating), as well as an intriguing and partly unexplained variety of subdiffusive motions of the acidic proton. Simulations show a weakly discontinuous transition at T = 310 K and highlight an anomaly at T = 260 K in the rotational relaxation of ions that we identify with the simulation analogue of the experimental transition at T = 230 K. Thus, simulations help identifying the nature of the experimental transitions, confirming that the highest temperature one corresponds to melting, while the one taking place at lower T is a transition from the crystal, stable at T ≤ 260 K, to a plastic phase (260 ≤ T ≤ 310 K), in which molecules are able to rotate without diffusing. Rotations, in particular, account for the subdiffusive motion seen at intermediate T both in the experiments and in the simulation. The structure, distribution, and strength of hydrogen bonds are investigated by molecular dynamics and by density functional computations. Clustering of ions of the same sign and the effect of contamination by water at 1% wgt concentration are discussed as well.
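    As a generic illustration of how self-diffusion is typically extracted from such simulations (a sketch on synthetic data, not the [TEA][Tf] trajectories), the Einstein relation can be applied to unwrapped coordinates:

```python
import numpy as np

def self_diffusion(positions, dt):
    """Einstein relation in 3D: D = lim MSD(t) / (6 t).
    `positions` has shape (n_frames, n_atoms, 3), unwrapped coordinates."""
    disp = positions - positions[0]
    msd = (disp ** 2).sum(axis=2).mean(axis=1)     # average over atoms
    t = dt * np.arange(len(msd))
    half = len(t) // 2                             # fit the late, linear regime
    slope = np.polyfit(t[half:], msd[half:], 1)[0]
    return slope / 6.0

# Synthetic random walk calibrated so the true D is 0.5 (per-axis step
# variance 2*D*dt = 1); real input would be an unwrapped MD trajectory.
rng = np.random.default_rng(1)
steps = rng.normal(scale=1.0, size=(4000, 200, 3))
traj = np.cumsum(steps, axis=0)
print("estimated D:", self_diffusion(traj, dt=1.0))
```

Subdiffusive motion of the kind discussed in the abstract shows up as a sublinear MSD over the fitted window, which is why distinguishing rotation from translation matters before applying this formula.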

  10. Accelerating simulation for the multiple-point statistics algorithm using vector quantization

    NASA Astrophysics Data System (ADS)

    Zuo, Chen; Pan, Zhibin; Liang, Hao

    2018-03-01

    Multiple-point statistics (MPS) is a prominent algorithm to simulate categorical variables based on a sequential simulation procedure. Assuming training images (TIs) as prior conceptual models, MPS extracts patterns from the TIs using a template and records their occurrences in a database. However, complex patterns increase the size of the database and require considerable time to retrieve the desired elements. In order to speed up simulation and improve simulation quality over state-of-the-art MPS methods, we propose an accelerated MPS simulation method based on vector quantization (VQ), called VQ-MPS. First, a variable representation is presented to make categorical variables applicable for vector quantization. Second, we adopt a tree-structured VQ to compress the database so that stationary simulations are realized. Finally, a transformed template and classified VQ are used to address nonstationarity. A two-dimensional (2D) stationary channelized reservoir image is used to validate the proposed VQ-MPS. In comparison with several existing MPS programs, our method exhibits significantly better performance in terms of computational time, pattern reproduction, and spatial uncertainty. Further demonstrations consist of a 2D four-facies simulation, two 2D nonstationary channel simulations, and a three-dimensional (3D) rock simulation. The results reveal that our proposed method is also capable of handling multiple facies, nonstationarity, and 3D simulations based on 2D TIs.
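    The compression step can be illustrated with a flat (rather than tree-structured) vector quantizer on a toy pattern database; everything below is a schematic stand-in for the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "pattern database": 500 binary 3x3 templates flattened to 9-vectors
patterns = (rng.random((500, 9)) > 0.5).astype(float)

# Build a 16-codeword codebook with simple Lloyd (k-means) iterations, then
# represent each pattern by the index of its nearest codeword: the essence
# of vector quantization
codebook = patterns[rng.choice(len(patterns), 16, replace=False)].copy()
for _ in range(20):
    d = ((patterns[:, None, :] - codebook[None]) ** 2).sum(axis=2)
    labels = d.argmin(axis=1)
    for j in range(16):
        members = patterns[labels == j]
        if len(members):
            codebook[j] = members.mean(axis=0)

# Retrieval now searches 16 codewords instead of 500 raw patterns
query = patterns[0]
print("nearest codeword:", ((codebook - query) ** 2).sum(axis=1).argmin())
```

A tree-structured codebook, as in the paper, further reduces the search from linear in the codebook size to logarithmic.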

  11. Computer-aided Instructional System for Transmission Line Simulation.

    ERIC Educational Resources Information Center

    Reinhard, Erwin A.; Roth, Charles H., Jr.

    A computer-aided instructional system has been developed which utilizes dynamic computer-controlled graphic displays and which requires student interaction with a computer simulation in an instructional mode. A numerical scheme has been developed for digital simulation of a uniform, distortionless transmission line with resistive terminations and…

  12. Verifying the Simulation Hypothesis via Infinite Nested Universe Simulacrum Loops

    NASA Astrophysics Data System (ADS)

    Sharma, Vikrant

    2017-01-01

    The simulation hypothesis proposes that local reality exists as a simulacrum within a hypothetical computer's dimension. More specifically, Bostrom's trilemma proposes that the number of simulations an advanced 'posthuman' civilization could produce makes the proposition very likely. In this paper a hypothetical method to verify the simulation hypothesis is discussed using infinite regression applied to a new type of infinite loop. Assign dimension n to any computer in our present reality, where dimension signifies the hierarchical level in nested simulations our reality exists in. A computer simulating known reality would be dimension (n-1), and likewise a computer simulating an artificial reality, such as a video game, would be dimension (n +1). In this method, among others, four key assumptions are made about the nature of the original computer dimension n. Summations show that regressing such a reality infinitely will create convergence, implying that the verification of whether local reality is a grand simulation is feasible to detect with adequate compute capability. The action of reaching said convergence point halts the simulation of local reality. Sensitivities to the four assumptions and implications are discussed.
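    The convergence argument rests on a geometric-series observation: if each nested level can only command a fraction r < 1 of its parent's compute, the total across all levels is finite. A minimal numerical check (our illustration, not the paper's derivation):

```python
# Each nested simulation at depth n runs at a fraction r of its parent's
# capacity; the total compute across infinitely many levels is the
# geometric series sum of r**n, which converges to 1 / (1 - r) for r < 1.
r = 0.5
partial_sums = [sum(r ** k for k in range(n + 1)) for n in range(60)]
print(partial_sums[-1])   # approaches 1 / (1 - r) = 2
```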

  13. Criteria for Modeling in LES of Multicomponent Fuel Flow

    NASA Technical Reports Server (NTRS)

    Bellan, Josette; Selle, Laurent

    2009-01-01

    A report presents a study addressing the question of which large-eddy simulation (LES) equations are appropriate for modeling the flow of evaporating drops of a multicomponent liquid in a gas (e.g., a spray of kerosene or diesel fuel in air). The LES equations are obtained from the direct numerical simulation (DNS) equations in which the solution is computed at all flow length scales, by applying a spatial low-pass filter. Thus, in LES the small scales are removed and replaced by terms that cannot be computed from the LES solution and instead must be modeled to retain the effect of the small scales into the equations. The mathematical form of these models is a subject of contemporary research. For a single-component liquid, there is only one LES formulation, but this study revealed that for a multicomponent liquid, there are two non-equivalent LES formulations for the conservation equations describing the composition of the vapor. Criteria were proposed for selecting the multicomponent LES formulation that gives the best accuracy and increased computational efficiency. These criteria were applied in examination of filtered DNS databases to compute the terms in the LES equations. The DNS databases are from mixing layers of diesel and kerosene fuels. The comparisons resulted in the selection of one of the multicomponent LES formulations as the most promising with respect to all criteria.
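    The filtering operation underlying the LES formulations can be illustrated on a synthetic one-dimensional field; this sketch (our construction, not from the report) shows why the filtered nonlinear term must be modeled:

```python
import numpy as np

def box_filter(u, width=8):
    """Top-hat LES filter on a periodic 1D field: average over `width` cells."""
    return sum(np.roll(u, s) for s in range(-(width // 2), width - width // 2)) / width

# Synthetic "DNS" field with resolved and small-scale content
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
u = np.sin(x) + 0.2 * np.sin(40 * x)

# Filtering and nonlinear products do not commute: the difference below is
# the subgrid term that LES must model after the small scales are removed
tau = box_filter(u * u) - box_filter(u) * box_filter(u)
print("max subgrid stress:", np.abs(tau).max())
```

For a multicomponent vapor, each composition variable carries its own filtered product of this kind, which is the origin of the two non-equivalent formulations discussed in the report.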

  14. Self-consistent gyrokinetic modeling of neoclassical and turbulent impurity transport

    NASA Astrophysics Data System (ADS)

    Estève, D.; Sarazin, Y.; Garbet, X.; Grandgirard, V.; Breton, S.; Donnel, P.; Asahi, Y.; Bourdelle, C.; Dif-Pradalier, G.; Ehrlacher, C.; Emeriau, C.; Ghendrih, Ph.; Gillot, C.; Latu, G.; Passeron, C.

    2018-03-01

    Trace impurity transport is studied with the flux-driven gyrokinetic GYSELA code (Grandgirard et al 2016 Comput. Phys. Commun. 207 35). A reduced and linearized multi-species collision operator has been recently implemented, so that both neoclassical and turbulent transport channels can be treated self-consistently on an equal footing. In the Pfirsch-Schlüter regime that is probably relevant for tungsten, the standard expression for the neoclassical impurity flux is shown to be recovered from gyrokinetics with the employed collision operator. Purely neoclassical simulations of deuterium plasma with trace impurities of helium, carbon and tungsten lead to impurity diffusion coefficients, inward pinch velocities due to density peaking, and thermo-diffusion terms which quantitatively agree with neoclassical predictions and NEO simulations (Belli et al 2012 Plasma Phys. Control. Fusion 54 015015). The thermal screening factor appears to be less than predicted analytically in the Pfirsch-Schlüter regime, which can be detrimental to fusion performance. Finally, self-consistent nonlinear simulations have revealed that the tungsten impurity flux is not the sum of turbulent and neoclassical fluxes computed separately, as is usually assumed. The synergy partly results from the turbulence-driven in-out poloidal asymmetry of tungsten density. This result suggests the need for self-consistent simulations of impurity transport, i.e. including both turbulence and neoclassical physics, in view of quantitative predictions for ITER.

  15. Insights from molecular dynamics simulations for computational protein design.

    PubMed

    Childers, Matthew Carter; Daggett, Valerie

    2017-02-01

    A grand challenge in the field of structural biology is to design and engineer proteins that exhibit targeted functions. Although much success on this front has been achieved, design success rates remain low, an ever-present reminder of our limited understanding of the relationship between amino acid sequences and the structures they adopt. In addition to experimental techniques and rational design strategies, computational methods have been employed to aid in the design and engineering of proteins. Molecular dynamics (MD) is one such method that simulates the motions of proteins according to classical dynamics. Here, we review how insights into protein dynamics derived from MD simulations have influenced the design of proteins. One of the greatest strengths of MD is its capacity to reveal information beyond what is available in the static structures deposited in the Protein Data Bank. In this regard simulations can be used to directly guide protein design by providing atomistic details of the dynamic molecular interactions contributing to protein stability and function. MD simulations can also be used as a virtual screening tool to rank, select, identify, and assess potential designs. MD is uniquely poised to inform protein design efforts where the application requires realistic models of protein dynamics and atomic level descriptions of the relationship between dynamics and function. Here, we review cases where MD simulations were used to modulate protein stability and protein function by providing information regarding the conformation(s), conformational transitions, interactions, and dynamics that govern stability and function. In addition, we discuss cases where conformations from protein folding/unfolding simulations have been exploited for protein design, yielding novel outcomes that could not be obtained from static structures.

  16. Hybrid Large Eddy Simulation / Reynolds Averaged Navier-Stokes Modeling in Directed Energy Applications

    NASA Astrophysics Data System (ADS)

    Zilberter, Ilya Alexandrovich

    In this work, a hybrid Large Eddy Simulation / Reynolds-Averaged Navier-Stokes (LES/RANS) turbulence model is applied to simulate two flows relevant to directed energy applications. The flow solver blends the Menter Baseline turbulence closure near solid boundaries with a Lenormand-type subgrid model in the free stream, using a blending function that employs the ratio of estimated inner and outer turbulent length scales. A Mach 2.2 mixing nozzle/diffuser system representative of a gas laser is simulated under a range of exit pressures to assess the ability of the model to predict the dynamics of the shock train. The simulation captures the location of the shock train responsible for pressure recovery but under-predicts the rate of pressure increase. Predicted turbulence production at the wall is found to be highly sensitive to the behavior of the RANS turbulence model. A Mach 2.3, high-Reynolds-number, three-dimensional cavity flow is also simulated in order to compute the wavefront aberrations of an optical beam passing through the cavity. The cavity geometry is modeled using an immersed boundary method, and an auxiliary flat-plate simulation is performed to replicate the effects of the wind-tunnel boundary layer on the computed optical path difference. Pressure spectra extracted on the cavity walls agree with empirical predictions based on Rossiter's formula. Proper orthogonal modes of the wavefront aberrations in a beam originating from the cavity center agree well with experimental data despite uncertainty about inflow turbulence levels and boundary-layer thicknesses over the wind-tunnel window. Dynamic mode decomposition of a planar wavefront spanning the cavity reveals that wavefront distortions are driven by shear-layer oscillations at the Rossiter frequencies; these disturbances create eddy shocklets that propagate into the free stream, creating additional optical wavefront distortion.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Huan; Cheng, Liang; Chuah, Mooi Choo

    In the generation, transmission, and distribution sectors of the smart grid, intelligence of field devices is realized by programmable logic controllers (PLCs). Many smart-grid subsystems are essentially cyber-physical energy systems (CPES): For instance, the power system process (i.e., the physical part) within a substation is monitored and controlled by a SCADA network with hosts running miscellaneous applications (i.e., the cyber part). To study the interactions between the cyber and physical components of a CPES, several co-simulation platforms have been proposed. However, the network simulators/emulators of these platforms do not include a detailed traffic model that takes into account the impacts of the execution model of PLCs on traffic characteristics. As a result, network traces generated by co-simulation only reveal the impacts of the physical process on the contents of the traffic generated by SCADA hosts, whereas the distinction between PLCs and computing nodes (e.g., a hardened computer running a process visualization application) has been overlooked. To generate realistic network traces using co-simulation for the design and evaluation of applications relying on accurate traffic profiles, it is necessary to establish a traffic model for PLCs. In this work, we propose a parameterized model for PLCs that can be incorporated into existing co-simulation platforms. We focus on the DNP3 subsystem of slave PLCs, which automates the processing of packets from the DNP3 master. To validate our approach, we extract model parameters from both the configuration and network traces of real PLCs. Simulated network traces are generated and compared against those from PLCs. Our evaluation shows that our proposed model captures the essential traffic characteristics of DNP3 slave PLCs, which can be used to extend existing co-simulation platforms and gain further insights into the behaviors of CPES.
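    A parameterized traffic model of this general flavor can be sketched as follows; the polling period, scan time, and jitter values are purely hypothetical and are not the parameters extracted from the real PLCs in the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical DNP3 polling exchange: the master polls every poll_s seconds;
# the slave PLC finishes its current scan cycle, then spends one full scan
# processing the request, so response latency is shaped by the PLC's cyclic
# execution model rather than by the physical process alone.
poll_s, scan_s, jitter_s = 1.0, 0.010, 0.0005
master_tx = np.arange(0.0, 60.0, poll_s)
wait = rng.uniform(0.0, scan_s, master_tx.size)    # time left in current scan
latency = wait + scan_s + rng.normal(0.0, jitter_s, master_tx.size)
slave_tx = master_tx + latency

print("mean response latency (ms):", 1000.0 * latency.mean())
```

Feeding the resulting timestamps into a network simulator is the sense in which such a model can extend a co-simulation platform.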

  19. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    USDA-ARS?s Scientific Manuscript database

    When the news media talks about models, it could be talking about role models, fashion models, conceptual models like those the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  20. A meta-analysis of outcomes from the use of computer-simulated experiments in science education

    NASA Astrophysics Data System (ADS)

    Lejeune, John Van

    The purpose of this study was to synthesize the findings from existing research on the effects of computer-simulated experiments on students in science education. Results from 40 reports were integrated by the process of meta-analysis to examine the effect of computer-simulated experiments and interactive videodisc simulations on student achievement and attitudes. Findings indicated significant positive differences in both low-level and high-level achievement of students who used computer-simulated experiments and interactive videodisc simulations as compared to students who used more traditional learning activities. No significant differences in retention, student attitudes toward the subject, or toward the educational method were found. Based on the findings of this study, computer-simulated experiments and interactive videodisc simulations should be used to enhance students' learning in science, especially in cases where the use of traditional laboratory activities is expensive, dangerous, or impractical.

  1. Combining Experiments and Simulations Using the Maximum Entropy Principle

    PubMed Central

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-01-01

    A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights into the problem. We highlight each of these contributions in turn and conclude with a discussion of remaining challenges. PMID:24586124
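
    The reweighting at the heart of the maximum entropy approach can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes a toy one-dimensional ensemble, a single scalar observable, maximum-entropy weights of the form w_i ∝ exp(−λ s_i), and a hypothetical experimental value; λ is found by bisection so that the reweighted average matches the constraint.

```python
import numpy as np

# Toy simulated ensemble of a scalar observable s (illustrative only)
rng = np.random.default_rng(0)
s = rng.normal(loc=1.0, scale=0.5, size=10_000)
s_exp = 1.2  # hypothetical experimental average to be matched

def reweighted_mean(lam):
    # Maximum-entropy weights w_i ∝ exp(-lam * s_i): the minimal
    # perturbation of the prior ensemble that shifts <s> toward s_exp.
    w = np.exp(-lam * (s - s.mean()))  # centred for numerical stability
    w /= w.sum()
    return float(np.sum(w * s))

# The reweighted mean decreases monotonically in lam, so bisection
# finds the Lagrange multiplier enforcing the experimental constraint.
lam_lo, lam_hi = -50.0, 50.0
for _ in range(100):
    lam = 0.5 * (lam_lo + lam_hi)
    if reweighted_mean(lam) > s_exp:
        lam_lo = lam
    else:
        lam_hi = lam

print(round(reweighted_mean(lam), 3))  # ≈ 1.2
```

    In the molecular-simulation setting the same multiplier defines the perturbed potential E'(x) = E(x) + λ s(x), which is why the maximum entropy solution corresponds to the smallest bias consistent with the data.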

  2. Core-collapse supernovae as supercomputing science: A status report toward six-dimensional simulations with exact Boltzmann neutrino transport in full general relativity

    NASA Astrophysics Data System (ADS)

    Kotake, Kei; Sumiyoshi, Kohsuke; Yamada, Shoichi; Takiwaki, Tomoya; Kuroda, Takami; Suwa, Yudai; Nagakura, Hiroki

    2012-08-01

    This is a status report on our endeavor to reveal the mechanism of core-collapse supernovae (CCSNe) by large-scale numerical simulations. Multi-dimensionality of the supernova engine, general relativistic magnetohydrodynamics, energy and lepton number transport by neutrinos emitted from the forming neutron star, as well as nuclear interactions there, are all believed to play crucial roles in repelling infalling matter and producing energetic explosions. These ingredients are non-linearly coupled with one another in the dynamics of core collapse, bounce, and shock expansion. Serious quantitative studies of CCSNe hence make extensive numerical computations mandatory. Since neutrinos are in general neither in thermal nor in chemical equilibrium, their distributions in phase space should be computed. This is a six-dimensional (6D) neutrino transport problem and quite a challenge, even for those with access to the most advanced numerical resources such as the "K computer". To tackle this problem, we have embarked on efforts on multiple fronts. In particular, we report in this paper our recent progress in the treatment of multidimensional (multi-D) radiation hydrodynamics. We are currently proceeding on two different paths to the ultimate goal. In one approach, we employ an approximate but highly efficient scheme for neutrino transport and treat 3D hydrodynamics and/or general relativity rigorously; some neutrino-driven explosions will be presented and quantitative comparisons will be made between 2D and 3D models. In the second approach, on the other hand, exact, but so far Newtonian, Boltzmann equations are solved in two and three spatial dimensions; we will show some example test simulations. We will also address the perspectives of exascale computations on the next generation of supercomputers.

  3. Dynamical Approach Study of Spurious Numerics in Nonlinear Computations

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Mansour, Nagi (Technical Monitor)

    2002-01-01

    The last two decades have been an era when computation is ahead of analysis and when very large scale practical computations are increasingly used in poorly understood multiscale complex nonlinear physical problems and non-traditional fields. Ensuring a higher level of confidence in the predictability and reliability (PAR) of these numerical simulations could play a major role in furthering the design, understanding, affordability, and safety of our next generation air and space transportation systems, and systems for planetary and atmospheric sciences, and in understanding the evolution and origin of life. The need to guarantee PAR becomes acute when computations offer the ONLY way of solving these types of data-limited problems. Employing theory from nonlinear dynamical systems, some building blocks to ensure a higher level of confidence in the PAR of numerical simulations have been revealed by the author and world-expert collaborators in relevant fields. Five building blocks with supporting numerical examples were discussed. The next step is to utilize knowledge gained by including nonlinear dynamics, bifurcation, and chaos theories as an integral part of the numerical process. The third step is to design integrated criteria for reliable and accurate algorithms that cater to the different multiscale nonlinear physics. This includes, but is not limited to, the construction of appropriate adaptive spatial and temporal discretizations that are suitable for the underlying governing equations. In addition, a multiresolution wavelets approach for adaptive numerical dissipation/filter controls for high-speed turbulence, acoustics, and combustion simulations will be sought. These steps are cornerstones for guarding against spurious numerical solutions that are solutions of the discretized counterparts but are not solutions of the underlying governing equations.

  4. Entrance surface dose distribution and organ dose assessment for cone-beam computed tomography using measurements and Monte Carlo simulations with voxel phantoms

    NASA Astrophysics Data System (ADS)

    Baptista, M.; Di Maria, S.; Vieira, S.; Vaz, P.

    2017-11-01

    Cone-Beam Computed Tomography (CBCT) enables high-resolution volumetric scanning of the bone and soft tissue anatomy under investigation at the treatment accelerator. This technique is extensively used in Image Guided Radiation Therapy (IGRT) for pre-treatment verification of patient position and target volume localization. When employed daily and several times per patient, CBCT imaging may lead to high cumulative imaging doses to the healthy tissues surrounding the exposed organs. This work aims at (1) evaluating the dose distribution during a CBCT scan and (2) calculating the organ doses involved in this image-guiding procedure for clinically available scanning protocols. Both Monte Carlo (MC) simulations and measurements were performed. To model and simulate the kV imaging system mounted on a linear accelerator (Edge™, Varian Medical Systems), the state-of-the-art MC radiation transport program MCNPX 2.7.0 was used. In order to validate the simulation results, measurements of the Computed Tomography Dose Index (CTDI) were performed using standard PMMA head and body phantoms, 150 mm in length, and a standard pencil ionizing chamber (IC) 100 mm long. Measurements for head and pelvis scanning protocols usually adopted in the clinical environment were acquired, using two acquisition modes (full-fan and half-fan). To calculate the organ doses, the implemented MC model of the CBCT scanner together with a male voxel phantom ("Golem") was used. The agreement between the MCNPX simulations and the CTDIw measurements (differences up to 17%) presented in this work indicates that the CBCT MC model was successfully validated, taking into account the several sources of uncertainty. The adequacy of the computational model to map dose distributions during a CBCT scan is discussed in order to identify ways to reduce the total CBCT imaging dose.
    The organ dose assessment highlights the need to evaluate the therapeutic and CBCT imaging doses in a more balanced approach, and the importance of improving awareness of the increased risk arising from repeated exposures.

  5. Near real-time traffic routing

    NASA Technical Reports Server (NTRS)

    Yang, Chaowei (Inventor); Xie, Jibo (Inventor); Zhou, Bin (Inventor); Cao, Ying (Inventor)

    2012-01-01

    A near real-time physical transportation network routing system comprising: a traffic simulation computing grid and a dynamic traffic routing service computing grid. The traffic simulator produces traffic network travel time predictions for a physical transportation network using a traffic simulation model and common input data. The physical transportation network is divided into multiple sections. Each section has a primary zone and a buffer zone. The traffic simulation computing grid includes multiple traffic simulation computing nodes. The common input data include static network characteristics, an origin-destination data table, dynamic traffic information data, and historical traffic data. The dynamic traffic routing service computing grid includes multiple dynamic traffic routing computing nodes and generates traffic route(s) using the traffic network travel time predictions.

  6. Computational replication of the patient-specific stenting procedure for coronary artery bifurcations: From OCT and CT imaging to structural and hemodynamics analyses.

    PubMed

    Chiastra, Claudio; Wu, Wei; Dickerhoff, Benjamin; Aleiou, Ali; Dubini, Gabriele; Otake, Hiromasa; Migliavacca, Francesco; LaDisa, John F

    2016-07-26

    The optimal stenting technique for coronary artery bifurcations is still debated. With additional advances, computational simulations could soon be used to compare stent designs or strategies based on verified structural and hemodynamics results, in order to identify the optimal solution for each individual's anatomy. In this study, patient-specific simulations of stent deployment were performed for two cases to replicate the complete procedure conducted by interventional cardiologists. Subsequent computational fluid dynamics (CFD) analyses were conducted to quantify hemodynamic quantities linked to restenosis. Patient-specific pre-operative models of coronary bifurcations were reconstructed from CT angiography and optical coherence tomography (OCT). Plaque location and composition were estimated from OCT and assigned to the models, and structural simulations were performed in Abaqus. Artery geometries after virtual expansion of Xience Prime or Nobori stents created in SolidWorks were compared to the post-operative geometry from OCT and CT before being extracted and used for CFD simulations in SimVascular. Inflow boundary conditions based on body surface area were applied, and downstream vascular resistances and capacitances were applied at branches to mimic physiology. Artery geometries obtained after virtual expansion were in good agreement with those reconstructed from patient images. Quantitative comparison of the distance between reconstructed and post-stent geometries revealed a maximum difference in area of 20.4%. Adverse indices of wall shear stress were more pronounced for the thicker Nobori stents in both patients. These findings verify structural analyses of stent expansion, introduce a workflow to combine software packages for solid and fluid mechanics analysis, and underscore important stent design features from prior idealized studies. The proposed approach may ultimately be useful in determining an optimal choice of stent and position for each patient.

  7. Radiotherapy Monte Carlo simulation using cloud computing technology.

    PubMed

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal when n is a factor of the total simulation time in hours. Using the technique, we demonstrate, as a proof of principle, the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware.
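
    The scaling behaviour reported above can be illustrated numerically. Assuming ideal 1/n parallel scaling and per-machine billing rounded up to whole hours (a simplification of typical cloud pricing, not the paper's cost model), the relative cost is exactly 1.0 when n divides the total simulation time in hours:

```python
import math

def completion_time(total_hours, n):
    # Ideal parallel scaling: wall-clock time falls as 1/n
    return total_hours / n

def relative_cost(total_hours, n):
    # Billing: each of the n machines is charged for whole hours used
    return n * math.ceil(completion_time(total_hours, n)) / total_hours

# A 12-hour simulation: cost is optimal (1.0) when n is a factor of 12;
# n = 5 leaves each machine idle for part of its last billed hour.
print([relative_cost(12, n) for n in (2, 3, 5, 12)])  # [1.0, 1.0, 1.25, 1.0]
```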

  8. On efficiency of fire simulation realization: parallelization with greater number of computational meshes

    NASA Astrophysics Data System (ADS)

    Valasek, Lukas; Glasa, Jan

    2017-12-01

    Current fire simulation systems are capable of exploiting the advantages of available high-performance computing (HPC) platforms and of modelling fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC computer cluster is discussed. The parallel MPI version of Fire Dynamics Simulator is used to test the efficiency of selected strategies for allocating the cluster's computational resources when a greater number of computational cores is used. Simulation results indicate that when the number of cores used is not a multiple of the number of cores per cluster node, some allocation strategies provide noticeably more efficient calculations.
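
    The allocation effect described above can be illustrated with a simple count. Assuming whole nodes are reserved (a common cluster policy; the 40-core node size is a hypothetical value, not taken from the paper), a core request that is not a multiple of the node size leaves cores idle on the last node:

```python
import math

def allocation(cores_requested, cores_per_node=40):
    # Whole nodes are reserved; the remainder of the last node sits idle
    nodes = math.ceil(cores_requested / cores_per_node)
    idle = nodes * cores_per_node - cores_requested
    return nodes, idle

# Requests that are multiples of the node size waste nothing;
# e.g. 100 cores occupy 3 nodes and leave 20 cores idle.
for c in (40, 80, 100):
    print(c, allocation(c))
```

    How the simulator's computational meshes are then mapped onto those partially filled nodes is exactly the strategy choice the paper evaluates.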

  9. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.

  10. Simulated transcatheter aortic valve deformation: A parametric study on the impact of leaflet geometry on valve peak stress.

    PubMed

    Li, Kewei; Sun, Wei

    2017-03-01

    In this study, we developed a computational framework to investigate the impact of the leaflet geometry of a transcatheter aortic valve (TAV) on the leaflet stress distribution, aiming at optimizing TAV leaflet design to reduce its peak stress. Utilizing a generic TAV model developed previously [Li and Sun, Annals of Biomedical Engineering, 2010. 38(8): 2690-2701], we first parameterized the 2D leaflet geometry by mathematical equations; then, by perturbing the parameters of the equations, we could automatically generate a new leaflet design, remesh the 2D leaflet model, and build a 3D leaflet model from the 2D design via a Python script. Approximately 500 different leaflet designs were investigated by simulating TAV closure under the nominal circular deployment and physiological loading conditions. From the simulation results, we identified a new leaflet design that could reduce the previously reported valve peak stress by about 5%. The parametric analysis also revealed that increasing the free edge width had the highest overall impact on decreasing the peak stress. A similar computational analysis was further performed for a TAV deployed in an abnormal, asymmetric elliptical configuration. We found that a minimal free edge height of 0.46 mm should be adopted to prevent central backflow leakage. This increase in the free edge height resulted in an increase in the leaflet peak stress. Furthermore, the parametric study revealed a complex response surface for the impact of the leaflet geometric parameters on the peak stress, underscoring the importance of performing a numerical optimization to obtain the optimal TAV leaflet design.
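
    The sweep-and-select workflow described above (perturb geometry parameters, simulate, compare peak stress) can be sketched with a stand-in for the finite-element step. The quadratic surrogate `peak_stress` below is purely hypothetical, used only to show the structure of the parametric study; in the paper each evaluation is a full Abaqus closure simulation, and the 0.46 mm height constraint is the only value taken from the abstract.

```python
import itertools

def peak_stress(width, height):
    # Hypothetical surrogate for the FE result: stress falls with a
    # wider free edge and rises with free edge height (illustrative only).
    return 1.0 - 0.3 * width + 0.2 * height + 0.1 * (width - 1.0) ** 2

MIN_HEIGHT = 0.46  # constraint from the study: prevents central backflow leakage

# Sweep a grid of candidate designs, keeping only feasible ones,
# then select the design with the lowest (surrogate) peak stress.
widths = [0.8 + 0.02 * i for i in range(20)]
heights = [0.40 + 0.02 * i for i in range(10)]
feasible = [(w, h) for w, h in itertools.product(widths, heights) if h >= MIN_HEIGHT]
best = min(feasible, key=lambda d: peak_stress(*d))
print(best, round(peak_stress(*best), 3))
```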

  11. Moving domain computational fluid dynamics to interface with an embryonic model of cardiac morphogenesis.

    PubMed

    Lee, Juhyun; Moghadam, Mahdi Esmaily; Kung, Ethan; Cao, Hung; Beebe, Tyler; Miller, Yury; Roman, Beth L; Lien, Ching-Ling; Chi, Neil C; Marsden, Alison L; Hsiai, Tzung K

    2013-01-01

    Peristaltic contraction of the embryonic heart tube produces time- and spatial-varying wall shear stress (WSS) and pressure gradients (∇P) across the atrioventricular (AV) canal. Zebrafish (Danio rerio) are a genetically tractable system to investigate cardiac morphogenesis. The use of Tg(fli1a:EGFP) (y1) transgenic embryos allowed for delineation and two-dimensional reconstruction of the endocardium. This time-varying wall motion was then prescribed in a two-dimensional moving domain computational fluid dynamics (CFD) model, providing new insights into spatial and temporal variations in WSS and ∇P during cardiac development. The CFD simulations were validated with particle image velocimetry (PIV) across the atrioventricular (AV) canal, revealing an increase in both velocities and heart rates, but a decrease in the duration of atrial systole from early to later stages. At 20-30 hours post fertilization (hpf), simulation results revealed bidirectional WSS across the AV canal in the heart tube in response to peristaltic motion of the wall. At 40-50 hpf, the tube structure undergoes cardiac looping, accompanied by a nearly 3-fold increase in WSS magnitude. At 110-120 hpf, distinct AV valve, atrium, ventricle, and bulbus arteriosus form, accompanied by incremental increases in both WSS magnitude and ∇P, but a decrease in bi-directional flow. Laminar flow develops across the AV canal at 20-30 hpf, and persists at 110-120 hpf. Reynolds numbers at the AV canal increase from 0.07±0.03 at 20-30 hpf to 0.23±0.07 at 110-120 hpf (p < 0.05, n = 6), whereas Womersley numbers remain relatively unchanged from 0.11 to 0.13. Our moving domain simulations highlight hemodynamic changes in relation to cardiac morphogenesis, thereby providing a 2-D quantitative approach to complement imaging analysis.

  12. Oxygen environment and islet size are the primary limiting factors of isolated pancreatic islet survival

    PubMed Central

    Komatsu, Hirotake; Cook, Colin; Wang, Chia-Hao; Medrano, Leonard; Lin, Henry; Kandeel, Fouad; Tai, Yu-Chong; Mullen, Yoko

    2017-01-01

    Background Type 1 diabetes is an autoimmune disease that destroys insulin-producing beta cells in the pancreas. Pancreatic islet transplantation could be an effective treatment option for type 1 diabetes once several issues are resolved, including donor shortage, prevention of islet necrosis and loss pre- and post-transplantation, and optimization of immunosuppression. This study seeks to determine the cause of necrotic loss of isolated islets to improve transplant efficiency. Methodology The oxygen tension inside isolated human islets of different sizes was simulated under varying oxygen environments using a computational in silico model. In vitro human islet viability was also assessed after culturing in different oxygen conditions. Correlation between simulation data and experimentally measured islet viability was examined. Using these in vitro viability data of human islets, the effect of islet diameter and oxygen tension of the culture environment on islet viability was also analyzed using a logistic regression model. Principal findings Computational simulation clearly revealed the oxygen gradient inside the islet structure. We found that oxygen tension in the islet core was markedly lower (hypoxic) than that on the islet surface, due to oxygen consumption by the cells. The hypoxic core expanded in larger islets and in lower-oxygen cultures. These findings were consistent with results from in vitro islet viability assays that measured central necrosis in the islet core, indicating that hypoxia is one of the major causes of central necrosis. The logistic regression analysis revealed a negative effect of large islet size and low-oxygen culture on islet survival. Conclusions/Significance Hypoxic core conditions, induced by the oxygen gradient inside islets, contribute to the development of central necrosis in human isolated islets. Supplying sufficient oxygen during culture could be an effective and practical way to keep isolated islets viable.
PMID:28832685
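
    The oxygen gradient described above has a simple closed form for a spherical islet with zero-order consumption: pO2(r) = pO2_surface − α(R² − r²), where α lumps the consumption rate and diffusivity. The parameter values below are hypothetical, chosen only to illustrate why the core of a large islet becomes anoxic while a small islet stays oxygenated:

```python
def core_po2(radius_um, p_surface=160.0, alpha=2e-3):
    # Steady-state diffusion with zero-order O2 consumption in a sphere;
    # the core (r = 0) sees the largest drop, which grows with the square
    # of the islet radius. Clamped at 0 (anoxia). Units: mmHg, micrometres.
    return max(0.0, p_surface - alpha * radius_um**2)

for r in (75, 250, 400):  # islet radii in micrometres
    print(r, core_po2(r))
```

    The quadratic dependence on radius is why islet size appears alongside the oxygen environment as a primary survival factor.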

  13. A Parallel Sliding Region Algorithm to Make Agent-Based Modeling Possible for a Large-Scale Simulation: Modeling Hepatitis C Epidemics in Canada.

    PubMed

    Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla

    2016-11-01

    Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, the main limitation of ABMs is the high computational cost of large-scale simulation. To improve the computational efficiency of large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABMs and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on real demographic data from Saskatchewan, Canada. The first simulation used the SRA, which processed each postal-code subregion in turn. The second simulation processed the entire population simultaneously. The parallelizable SRA showed computational time savings with comparable results in a province-wide simulation. Using the same method, the SRA can be generalized to perform a country-wide simulation. Thus, this parallel algorithm makes it possible to use ABMs for large-scale simulation with limited computational resources.

  14. Quantum chemistry simulation on quantum computers: theories and experiments.

    PubMed

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, with polynomially scaling resources. Traditionally, such simulations are carried out numerically on classical computers, which inevitably face exponential growth in the required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and their development in both theory and experiment. We then present a brief introduction to quantum chemistry as evaluated on classical computers, followed by typical procedures for quantum simulation of quantum chemistry. We review not only theoretical proposals but also proof-of-principle experimental implementations on a small quantum computer, including the evaluation of static molecular eigenenergies and the simulation of chemical reaction dynamics. Although experimental development still lags behind theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, surpassing classical computations.

  15. An analysis of the 70-meter antenna hydrostatic bearing by means of computer simulation

    NASA Technical Reports Server (NTRS)

    Bartos, R. D.

    1993-01-01

    Recently, the computer program 'A Computer Solution for Hydrostatic Bearings with Variable Film Thickness,' used to design the hydrostatic bearing of the 70-meter antennas, was modified to improve the accuracy with which the program predicts the film height profile and oil pressure distribution between the hydrostatic bearing pad and the runner. This article describes the modified computer program, the theory upon which its computations are based, and the computer simulation results together with a discussion of those results.

  16. Comparative Implementation of High Performance Computing for Power System Dynamic Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Huang, Zhenyu; Diao, Ruisheng

    Dynamic simulation for transient stability assessment is one of the most important, but also most intensive, computations for power system planning and operation. Present commercial software is mainly designed for sequential computation to run a single simulation, which is very time consuming on a single processor. The application of High Performance Computing (HPC) to dynamic simulations is very promising in accelerating the computing process by parallelizing its kernel algorithms while maintaining the same level of computation accuracy. This paper describes the comparative implementation of four parallel dynamic simulation schemes in two state-of-the-art HPC environments: Message Passing Interface (MPI) and Open Multi-Processing (OpenMP). These implementations serve to match the application with dedicated multi-processor computing hardware and maximize the utilization and benefits of HPC during the development process.

  17. The effectiveness of interactive computer simulations on college engineering student conceptual understanding and problem-solving ability related to circular motion

    NASA Astrophysics Data System (ADS)

    Chien, Cheng-Chih

    In the past thirty years, the effectiveness of computer-assisted learning has been found to vary across individual studies. Today, with dramatic technical improvements, computers have become widespread in schools and are used in a variety of ways. In this study, a design model involving educational technology, pedagogy, and the content domain is proposed for effective use of computers in learning. Computer simulation, constructivist and Vygotskian perspectives, and circular motion are the three elements of the specific Chain Model for instructional design. The goal of the physics course is to help students discard ideas that are inconsistent with those of the physics community and rebuild new knowledge. To achieve the learning goal, the strategies of using conceptual conflicts and of using language to internalize specific tasks into mental functions were included. Computer simulations and accompanying worksheets were used to help students explore their own ideas and to generate questions for discussion. Using animated images to describe the dynamic processes involved in circular motion may reduce the complexity and possible miscommunication resulting from verbal explanations. The effectiveness of the instructional material on student learning was evaluated. The results of problem-solving activities show that students using computer simulations had significantly higher scores than students not using computer simulations. For conceptual understanding, on the pretest students in the non-simulation group had significantly higher scores than students in the simulation group; no significant difference was observed between the two groups on the posttest. The relations of gender, prior physics experience, and frequency of computer use outside the course to student achievement were also studied. There were fewer female students than male students and fewer students using computer simulations than students not using them.
    These characteristics limit the statistical power for detecting differences. For future research, more extensive use of simulations may be introduced to explore the potential of computer simulation in helping students learn. A test of conceptual understanding with more problems and an appropriate difficulty level may also be needed.

  18. Paper simulation techniques in user requirements analysis for interactive computer systems

    NASA Technical Reports Server (NTRS)

    Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.

    1979-01-01

    This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist, but is simulated by the experimenters. This allows simulated problem solving early in the design effort, and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.

  19. Acceleration of the matrix multiplication of Radiance three phase daylighting simulations with parallel computing on heterogeneous hardware of personal computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael

    2013-05-23

    Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configuration, a simulation can take hours or even days on a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by conducting parallel computing on the heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.
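
    The computation being accelerated is the three-phase relation E = V T D s, where V is the view matrix, T the BSDF transmission matrix, D the daylight matrix, and s a sky vector. A minimal NumPy sketch (with illustrative sizes, not the actual Radiance dimensions) shows how batching sky vectors as columns of a matrix turns an annual calculation into one chain of multiplications, and why grouping the small matrices first reduces the floating-point work:

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative sizes: 1000 sensor points, 145 window patches,
# 146 sky divisions, 24 hourly sky vectors batched as columns of S.
V = rng.random((1000, 145))
T = rng.random((145, 145))
D = rng.random((145, 146))
S = rng.random((146, 24))

E_left = ((V @ T) @ D) @ S    # large intermediate products
E_right = V @ (T @ (D @ S))   # small intermediates first: far fewer flops
assert np.allclose(E_left, E_right)
print(E_left.shape)  # one illuminance column per sky vector: (1000, 24)
```

    This choice of multiplication order, together with offloading the products to OpenCL devices, is the kind of optimization the paper evaluates.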

  20. Computational composite mechanics for aerospace propulsion structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    Specialty methods are presented for the computational simulation of specific composite behavior. These methods encompass all aspects of composite mechanics, impact, progressive fracture and component specific simulation. Some of these methods are structured to computationally simulate, in parallel, the composite behavior and history from the initial fabrication through several missions and even to fracture. Select methods and typical results obtained from such simulations are described in detail in order to demonstrate the effectiveness of computationally simulating (1) complex composite structural behavior in general and (2) specific aerospace propulsion structural components in particular.

  1. Computational composite mechanics for aerospace propulsion structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1987-01-01

    Specialty methods are presented for the computational simulation of specific composite behavior. These methods encompass all aspects of composite mechanics, impact, progressive fracture and component specific simulation. Some of these methods are structured to computationally simulate, in parallel, the composite behavior and history from the initial fabrication through several missions and even to fracture. Select methods and typical results obtained from such simulations are described in detail in order to demonstrate the effectiveness of computationally simulating: (1) complex composite structural behavior in general, and (2) specific aerospace propulsion structural components in particular.

  2. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  3. MAGIC Computer Simulation. Volume 2: Analyst Manual, Part 1

    DTIC Science & Technology

    1971-05-01

    A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army… The MAGIC computer simulation generates target description data consisting of item-by-item listings of the target's components and air…

  4. Computational simulation of progressive fracture in fiber composites

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    Computational methods for simulating and predicting progressive fracture in fiber composite structures are presented. These methods are integrated into a computer code of modular form. The modules include composite mechanics, finite element analysis, and fracture criteria. The code is used to computationally simulate progressive fracture in composite laminates with and without defects. The simulation tracks the fracture progression in terms of modes initiating fracture, damage growth, and imminent global (catastrophic) laminate fracture.

  5. Using Microcomputer Simulations in the Classroom: Examples from Undergraduate and Faculty Computer Literacy Courses.

    ERIC Educational Resources Information Center

    Hart, Jeffrey A.

    1985-01-01

    Presents a discussion of how computer simulations are used in two undergraduate social science courses and a faculty computer literacy course on simulations and artificial intelligence. Includes a list of 60 simulations for use on mainframes and microcomputers. Entries include type of hardware required, publisher's address, and cost. Sample…

  6. Learning Oceanography from a Computer Simulation Compared with Direct Experience at Sea

    ERIC Educational Resources Information Center

    Winn, William; Stahr, Frederick; Sarason, Christian; Fruland, Ruth; Oppenheimer, Peter; Lee, Yen-Ling

    2006-01-01

    Considerable research has compared how students learn science from computer simulations with how they learn from "traditional" classes. Little research has compared how students learn science from computer simulations with how they learn from direct experience in the real environment on which the simulations are based. This study compared two…

  7. Effect of Computer Simulations at the Particulate and Macroscopic Levels on Students' Understanding of the Particulate Nature of Matter

    ERIC Educational Resources Information Center

    Tang, Hui; Abraham, Michael R.

    2016-01-01

    Computer-based simulations can help students visualize chemical representations and understand chemistry concepts, but simulations at different levels of representation may vary in effectiveness on student learning. This study investigated the influence of computer activities that simulate chemical reactions at different levels of representation…

  8. Systems approach to the study of stretch and arrhythmias in right ventricular failure induced in rats by monocrotaline

    PubMed Central

    Benoist, David; Stones, Rachel; Benson, Alan P.; Fowler, Ewan D.; Drinkhill, Mark J.; Hardy, Matthew E.L.; Saint, David A.; Cazorla, Olivier; Bernus, Olivier; White, Ed

    2014-01-01

    We demonstrate the synergistic benefits of using multiple technologies to investigate complex multi-scale biological responses. The combination of reductionist and integrative methodologies can reveal novel insights into mechanisms of action by tracking changes of in vivo phenomena to alterations in protein activity (or vice versa). We have applied this approach to electrical and mechanical remodelling in right ventricular failure caused by monocrotaline-induced pulmonary artery hypertension in rats. We show arrhythmogenic T-wave alternans in the ECG of conscious heart failure animals. Optical mapping of isolated hearts revealed discordant action potential duration (APD) alternans. Potential causes of the arrhythmic substrate (structural remodelling and/or steep APD restitution and dispersion) were observed, with specific remodelling of the right ventricular outflow tract. At the myocyte level, [Ca2+]i transient alternans were observed, together with decreased activity and decreased gene and protein expression of the sarcoplasmic reticulum Ca2+-ATPase (SERCA). Computer simulations of the electrical and structural remodelling suggest both contribute to a less stable substrate. Echocardiography was used to estimate increased wall stress in failure, in vivo. Stretch of intact and skinned single myocytes revealed no effect on the Frank-Starling mechanism in failing myocytes. In isolated hearts, acute stretch-induced arrhythmias occurred in all preparations. Significant shortening of the early APD was seen in control but not failing hearts. These observations may be linked to changes in the gene expression of candidate mechanosensitive ion channels (MSCs) TREK-1 and TRPC1/6. Computer simulations incorporating MSCs and changes in ion channels with failure, based on altered gene expression, largely reproduced experimental observations. PMID:25016242

  9. Non-local means denoising of dynamic PET images.

    PubMed

    Dutta, Joyita; Leahy, Richard M; Li, Quanzheng

    2013-01-01

    Dynamic positron emission tomography (PET), which reveals information about both the spatial distribution and temporal kinetics of a radiotracer, enables quantitative interpretation of PET data. Model-based interpretation of dynamic PET images by means of parametric fitting, however, is often a challenging task due to high levels of noise, thus necessitating a denoising step. The objective of this paper is to develop and characterize a denoising framework for dynamic PET based on non-local means (NLM). NLM denoising computes weighted averages of voxel intensities assigning larger weights to voxels that are similar to a given voxel in terms of their local neighborhoods or patches. We introduce three key modifications to tailor the original NLM framework to dynamic PET. Firstly, we derive similarities from less noisy later time points in a typical PET acquisition to denoise the entire time series. Secondly, we use spatiotemporal patches for robust similarity computation. Finally, we use a spatially varying smoothing parameter based on a local variance approximation over each spatiotemporal patch. To assess the performance of our denoising technique, we performed a realistic simulation on a dynamic digital phantom based on the Digimouse atlas. For experimental validation, we denoised [Formula: see text] PET images from a mouse study and a hepatocellular carcinoma patient study. We compared the performance of NLM denoising with four other denoising approaches - Gaussian filtering, PCA, HYPR, and conventional NLM based on spatial patches. The simulation study revealed significant improvement in bias-variance performance achieved using our NLM technique relative to all the other methods. 
The experimental data analysis revealed that our technique leads to clear improvement in contrast-to-noise ratio in Patlak parametric images generated from denoised preclinical and clinical dynamic images, indicating its ability to preserve image contrast and high intensity details while lowering the background noise variance.
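    The patch-weighted averaging at the core of NLM can be sketched in a few lines. The following is a minimal 1-D version for illustration only: the paper's method additionally uses spatiotemporal patches, similarities drawn from the less noisy late time frames, and a spatially varying smoothing parameter.

```python
import numpy as np

def nlm_denoise_1d(signal, patch_radius=1, search_radius=5, h=1.0):
    """Minimal 1-D non-local means: each output sample is a weighted average
    of nearby samples, with weights driven by patch similarity (more similar
    patches receive larger weights)."""
    n = len(signal)
    padded = np.pad(signal, patch_radius, mode="reflect")
    # Extract the patch centred on every sample.
    patches = np.array([padded[i:i + 2 * patch_radius + 1] for i in range(n)])
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - search_radius), min(n, i + search_radius + 1)
        # Squared patch distance to every candidate in the search window.
        d2 = np.sum((patches[lo:hi] - patches[i]) ** 2, axis=1)
        w = np.exp(-d2 / h ** 2)  # similar patches -> larger weights
        out[i] = np.sum(w * signal[lo:hi]) / np.sum(w)
    return out
```

Applied to a noisy flat signal, the weighted averaging lowers the background noise variance, which is the behaviour the comparison study quantifies.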

  10. Computer-based simulation training in emergency medicine designed in the light of malpractice cases.

    PubMed

    Karakuş, Akan; Duran, Latif; Yavuz, Yücel; Altintop, Levent; Calişkan, Fatih

    2014-07-27

    Using computer-based simulation systems in medical education is becoming more and more common. Although the benefits of practicing with these systems in medical education have been demonstrated, the advantages of using computer-based simulation in emergency medicine education are less validated. The aim of the present study was to assess the success rates of final-year medical students in administering emergency medical treatment and to evaluate the effectiveness of computer-based simulation training in improving final-year medical students' knowledge. Twenty-four students trained with computer-based simulation and completed at least 4 hours of simulation-based education between Feb 1, 2010 and May 1, 2010. A control group (traditionally trained, n = 24) was also chosen. At the end of training, students completed an examination covering 5 randomized medical simulation cases. Across the 5 cases, an average of 3.9 correct medical approaches was carried out by the computer-based simulation trained students, versus an average of 2.8 by the traditionally trained group (t = 3.90, p < 0.005). We found that the success of students trained with simulation training in cases requiring a complicated medical approach was statistically higher than that of students who did not receive simulation training (p ≤ 0.05). Computer-based simulation training can be significantly effective in the learning of medical treatment algorithms. These programs may improve the success rate of students, especially in applying an adequate medical approach to complex emergency cases.
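    The group comparison reported above (means of 3.9 vs 2.8 correct approaches, n = 24 per group, t = 3.90) is a standard two-sample t-test. A short sketch with hypothetical per-student scores follows; the raw data and the exact test variant are not given in the abstract, so Welch's t statistic is used here as an assumption:

```python
import numpy as np

# Hypothetical per-student scores (correct approaches out of 5 cases); the
# abstract reports only group means (3.9 vs 2.8, n = 24 each), not raw data.
rng = np.random.default_rng(1)
sim_group = np.clip(np.round(rng.normal(3.9, 1.0, 24)), 0, 5)
trad_group = np.clip(np.round(rng.normal(2.8, 1.0, 24)), 0, 5)

def welch_t(a, b):
    """Welch's two-sample t statistic (equal variances not assumed)."""
    va = a.var(ddof=1) / len(a)
    vb = b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

t_stat = welch_t(sim_group, trad_group)
```

A positive t statistic of this magnitude, referred to the t distribution, yields the kind of small p-value the study reports.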

  11. Permanent bending and alignment of ZnO nanowires.

    PubMed

    Borschel, Christian; Spindler, Susann; Lerose, Damiana; Bochmann, Arne; Christiansen, Silke H; Nietzsche, Sandor; Oertel, Michael; Ronning, Carsten

    2011-05-06

    Ion beams can be used to permanently bend and re-align nanowires after growth. We have irradiated ZnO nanowires with energetic ions, achieving bending and alignment in different directions. Not only the bending of single nanowires is studied in detail, but also the simultaneous alignment of large ensembles of ZnO nanowires. Computer simulations reveal how the bending is initiated by ion beam induced damage. Detailed structural characterization identifies dislocations to relax stresses and make the bending and alignment permanent, even surviving annealing procedures.

  12. Site-specific strong ground motion prediction using 2.5-D modelling

    NASA Astrophysics Data System (ADS)

    Narayan, J. P.

    2001-08-01

    An algorithm was developed using the 2.5-D elastodynamic wave equation, based on the displacement-stress relation. One of the most significant advantages of the 2.5-D simulation is that the 3-D radiation pattern can be generated using double-couple point shear-dislocation sources in the 2-D numerical grid. A parsimonious staggered grid scheme was adopted instead of the standard staggered grid scheme, since this is the only scheme suitable for computing the dislocation. This new 2.5-D numerical modelling avoids the extensive computational cost of 3-D modelling. The significance of this exercise is that it makes it possible to simulate the strong ground motion (SGM), taking into account the energy released, 3-D radiation pattern, path effects and local site conditions at any location around the epicentre. The slowness vector (py) was used in the supersonic region for each layer, so that all the components of the inertia coefficient are positive. The double-couple point shear-dislocation source was implemented in the numerical grid using the moment tensor components as the body-force couples. The moment per unit volume was used in both the 3-D and 2.5-D modelling. A good agreement in the 3-D and 2.5-D responses for different grid sizes was obtained when the moment per unit volume was further reduced by a factor equal to the finite-difference grid size in the case of the 2.5-D modelling. The components of the radiation pattern were computed in the xz-plane using 3-D and 2.5-D algorithms for various focal mechanisms, and the results were in good agreement. A comparative study of the amplitude behaviour of the 3-D and 2.5-D wavefronts in a layered medium reveals the spatial and temporal damped nature of the 2.5-D elastodynamic wave equation. 
3-D and 2.5-D simulated responses at a site using a different strike direction reveal that SGM can be predicted simply by rotating the strike of the fault counter-clockwise by the same amount as the azimuth of the site with respect to the epicentre. This adjustment is necessary since the response is computed keeping the epicentre, focus and the desired site in the same xz-plane, with the x-axis pointing in the north direction.

  13. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience

    PubMed Central

    Stockton, David B.; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175

  14. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  15. Ultrafast spectroscopy reveals subnanosecond peptide conformational dynamics and validates molecular dynamics simulation

    NASA Astrophysics Data System (ADS)

    Spörlein, Sebastian; Carstens, Heiko; Satzger, Helmut; Renner, Christian; Behrendt, Raymond; Moroder, Luis; Tavan, Paul; Zinth, Wolfgang; Wachtveitl, Josef

    2002-06-01

    Femtosecond time-resolved spectroscopy on model peptides with built-in light switches combined with computer simulation of light-triggered motions offers an attractive integrated approach toward the understanding of peptide conformational dynamics. It was applied to monitor the light-induced relaxation dynamics occurring on subnanosecond time scales in a peptide that was backbone-cyclized with an azobenzene derivative as optical switch and spectroscopic probe. The femtosecond spectra permit the clear identification and characterization of the subpicosecond photoisomerization of the chromophore, the subsequent dissipation of vibrational energy, and the subnanosecond conformational relaxation of the peptide. The photochemical cis/trans-isomerization of the chromophore and the resulting peptide relaxations have been simulated with molecular dynamics calculations. The calculated reaction kinetics, as monitored by the energy content of the peptide, were found to match the spectroscopic data. Thus we verify that all-atom molecular dynamics simulations can quantitatively describe the subnanosecond conformational dynamics of peptides, strengthening confidence in corresponding predictions for longer time scales.

  16. CFD simulation of copper(II) extraction with TFA in non-dispersive hollow fiber membrane contactors.

    PubMed

    Muhammad, Amir; Younas, Mohammad; Rezakazemi, Mashallah

    2018-04-01

    This study presents a computational fluid dynamics (CFD) simulation of dispersion-free liquid-liquid extraction of copper(II) with trifluoroacetylacetone (TFA) in a hollow fiber membrane contactor (HFMC). Mass and momentum balance Navier-Stokes equations were coupled to address the transport of copper(II) solute across the membrane contactor. Model equations were simulated using COMSOL Multiphysics™. The simulation was run to study the detailed concentration distribution of copper(II) and to investigate the effects of various parameters, such as membrane characteristics, partition coefficient, and flow configuration, on extraction efficiency. Once-through extraction was found to increase from 10 to 100% when the partition coefficient was raised from 1 to 10. Similarly, the extraction efficiency almost doubled when the porosity-to-tortuosity ratio of the membrane was increased from 0.05 to 0.81. Furthermore, the study revealed that CFD can be used as an effective optimization tool for the development of economical membrane-based dispersion-free extraction processes.

  17. Free-energy simulations reveal molecular mechanism for functional switch of a DNA helicase

    PubMed Central

    Ma, Wen; Whitley, Kevin D; Schulten, Klaus

    2018-01-01

    Helicases play key roles in genome maintenance, yet it remains elusive how these enzymes change conformations and how transitions between different conformational states regulate nucleic acid reshaping. Here, we developed a computational technique combining structural bioinformatics approaches and atomic-level free-energy simulations to characterize how the Escherichia coli DNA repair enzyme UvrD changes its conformation at the fork junction to switch its function from unwinding to rezipping DNA. The lowest free-energy path shows that UvrD opens the interface between two domains, allowing the bound ssDNA to escape. The simulation results predict a key metastable 'tilted' state during ssDNA strand switching. By simulating FRET distributions with fluorophores attached to UvrD, we show that the new state is supported quantitatively by single-molecule measurements. The present study deciphers key elements for the 'hyper-helicase' behavior of a mutant and provides an effective framework to characterize directly structure-function relationships in molecular machines. PMID:29664402

  18. Molecular Dynamics Studies of Self-Assembling Biomolecules and DNA-functionalized Gold Nanoparticles

    NASA Astrophysics Data System (ADS)

    Cho, Vince Y.

    This thesis is organized as follows. In Chapter 2, we use fully atomistic MD simulations to study the conformation of DNA molecules that link gold nanoparticles to form nanoparticle superlattice crystals. In Chapter 3, we study the self-assembly of peptide amphiphiles (PAs) into a cylindrical micelle fiber by using CGMD simulations. Compared to fully atomistic MD simulations, CGMD simulations prove to be computationally cost-efficient and reasonably accurate for exploring self-assembly, and are used in all subsequent chapters. In Chapter 4, we apply CGMD methods to study the self-assembly of small molecule-DNA hybrid (SMDH) building blocks into well-defined cage-like dimers, and reveal the role of kinetics and thermodynamics in this process. In Chapter 5, we extend the CGMD model for this system and find that the assembly of SMDHs can be fine-tuned by changing parameters. In Chapter 6, we explore superlattice crystal structures of DNA-functionalized gold nanoparticles (DNA-AuNP) with the CGMD model and compare the hybridization.

  19. Free-energy simulations reveal molecular mechanism for functional switch of a DNA helicase.

    PubMed

    Ma, Wen; Whitley, Kevin D; Chemla, Yann R; Luthey-Schulten, Zaida; Schulten, Klaus

    2018-04-17

    Helicases play key roles in genome maintenance, yet it remains elusive how these enzymes change conformations and how transitions between different conformational states regulate nucleic acid reshaping. Here, we developed a computational technique combining structural bioinformatics approaches and atomic-level free-energy simulations to characterize how the Escherichia coli DNA repair enzyme UvrD changes its conformation at the fork junction to switch its function from unwinding to rezipping DNA. The lowest free-energy path shows that UvrD opens the interface between two domains, allowing the bound ssDNA to escape. The simulation results predict a key metastable 'tilted' state during ssDNA strand switching. By simulating FRET distributions with fluorophores attached to UvrD, we show that the new state is supported quantitatively by single-molecule measurements. The present study deciphers key elements for the 'hyper-helicase' behavior of a mutant and provides an effective framework to characterize directly structure-function relationships in molecular machines. © 2018, Ma et al.

  20. Expanding the View of Proton Pumping in Cytochrome c Oxidase through Computer Simulation

    PubMed Central

    Peng, Yuxing; Voth, Gregory A.

    2011-01-01

    In cytochrome c oxidase (CcO), a redox-driven proton pump, protons are transported by the Grotthuss shuttling via hydrogen-bonded water molecules and protonatable residues. Proton transport through the D-pathway is a complicated process that is highly sensitive to alterations in the amino acids or the solvation structure in the channel, both of which can inhibit proton pumping and enzymatic activity. Simulations of proton transport in the hydrophobic cavity showed a clear redox state dependence. To study the mechanism of proton pumping in CcO, multi-state empirical valence bond (MS-EVB) simulations have been conducted, focusing on the proton transport through the D-pathway and the hydrophobic cavity next to the binuclear center. The hydration structures, transport pathways, effects of residues, and free energy surfaces of proton transport were revealed in these MS-EVB simulations. The mechanistic insight gained from them is herein reviewed and placed in context for future studies. PMID:22178790

  1. An infectious way to teach students about outbreaks.

    PubMed

    Cremin, Íde; Watson, Oliver; Heffernan, Alastair; Imai, Natsuko; Ahmed, Norin; Bivegete, Sandra; Kimani, Teresia; Kyriacou, Demetris; Mahadevan, Preveina; Mustafa, Rima; Pagoni, Panagiota; Sophiea, Marisa; Whittaker, Charlie; Beacroft, Leo; Riley, Steven; Fisher, Matthew C

    2018-06-01

    The study of infectious disease outbreaks is required to train today's epidemiologists. A typical way to introduce and explain key epidemiological concepts is through the analysis of a historical outbreak. There are, however, few training options that explicitly utilise real-time simulated stochastic outbreaks where the participants themselves comprise the dataset they subsequently analyse. In this paper, we present a teaching exercise in which an infectious disease outbreak is simulated over a five-day period and subsequently analysed. We iteratively developed the teaching exercise to offer additional insight into analysing an outbreak. An R package for visualisation, analysis and simulation of the outbreak data was developed to accompany the practical to reinforce learning outcomes. Computer simulations of the outbreak revealed deviations from observed dynamics, highlighting how simplifying assumptions conventionally made in mathematical models often differ from reality. Here we provide a pedagogical tool for others to use and adapt in their own settings. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
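    A stochastic outbreak of the kind the exercise simulates can be sketched as a chain-binomial (Reed-Frost style) process run day by day. This is an illustrative model only; the exercise's actual transmission rules and the accompanying R package are not specified in the abstract:

```python
import random

def simulate_outbreak(n=30, p_transmit=0.06, days=5, seed=42):
    """Chain-binomial outbreak over a fixed number of days. Each day, every
    susceptible escapes infection from each infectious person independently
    with probability 1 - p_transmit. All parameter values are hypothetical."""
    rng = random.Random(seed)
    s, i, r = n - 1, 1, 0                 # one index case in a class of n
    history = [(s, i, r)]
    for _ in range(days):
        p_infected = 1.0 - (1.0 - p_transmit) ** i
        new_cases = sum(rng.random() < p_infected for _ in range(s))
        s, i, r = s - new_cases, new_cases, r + i
        history.append((s, i, r))
    return history
```

The returned daily (S, I, R) counts play the role of the dataset that participants generate and subsequently analyse, and runs of the model can be compared against observed dynamics to expose the simplifying assumptions the abstract mentions.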

  2. Methodology of modeling and measuring computer architectures for plasma simulations

    NASA Technical Reports Server (NTRS)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties encountered on currently available computers is given. Through the use of an analysis and measurement methodology, SARA, the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization can be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  3. A Computer-Based Simulation of an Acid-Base Titration

    ERIC Educational Resources Information Center

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

  4. Using high-performance networks to enable computational aerosciences applications

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1992-01-01

    One component of the U.S. Federal High Performance Computing and Communications Program (HPCCP) is the establishment of a gigabit network to provide a communications infrastructure for researchers across the nation. This gigabit network will provide new services and capabilities, in addition to increased bandwidth, to enable future applications. An understanding of these applications is necessary to guide the development of the gigabit network and other high-performance networks of the future. In this paper we focus on computational aerosciences applications run remotely using the Numerical Aerodynamic Simulation (NAS) facility located at NASA Ames Research Center. We characterize these applications in terms of network-related parameters and relate user experiences that reveal limitations imposed by the current wide-area networking infrastructure. Then we investigate how the development of a nationwide gigabit network would enable users of the NAS facility to work in new, more productive ways.

  5. Black Hole Mergers and Gravitational Waves: Opening the New Frontier

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2012-01-01

    The final merger of two black holes produces a powerful burst of gravitational waves, emitting more energy than all the stars in the observable universe combined. Since these mergers take place in the regime of strong dynamical gravity, computing the gravitational waveforms requires solving the full Einstein equations of general relativity on a computer. For more than 30 years, scientists tried to simulate these mergers using the methods of numerical relativity. The resulting computer codes were plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. In the past several years, this situation has changed dramatically, with a series of remarkable breakthroughs. This talk will highlight these breakthroughs and the resulting 'gold rush' of new results that is revealing the dynamics of binary black hole mergers, and their applications in gravitational wave detection, testing general relativity, and astrophysics.

  6. Effect of brushing and thermocycling on the shade and surface roughness of CAD-CAM ceramic restorations.

    PubMed

    Yuan, Judy Chia-Chun; Barão, Valentim Adelino Ricardo; Wee, Alvin G; Alfaro, Maria F; Afshari, Fatemeh S; Sukotjo, Cortino

    2017-09-29

    The effects of toothbrushing (B) and thermocycling (TC) on the surface texture of different materials with various fabrication processes have been investigated. However, studies of computer-aided design and computer-aided manufacturing (CAD-CAM) ceramic restorations are limited. The purpose of this in vitro study was to evaluate the effect of B and TC on the color stability and surface roughness of extrinsically characterized and glazed CAD-CAM ceramic restorations. Lithium disilicate CAD ceramic (n=90) and zirconia ceramic (n=90) were studied. All specimens were crystallized/sintered, characterized, and glazed following the manufacturer's recommendation. The specimens were divided into 9 different groups: B, TC, and a combination of B plus TC (B+TC). Brushing was performed at 50 000, 100 000, and 150 000 cycles, simulating an oral environment of 5, 10, and 15 years. Thermocycling was performed at 6000, 12 000, and 18 000 cycles, simulating an oral environment of 5, 10, and 15 years. Brushing plus TC was performed with the combination of the 50 000 cycles of B, then 6000 cycles of TC, and 10 000 cycles of B, then 12 000 cycles of TC, and 15 000 cycles of B, then 18 000 cycles of TC. The color and surface roughness of each specimen were measured before and after all interventions with simulated cycles. Color differences (ΔE) and surface roughness (ΔR a ) data were analyzed using 2-way ANOVA, followed by the least significant difference test (α=.05). The correlation between ΔE and ΔR a was statistically analyzed using the Pearson correlation analysis. Within the lithium disilicate CAD groups, intervention did not result in any significant differences in color change (P>.05). Within the zirconia groups, a 15-year clinical simulation revealed significantly higher ΔE values than a simulated 5-year exposure (P=.017). Increased simulated cycles showed significantly higher R a values for all groups. 
Within the zirconia groups, B revealed significantly smoother surfaces than the TC (P<.001) and B+TC (P<.001) interventions. For the zirconia, simulating B+TC for 15 years revealed significantly higher Ra values than B+TC for 5 years (P<.001) and B+TC for 10 years (P=.003). No correlation (lithium disilicate CAD, r=.079, P=.462; zirconia, r=.001, P=.989) was found between the color change and surface roughness. For both lithium disilicate CAD and zirconia, color changes were below the selected clinically perceptible threshold (ΔE=2.6) after all interventions and simulated cycles. All mean surface roughness measurements were below 0.2 μm. Generally, the surfaces of both lithium disilicate CAD and zirconia became rougher. No correlation was found between color difference and surface roughness for either material. Published by Elsevier Inc.

  7. Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmalz, Mark S

    2011-07-24

Statement of Problem - The Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G′ for a high-performance architecture. Key computational and data-movement kernels of the application were analyzed and optimized for parallel execution using the mapping G → G′, which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed and profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy.
Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in the solution of problems related to efficient parallel computation of particle and fluid dynamics simulations. These problems occur throughout the DOE, military, and commercial sectors; the potential payoff is high. We plan to license or sell the solution to contractors for military and domestic applications such as disaster simulation (aerodynamic and hydrodynamic), Government agencies (hydrological and environmental simulations), and medical applications (e.g., tomographic image reconstruction). Keywords - High-performance Computing, Graphics Processing Unit, Fluid/Particle Simulation. Summary for Members of Congress - The Department of Energy has many simulation codes that must compute faster to be effective. The Phase I research parallelized particle/fluid simulations for rocket combustion for high-performance computing systems.

  8. Implementation of Parallel Dynamic Simulation on Shared-Memory vs. Distributed-Memory Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Chen, Yousu; Wu, Di

    2015-12-09

Power system dynamic simulation computes the system response to a sequence of large disturbances, such as sudden changes in generation or load, or a network short circuit followed by a protective branch-switching operation. It consists of a large set of differential and algebraic equations, which is computationally intensive and challenging to solve using a single-processor-based dynamic simulation solution. High-performance computing (HPC) based parallel computing is a very promising technology for speeding up the computation and facilitating the simulation process. This paper presents two different parallel implementations of power grid dynamic simulation using Open Multi-Processing (OpenMP) on a shared-memory platform and the Message Passing Interface (MPI) on distributed-memory clusters, respectively. The differences between the parallel simulation algorithms and architectures of the two HPC technologies are illustrated, and their performance for running parallel dynamic simulation is compared and demonstrated.

  9. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    PubMed

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on demand from a third party, is a new approach for high-performance computing and is implemented here to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results are aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by the single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed-up, cloud computing builds a layer of abstraction for high-performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
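The reported timings imply near-ideal strong scaling; the arithmetic can be sketched as follows (timings taken from the abstract; the fixed-overhead scaling model T(n) = T1/n + t0 is an assumption, not the paper's):

```python
# Timings reported in the abstract for 1 million electrons.
serial_h = 2.58   # hours on a single local computer
cloud_min = 3.3   # minutes on a 100-node cloud cluster

speedup = serial_h * 60 / cloud_min
print(round(speedup))  # matches the reported 47x speed-up

# Assumed model: time scales inversely with node count plus a fixed
# overhead t0, i.e. T(n) = T1/n + t0; the measured point implies:
T1 = serial_h * 60         # serial time in minutes
t0 = cloud_min - T1 / 100  # parallelization overhead in minutes
print(round(t0, 2))        # small, consistent with "negligible overhead"
```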

  10. Computational Acoustic Beamforming for Noise Source Identification for Small Wind Turbines

    PubMed Central

    Lien, Fue-Sang

    2017-01-01

    This paper develops a computational acoustic beamforming (CAB) methodology for identification of sources of small wind turbine noise. This methodology is validated using the case of the NACA 0012 airfoil trailing edge noise. For this validation case, the predicted acoustic maps were in excellent conformance with the results of the measurements obtained from the acoustic beamforming experiment. Following this validation study, the CAB methodology was applied to the identification of noise sources generated by a commercial small wind turbine. The simulated acoustic maps revealed that the blade tower interaction and the wind turbine nacelle were the two primary mechanisms for sound generation for this small wind turbine at frequencies between 100 and 630 Hz. PMID:28378012

  11. Merging Black Holes

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2012-01-01

The final merger of two black holes is expected to be the strongest source of gravitational waves for both ground-based detectors such as LIGO and VIRGO, as well as future space-based detectors. Since the merger takes place in the regime of strong dynamical gravity, computing the resulting gravitational waveforms requires solving the full Einstein equations of general relativity on a computer. For many years, numerical codes designed to simulate black hole mergers were plagued by a host of instabilities. However, recent breakthroughs have conquered these instabilities and opened up this field dramatically. This talk will focus on the resulting 'gold rush' of new results that is revealing the dynamics and waveforms of binary black hole mergers, and their applications in gravitational wave detection, testing general relativity, and astrophysics.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

This journal contains 7 articles pertaining to astrophysics. The first article is an overview of the other 6 articles and also a tribute to Jim Wilson and his work in the fields of general relativity and numerical astrophysics. The six articles are on the following subjects: (1) computer simulations of black hole accretion; (2) calculations on the collapse of the iron core of a massive star; (3) stellar-collapse models which reveal a possible site for nucleosynthesis of elements heavier than iron; (4) modeling sources for gravitational radiation; (5) the development of a computer program for finite-difference mesh calculations and its applications to astrophysics; (6) how the existence of neutrinos with nonzero rest mass is used to explain the universe. Abstracts of each of the articles were prepared separately. (SC)

  13. Merging Black Holes

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2010-01-01

The final merger of two black holes is expected to be the strongest source of gravitational waves for both ground-based detectors such as LIGO and VIRGO, as well as the space-based LISA. Since the merger takes place in the regime of strong dynamical gravity, computing the resulting gravitational waveforms requires solving the full Einstein equations of general relativity on a computer. For many years, numerical codes designed to simulate black hole mergers were plagued by a host of instabilities. However, recent breakthroughs have conquered these instabilities and opened up this field dramatically. This talk will focus on the resulting gold rush of new results that are revealing the dynamics and waveforms of binary black hole mergers, and their applications in gravitational wave detection, testing general relativity, and astrophysics.

  15. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.

    PubMed

    Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui

    2017-01-08

Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, making this a comprehensively data-intensive and computing-intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation in raw data simulation, a pattern that greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward covering the programming model, HDFS configuration, and scheduling. The experimental results show that the cloud computing based algorithm achieves a 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy improves performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing-intensive and data-intensive issues in SAR raw data simulation, and is easily extended to large-scale computing to achieve higher acceleration.
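The MapReduce pattern described above, keyed accumulation of partial contributions, can be sketched in miniature; this is a pure-Python toy stand-in (the actual system runs on Hadoop with HDFS, and the target and range-bin data here are hypothetical):

```python
from functools import reduce
from itertools import chain

# Each "map" task turns one target scatterer into (range-bin, contribution)
# pairs, a toy stand-in for the per-target SAR echo computation.
targets = [(0, 1.0), (1, 0.5), (0, 0.25), (2, 2.0)]

def map_phase(target):
    bin_id, amplitude = target
    return [(bin_id, amplitude)]

# Shuffle: flatten the intermediate key-value pairs from all map tasks.
pairs = chain.from_iterable(map(map_phase, targets))

def reduce_phase(acc, pair):
    key, value = pair
    acc[key] = acc.get(key, 0.0) + value  # irregular accumulation by key
    return acc

raw_data = reduce(reduce_phase, pairs, {})
print(raw_data)  # {0: 1.25, 1: 0.5, 2: 2.0}
```

The irregular, key-dependent accumulation in the reduce phase is exactly the access pattern that MapReduce handles naturally but that serializes poorly on a GPU.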

  16. Computing Cosmic Cataclysms

    NASA Technical Reports Server (NTRS)

    Centrella, Joan M.

    2010-01-01

    The final merger of two black holes releases a tremendous amount of energy, more than the combined light from all the stars in the visible universe. This energy is emitted in the form of gravitational waves, and observing these sources with gravitational wave detectors requires that we know the pattern or fingerprint of the radiation emitted. Since black hole mergers take place in regions of extreme gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these wave patterns. For more than 30 years, scientists have tried to compute these wave patterns. However, their computer codes have been plagued by problems that caused them to crash. This situation has changed dramatically in the past few years, with a series of amazing breakthroughs. This talk will take you on this quest for these gravitational wave patterns, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed.

  17. Computational Fluid Dynamics Demonstration of Rigid Bodies in Motion

    NASA Technical Reports Server (NTRS)

    Camarena, Ernesto; Vu, Bruce T.

    2011-01-01

The Design Analysis Branch (NE-Ml) at the Kennedy Space Center has not had the ability to accurately couple Rigid Body Dynamics (RBD) and Computational Fluid Dynamics (CFD). OVERFLOW-D is a flow solver that has been developed by NASA to have the capability to analyze and simulate dynamic motions with up to six Degrees of Freedom (6-DOF). Two simulations were prepared over the course of the internship to demonstrate 6-DOF motion of rigid bodies under aerodynamic loading. The geometries in the simulations were based on a conceptual Space Launch System (SLS). The first simulation that was prepared and computed was the motion of a Solid Rocket Booster (SRB) as it separates from its core stage. To reduce computational time during the development of the simulation, only half of the physical domain with respect to the symmetry plane was simulated. Then a full solution was prepared and computed. The second simulation was a model of the SLS as it departs from a launch pad under a 20 knot crosswind. This simulation was reduced to Two Dimensions (2D) to reduce both preparation and computation time. By allowing 2-DOF for translations and 1-DOF for rotation, the simulation predicted unrealistic rotation. The simulation was then constrained to only allow translations.

  18. Computational Design of Graphene Nanoscrolls

    NASA Astrophysics Data System (ADS)

    Bejagam, Karteek; Singh, Samrendra; Deshmukh, Sanket; Deshmukh Group Team; Samrendra Group Collaboration

Graphene nanoscrolls have attracted significant interest in recent years due to their potential applications in tribology, nanotechnology, and bioengineering. For example, it has recently been shown that graphene nanoscrolls can be used to achieve superlubricity, an almost-zero-friction state. In the present study we employ metal and non-metal nanoparticles to facilitate graphene nanoscroll formation. We have conducted reactive molecular dynamics (RMD) simulations of diamond, nickel, and gold nanoparticles placed on a 2D graphene sheet. RMD simulations reveal the mechanisms that facilitate or prohibit graphene nanoscroll formation. Our simulations suggest that the surface chemistry and the interactions between nanoparticles and graphene play a crucial role in determining the mechanism of scroll formation and the nature of the nanoscroll. We also find that the type of nanoparticle has a strong influence on the elastic and mechanical properties of the nanoscroll. Our study provides a systematic pathway to design graphene nanoscrolls with a wide range of properties.

  19. Vibrational, spectroscopic, molecular docking and density functional theory studies on N-(5-aminopyridin-2-yl)acetamide

    NASA Astrophysics Data System (ADS)

    Asath, R. Mohamed; Rekha, T. N.; Premkumar, S.; Mathavan, T.; Benial, A. Milton Franklin

    2016-12-01

Conformational analysis was carried out for the N-(5-aminopyridin-2-yl)acetamide (APA) molecule. The most stable, optimized structure was predicted by density functional theory calculations using the B3LYP functional with the cc-pVQZ basis set. The optimized structural parameters and vibrational frequencies were calculated. The experimental and theoretical vibrational frequencies were assigned and compared. The ultraviolet-visible spectrum was simulated and validated experimentally. The molecular electrostatic potential surface was simulated. Frontier molecular orbitals and related molecular properties were computed, which reveal the high molecular reactivity and stability of the APA molecule; the density of states spectrum was also simulated. The natural bond orbital analysis was performed to confirm the bioactivity of the APA molecule. Antidiabetic activity was studied based on molecular docking analysis, and the APA molecule was identified as a potentially good inhibitor against diabetic nephropathy.

  20. Worm Algorithm simulations of the hole dynamics in the t-J model

    NASA Astrophysics Data System (ADS)

    Prokof'ev, Nikolai; Ruebenacker, Oliver

    2001-03-01

In the limit of small J << t, relevant for HTSC materials and Mott-Hubbard systems, computer simulations have to be performed for large systems and at low temperatures. Despite convincing evidence against spin-charge separation obtained by various methods for J > 0.4t, there is an ongoing argument that at smaller J spin-charge separation is still possible. Worm algorithm Monte Carlo simulations of the hole Green function for 0.1 < J/t < 0.4 were performed on lattices with up to 32x32 sites, and at temperature J/T = 40 (for the largest size). Spectral analysis reveals a single, delta-function-sharp quasiparticle peak at the lowest edge of the spectrum and two distinct peaks above it at all studied J. We rule out the possibility of spin-charge separation in this parameter range, and present what is, apparently, the hole spectral function in the thermodynamic limit.

  1. Internal force corrections with machine learning for quantum mechanics/molecular mechanics simulations.

    PubMed

    Wu, Jingheng; Shen, Lin; Yang, Weitao

    2017-10-28

Ab initio quantum mechanics/molecular mechanics (QM/MM) molecular dynamics simulation is a useful tool for calculating thermodynamic properties such as the potential of mean force for chemical reactions, but it is intensely time-consuming. In this paper, we developed a new method using an internal force correction for low-level semiempirical QM/MM molecular dynamics samplings with a predefined reaction coordinate. As a correction term, the internal force was predicted with a machine learning scheme, which provides a sophisticated force field, and added to the atomic forces on the reaction-coordinate-related atoms at each integration step. We applied this method to two reactions in aqueous solution and reproduced potentials of mean force at the ab initio QM/MM level. The saving in computational cost is about 2 orders of magnitude. The present work reveals great potential for machine learning in QM/MM simulations to study complex chemical processes.

  2. Dosimetry in MARS spectral CT: TOPAS Monte Carlo simulations and ion chamber measurements.

    PubMed

    Lu, Gray; Marsh, Steven; Damet, Jerome; Carbonez, Pierre; Laban, John; Bateman, Christopher; Butler, Anthony; Butler, Phil

    2017-06-01

Spectral computed tomography (CT) is an up-and-coming imaging modality which shows great promise in revealing unique diagnostic information. Because this imaging modality is based on X-ray CT, it is of utmost importance to study the radiation dose aspects of its use. This study reports on the implementation and evaluation of a Monte Carlo simulation tool using TOPAS for estimating dose in a pre-clinical spectral CT scanner known as the MARS scanner. Simulated estimates were compared with measurements from an ionization chamber. For a typical MARS scan, TOPAS estimated a CT dose index (CTDI) of 29.7 mGy for a 30 mm diameter cylindrical phantom; the CTDI measured by ion chamber was within 3% of the TOPAS estimate. Although further development is required, our investigation of TOPAS for estimating MARS scan dosimetry has shown its potential for further study of spectral scanning protocols and dose to scanned objects.

  3. Molecular dynamics simulation of premelting and melting phase transitions in stoichiometric uranium dioxide

    NASA Astrophysics Data System (ADS)

    Yakub, Eugene; Ronchi, Claudio; Staicu, Dragos

    2007-09-01

Results of molecular dynamics (MD) simulation of UO2 over a wide temperature range are presented and discussed. A new approach to the calibration of a partly ionic Busing-Ida-type model is proposed. A potential parameter set is obtained that reproduces the experimental density of solid UO2 over a wide range of temperatures. A conventional simulation of high-temperature stoichiometric UO2 on large MD cells, based on a novel fast method of computing Coulomb forces, reveals characteristic features of a premelting λ transition at a temperature near that experimentally observed (Tλ = 2670 K). A strong deviation from Arrhenius behavior of the oxygen self-diffusion coefficient was found in the vicinity of the transition point. Predictions for liquid UO2, based on the same potential parameter set, are in good agreement with existing experimental data and theoretical calculations.
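The Arrhenius behavior referred to above is D(T) = D0·exp(-Ea/(kB·T)); a minimal sketch of the baseline against which the reported deviation is measured (the D0 and Ea values here are hypothetical placeholders, not the paper's fitted parameters):

```python
import math

KB = 8.617333262e-5  # Boltzmann constant in eV/K

def arrhenius(T, d0=1.0e-2, ea=2.5):
    """Arrhenius self-diffusion coefficient D(T) = D0 * exp(-Ea / (kB*T)).

    d0 (cm^2/s) and ea (eV) are hypothetical placeholder values.
    """
    return d0 * math.exp(-ea / (KB * T))

# On an Arrhenius plot, ln D is linear in 1/T; a premelting transition
# shows up as simulated D values departing from this straight line.
for T in (2000, 2400, 2670):  # Tλ ≈ 2670 K in the simulations
    print(T, arrhenius(T))
```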

  4. Unraveling the Geometry Dependence of In-Nozzle Cavitation in High-Pressure Injectors

    PubMed Central

    Im, Kyoung-Su; Cheong, Seong-Kyun; Powell, Christopher F.; Lai, Ming-chia D.; Wang, Jin

    2013-01-01

    Cavitation is an intricate multiphase phenomenon that interplays with turbulence in fluid flows. It exhibits clear duality in characteristics, being both destructive and beneficial in our daily lives and industrial processes. Despite the multitude of occurrences of this phenomenon, highly dynamic and multiphase cavitating flows have not been fundamentally well understood in guiding the effort to harness the transient and localized power generated by this process. In a microscale, multiphase flow liquid injection system, we synergistically combined experiments using time-resolved x-radiography and a novel simulation method to reveal the relationship between the injector geometry and the in-nozzle cavitation quantitatively. We demonstrate that a slight alteration of the geometry on the micrometer scale can induce distinct laminar-like or cavitating flows, validating the multiphase computational fluid dynamics simulation. Furthermore, the simulation identifies a critical geometric parameter with which the high-speed flow undergoes an intriguing transition from non-cavitating to cavitating. PMID:23797665

  5. Arrays of individually controlled ions suitable for two-dimensional quantum simulations

    PubMed Central

    Mielenz, Manuel; Kalis, Henning; Wittemer, Matthias; Hakelberg, Frederick; Warring, Ulrich; Schmied, Roman; Blain, Matthew; Maunz, Peter; Moehring, David L.; Leibfried, Dietrich; Schaetz, Tobias

    2016-01-01

    A precisely controlled quantum system may reveal a fundamental understanding of another, less accessible system of interest. A universal quantum computer is currently out of reach, but an analogue quantum simulator that makes relevant observables, interactions and states of a quantum model accessible could permit insight into complex dynamics. Several platforms have been suggested and proof-of-principle experiments have been conducted. Here, we operate two-dimensional arrays of three trapped ions in individually controlled harmonic wells forming equilateral triangles with side lengths 40 and 80 μm. In our approach, which is scalable to arbitrary two-dimensional lattices, we demonstrate individual control of the electronic and motional degrees of freedom, preparation of a fiducial initial state with ion motion close to the ground state, as well as a tuning of couplings between ions within experimental sequences. Our work paves the way towards a quantum simulator of two-dimensional systems designed at will. PMID:27291425

  6. Direct numerical simulation of the laminar-turbulent transition at hypersonic flow speeds on a supercomputer

    NASA Astrophysics Data System (ADS)

    Egorov, I. V.; Novikov, A. V.; Fedorov, A. V.

    2017-08-01

A method for direct numerical simulation of three-dimensional unsteady disturbances leading to a laminar-turbulent transition at hypersonic flow speeds is proposed. The simulation relies on solving the full three-dimensional unsteady Navier-Stokes equations. The computational technique is intended for multiprocessor supercomputers and is based on a fully implicit monotone approximation scheme and the Newton-Raphson method for solving systems of nonlinear difference equations. This approach is used to study the development of three-dimensional unstable disturbances in flat-plate and compression-corner boundary layers in early laminar-turbulent transition stages at the free-stream Mach number M = 5.37. The three-dimensional disturbance field is visualized in order to reveal and discuss features of the instability development at the linear and nonlinear stages. The distribution of the skin friction coefficient is used to detect laminar and transient flow regimes and determine the onset of the laminar-turbulent transition.
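The fully implicit scheme mentioned above requires solving a system of nonlinear difference equations at every time step with the Newton-Raphson method; a minimal sketch on a scalar model residual (the actual solver operates on the full discretized Navier-Stokes system):

```python
import math

def newton(f, dfdx, x0, tol=1e-12, max_iter=50):
    """Newton-Raphson: iterate x_{k+1} = x_k - f(x_k)/f'(x_k) to a root."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# Scalar model residual of the kind an implicit time step produces:
# solve x - 0.1*sin(x) - 1 = 0 for the new-time-level unknown x.
root = newton(lambda x: x - 0.1 * math.sin(x) - 1.0,
              lambda x: 1.0 - 0.1 * math.cos(x),
              x0=1.0)
print(round(root, 6))
```

In the full solver the scalar derivative becomes a Jacobian matrix and each Newton step is a large sparse linear solve, which is what makes the multiprocessor implementation essential.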

  7. Study of ceramic products and processing techniques in space. [using computerized simulation

    NASA Technical Reports Server (NTRS)

    Markworth, A. J.; Oldfield, W.

    1974-01-01

    An analysis of the solidification kinetics of beta alumina in a zero-gravity environment was carried out, using computer-simulation techniques, in order to assess the feasibility of producing high-quality single crystals of this material in space. The two coupled transport processes included were movement of the solid-liquid interface and diffusion of sodium atoms in the melt. Results of the simulation indicate that appreciable crystal-growth rates can be attained in space. Considerations were also made of the advantages offered by high-quality single crystals of beta alumina for use as a solid electrolyte; these clearly indicate that space-grown materials are superior in many respects to analogous terrestrially-grown crystals. Likewise, economic considerations, based on the rapidly expanding technological applications for beta alumina and related fast ionic conductors, reveal that the many superior qualities of space-grown material justify the added expense and experimental detail associated with space processing.

  8. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  9. Simulation Accelerator

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under a NASA SBIR (Small Business Innovative Research) contract, (NAS5-30905), EAI Simulation Associates, Inc., developed a new digital simulation computer, Starlight(tm). With an architecture based on the analog model of computation, Starlight(tm) outperforms all other computers on a wide range of continuous system simulation. This system is used in a variety of applications, including aerospace, automotive, electric power and chemical reactors.

  10. Using Computational Simulations to Confront Students' Mental Models

    ERIC Educational Resources Information Center

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  11. The Role of Computer Simulation in an Inquiry-Based Learning Environment: Reconstructing Geological Events as Geologists

    ERIC Educational Resources Information Center

    Lin, Li-Fen; Hsu, Ying-Shao; Yeh, Yi-Fen

    2012-01-01

    Several researchers have investigated the effects of computer simulations on students' learning. However, few have focused on how simulations with authentic contexts influences students' inquiry skills. Therefore, for the purposes of this study, we developed a computer simulation (FossilSim) embedded in an authentic inquiry lesson. FossilSim…

  12. Computer-generated forces in distributed interactive simulation

    NASA Astrophysics Data System (ADS)

    Petty, Mikel D.

    1995-04-01

    Distributed Interactive Simulation (DIS) is an architecture for building large-scale simulation models from a set of independent simulator nodes communicating via a common network protocol. DIS is most often used to create a simulated battlefield for military training. Computer Generated Forces (CGF) systems control large numbers of autonomous battlefield entities in a DIS simulation using computer equipment and software rather than humans in simulators. CGF entities serve as both enemy forces and supplemental friendly forces in a DIS exercise. Research into various aspects of CGF systems is ongoing. Several CGF systems have been implemented.

  13. Organization and Dynamics of Receptor Proteins in a Plasma Membrane.

    PubMed

    Koldsø, Heidi; Sansom, Mark S P

    2015-11-25

    The interactions of membrane proteins are influenced by their lipid environment, with key lipid species able to regulate membrane protein function. Advances in high-resolution microscopy can reveal the organization and dynamics of proteins and lipids within living cells at resolutions <200 nm. Parallel advances in molecular simulations provide near-atomic-resolution models of the dynamics of the organization of membranes of in vivo-like complexity. We explore the dynamics of proteins and lipids in crowded and complex plasma membrane models, thereby closing the gap in length and complexity between computations and experiments. Our simulations provide insights into the mutual interplay between lipids and proteins in determining mesoscale (20-100 nm) fluctuations of the bilayer, and in enabling oligomerization and clustering of membrane proteins.

  14. Statistical mechanics of a cat's cradle

    NASA Astrophysics Data System (ADS)

    Shen, Tongye; Wolynes, Peter G.

    2006-11-01

    It is believed that, much like a cat's cradle, the cytoskeleton can be thought of as a network of strings under tension. We show that both regular and random bond-disordered networks having bonds that buckle upon compression exhibit a variety of phase transitions as a function of temperature and extension. The results of self-consistent phonon calculations for the regular networks agree very well with computer simulations at finite temperature. The analytic theory also yields a rigidity onset (mechanical percolation) and the fraction of extended bonds for random networks. There is very good agreement with the simulations by Delaney et al (2005 Europhys. Lett. 72 990). The mean field theory reveals a nontranslationally invariant phase with self-generated heterogeneity of tautness, representing 'antiferroelasticity'.

  15. Dynamic transition in the structure of an energetic crystal during chemical reactions at shock front prior to detonation.

    PubMed

    Nomura, Ken-Ichi; Kalia, Rajiv K; Nakano, Aiichiro; Vashishta, Priya; van Duin, Adri C T; Goddard, William A

    2007-10-05

    Mechanical stimuli in energetic materials initiate chemical reactions at shock fronts prior to detonation. Shock sensitivity measurements provide widely varying results, and quantum-mechanical calculations are unable to handle systems large enough to describe shock structure. Recent developments in reactive force-field molecular dynamics (ReaxFF-MD) combined with advances in parallel computing have paved the way to accurately simulate reaction pathways along with the structure of shock fronts. Our multimillion-atom ReaxFF-MD simulations of 1,3,5-trinitro-1,3,5-triazine (RDX) reveal that detonation is preceded by a transition from a diffuse shock front with well-ordered molecular dipoles behind it to a disordered dipole distribution behind a sharp front.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bischoff, A. J., E-mail: alina.bischoff@iom-leipzig.de; Arabi-Hashemi, A.; Ehrhardt, M.

    Combining experimental methods and classical molecular dynamics (MD) computer simulations, we explore the martensitic transformation in Fe70Pd30 ferromagnetic shape memory alloy thin films induced by laser shock peening. X-ray diffraction and scanning electron microscope measurements at shock wave pressures of up to 2.5 GPa reveal the formation of martensitic variants with a preferred orientation of the shorter c-axis of the tetragonal unit cell perpendicular to the surface plane. Moreover, consequential merging of growth islands on the film surface is observed. MD simulations unveil the underlying physics, characterized by an austenite-martensite transformation with preferential alignment of the c-axis along the propagation direction of the shock wave, resulting in flattening and in-plane expansion of surface features.

  17. A virtual surgical training system that simulates cutting of soft tissue using a modified pre-computed elastic model.

    PubMed

    Toe, Kyaw Kyar; Huang, Weimin; Yang, Tao; Duan, Yuping; Zhou, Jiayin; Su, Yi; Teo, Soo-Kng; Kumar, Selvaraj Senthil; Lim, Calvin Chi-Wan; Chui, Chee Kong; Chang, Stephen

    2015-08-01

    This work presents a surgical training system that incorporates a cutting operation on soft tissue, simulated based on a modified pre-computed linear elastic model in the Simulation Open Framework Architecture (SOFA) environment. A pre-computed linear elastic model used for the simulation of soft tissue deformation involves computing the compliance matrix a priori based on the topological information of the mesh. While this process may require a few minutes to several hours, depending on the number of vertices in the mesh, it needs to be computed only once and allows real-time computation of the subsequent soft tissue deformation. However, as the compliance matrix is based on the initial topology of the mesh, it does not allow any topological changes during simulation, such as cutting or tearing of the mesh. This work proposes a way to modify the pre-computed data by correcting the topological connectivity in the compliance matrix, without re-computing the compliance matrix, which is computationally expensive.
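    The paper's exact correction scheme is not given above; one standard way to update a pre-computed compliance (inverse stiffness) matrix after a topological change, without re-inverting, is a Sherman-Morrison rank-one update. The sketch below is a toy illustration under that assumption, using a two-node, three-spring chain; all names and values are hypothetical.

```python
# Toy sketch (not the paper's method): update a pre-computed compliance matrix
# C = K^{-1} after "cutting" a spring, via the Sherman-Morrison formula,
# instead of re-inverting the stiffness matrix K from scratch.
# System: two nodes between two walls, three unit springs (wall-1, 1-2, 2-wall).

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cut_spring(C, k, u):
    """Compliance of K' = K - k*u*u^T, given C = K^{-1} (rank-one update)."""
    Cu = matvec(C, u)
    denom = 1.0 - k * dot(u, Cu)  # must stay nonzero: structure remains stable
    return [[C[i][j] + k * Cu[i] * Cu[j] / denom
             for j in range(len(C))] for i in range(len(C))]

k = 1.0
# Stiffness K and its pre-computed inverse (compliance) C for the 3-spring chain.
K = [[2 * k, -k], [-k, 2 * k]]
C = [[2 / (3 * k), 1 / (3 * k)], [1 / (3 * k), 2 / (3 * k)]]
# Cutting the middle spring removes k * u u^T from K, with u = (1, -1).
C_cut = cut_spring(C, k, [1.0, -1.0])
print(C_cut)  # approximately [[1, 0], [0, 1]]: two now-independent unit springs
```

    The O(n^2) update cost per cut, versus O(n^3) for re-inversion, is what makes this kind of correction attractive for real-time simulation.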

  18. Round Robin Study: Molecular Simulation of Thermodynamic Properties from Models with Internal Degrees of Freedom.

    PubMed

    Schappals, Michael; Mecklenfeld, Andreas; Kröger, Leif; Botan, Vitalie; Köster, Andreas; Stephan, Simon; García, Edder J; Rutkai, Gabor; Raabe, Gabriele; Klein, Peter; Leonhard, Kai; Glass, Colin W; Lenhard, Johannes; Vrabec, Jadran; Hasse, Hans

    2017-09-12

    Thermodynamic properties are often modeled by classical force fields which describe the interactions on the atomistic scale. Molecular simulations are used for retrieving thermodynamic data from such models, and many simulation techniques and computer codes are available for that purpose. In the present round robin study, the following fundamental question is addressed: Will different user groups working with different simulation codes obtain coinciding results within the statistical uncertainty of their data? A set of 24 simple simulation tasks is defined and solved by five user groups working with eight molecular simulation codes: DL_POLY, GROMACS, IMC, LAMMPS, ms2, NAMD, Tinker, and TOWHEE. Each task consists of the definition of (1) a pure fluid that is described by a force field and (2) the thermodynamic property and the conditions under which it is to be determined. The fluids are four simple alkanes: ethane, propane, n-butane, and iso-butane. All force fields consider internal degrees of freedom: OPLS, TraPPE, and a modified OPLS version with bond stretching vibrations. Density and potential energy are determined as a function of temperature and pressure on a grid which is specified such that all states are liquid. The user groups worked independently and reported their results to a central instance. The full set of results was disclosed to all user groups only at the end of the study. During the study, the central instance gave only qualitative feedback. The results reveal the challenges of carrying out molecular simulations. Several iterations were needed to eliminate gross errors. For most simulation tasks, the remaining deviations between the results of the different groups are acceptable from a practical standpoint, but they are often outside the statistical errors of the individual simulation data. However, there are also cases where the deviations are unacceptable. This study highlights similarities between computer experiments and laboratory experiments, which are both subject not only to statistical error but also to systematic error.

  19. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    NASA Astrophysics Data System (ADS)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  20. Impact of graphene-based nanomaterials (GBNMs) on the structural and functional conformations of hepcidin peptide

    NASA Astrophysics Data System (ADS)

    Singh, Krishna P.; Baweja, Lokesh; Wolkenhauer, Olaf; Rahman, Qamar; Gupta, Shailendra K.

    2018-03-01

    Graphene-based nanomaterials (GBNMs) are widely used in various industrial and biomedical applications. GBNMs of different compositions, sizes and shapes are being introduced without thorough toxicity evaluation due to the unavailability of regulatory guidelines. Computational toxicity prediction methods are used by regulatory bodies to quickly assess health hazards caused by newer materials. Due to the increasing demand for GBNMs of various sizes and functional groups in industrial and consumer applications, rapid and reliable computational toxicity assessment methods are urgently needed. In the present work, we investigate the impact of graphene and graphene oxide nanomaterials on the structural conformations of the small hepcidin peptide and compare the materials for the structural and conformational changes they induce. Our molecular dynamics simulation studies revealed conformational changes in hepcidin due to its interaction with GBNMs, resulting in a loss of its functional properties. Our results indicate that the hepcidin peptide undergoes more severe structural deformation on a graphene sheet than on a graphene oxide sheet. These observations suggest that graphene is more toxic than a graphene oxide nanosheet of similar area. Overall, this study indicates that computational methods based on structural deformation, using molecular dynamics (MD) simulations, can be used for the early evaluation of the toxicity potential of novel nanomaterials.

  1. How does ytterbium chloride interact with DMPC bilayers? A computational and experimental study.

    PubMed

    Gonzalez, Miguel A; Barriga, Hanna M G; Richens, Joanna L; Law, Robert V; O'Shea, Paul; Bresme, Fernando

    2017-03-29

    Lanthanide salts have been studied for many years, primarily in Nuclear Magnetic Resonance (NMR) experiments of mixed lipid-protein systems and more recently to study lipid flip-flop in model membrane systems. It is well recognised that lanthanide salts can influence the behaviour of both lipid and protein systems; however, a full molecular-level description of lipid-lanthanide interactions is still outstanding. Here we present a study of lanthanide-bilayer interactions, using molecular dynamics computer simulations, fluorescence electrostatic potential experiments and nuclear magnetic resonance. Computer simulations reveal the microscopic structure of DMPC lipid bilayers in the presence of Yb3+, and a surprising ability of the membranes to adsorb significant concentrations of Yb3+ without disrupting the overall membrane structure. At concentrations commonly used in NMR experiments, Yb3+ ions bind strongly to five lipids, inducing a small decrease of the area per lipid and a slight increase in the ordering of the aliphatic chains and the bilayer thickness. The area compressibility modulus increases by a factor of two with respect to the salt-free case, showing that Yb3+ ions make the bilayer more rigid. These modifications of the bilayer properties should be taken into account in the interpretation of NMR experiments.

  2. How to identify dislocations in molecular dynamics simulations?

    NASA Astrophysics Data System (ADS)

    Li, Duo; Wang, FengChao; Yang, ZhenYu; Zhao, YaPu

    2014-12-01

    Dislocations are of great importance in revealing the underlying mechanisms of deformed solid crystals. With the development of computational facilities and technologies, observation of dislocations at the atomic level through numerical simulations has become possible. Molecular dynamics (MD) simulation is a powerful tool for understanding and visualizing the creation of dislocations as well as the evolution of crystal defects. However, the numerical results from large-scale MD simulations are not very illuminating by themselves, and various techniques exist for analyzing dislocations and deformed crystal structures. It is therefore a challenge for newcomers to this community to choose a proper method to start their investigations. In this review, we summarize and discuss twelve existing structure characterization methods for MD simulations of deformed crystalline solids. A comprehensive comparison is made between the advantages and disadvantages of these typical techniques. We also examine some recent advances in the dynamics of dislocations related to hydraulic fracturing, where it has been found that dislocation emission has a significant effect on the propagation and bifurcation of the crack tip.

  3. Using Wavelet Analysis To Assist in Identification of Significant Events in Molecular Dynamics Simulations.

    PubMed

    Heidari, Zahra; Roe, Daniel R; Galindo-Murillo, Rodrigo; Ghasemi, Jahan B; Cheatham, Thomas E

    2016-07-25

    Long time scale molecular dynamics (MD) simulations of biological systems are becoming increasingly commonplace due to the availability of both large-scale computational resources and significant advances in the underlying simulation methodologies. Therefore, it is useful to investigate and develop data mining and analysis techniques to quickly and efficiently extract the biologically relevant information from the incredible amount of generated data. Wavelet analysis (WA) is a technique that can quickly reveal significant motions during an MD simulation. Here, the application of WA on well-converged long time scale (tens of μs) simulations of a DNA helix is described. We show how WA combined with a simple clustering method can be used to identify both the physical and temporal locations of events with significant motion in MD trajectories. We also show that WA can not only distinguish and quantify the locations and time scales of significant motions, but by changing the maximum time scale of WA a more complete characterization of these motions can be obtained. This allows motions of different time scales to be identified or ignored as desired.
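    The core idea above, that the largest wavelet detail coefficient localizes an abrupt change in both time and scale, can be sketched with a pure-Python Haar transform. This is a minimal illustration of the principle, not the CPPTRAJ implementation; the toy signal and function names are hypothetical.

```python
import math

# Minimal sketch: a multi-level Haar wavelet transform flags where in a time
# series an abrupt transition occurs, because the largest-magnitude detail
# coefficient sits at the event.

def haar_step(signal):
    """One Haar level: pairwise averages (approximation) and differences (detail)."""
    s = 1.0 / math.sqrt(2.0)
    approx = [(signal[2 * i] + signal[2 * i + 1]) * s for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) * s for i in range(len(signal) // 2)]
    return approx, detail

def locate_event(signal, levels=3):
    """Return (level, sample_index, coefficient) of the largest detail coefficient."""
    best = (0, 0, 0.0)
    current = list(signal)
    for level in range(1, levels + 1):
        current, detail = haar_step(current)
        for i, d in enumerate(detail):
            if abs(d) > abs(best[2]):
                best = (level, i * (2 ** level), d)  # map back to original samples
    return best

# Toy "trajectory observable": flat, then a sudden conformational jump at frame 63.
series = [0.0] * 63 + [1.0] * 65
level, frame, coeff = locate_event(series)
print(level, frame)  # prints "1 62": the finest scale localizes the jump
```

    Slower, drifting motions would instead produce their largest coefficients at coarser levels, which is how the scale of the detected motion falls out of the same analysis.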

  4. What the multiline signal (MLS) simulation data with average of weighted computations reveal about the Mn hyperfine interactions and oxidation states of the manganese cluster in OEC?

    NASA Astrophysics Data System (ADS)

    Baituti, Bernard

    2017-11-01

    Understanding the structure of the oxygen evolving complex (OEC) fully still remains a challenge. Lately, computational chemistry combined with data from more detailed X-ray diffraction (XRD) OEC structures has been used extensively to explore the mechanisms of water oxidation in the OEC (Gatt et al., J. Photochem. Photobiol. B 104(1-2), 80-93 2011). Knowledge of the oxidation states is crucial for understanding the core principles of catalysis by photosystem II (PSII) and the catalytic mechanism of the OEC. The present study involves simulation of the X-band continuous wave electron paramagnetic resonance (CW-EPR) S2-state signals, to investigate whether the data are in agreement with the four manganese ions in the OEC being organised as a '3 + 1' (trimer plus one) model (Gatt et al., Angew. Chem. Int. Ed. 51, 12025-12028 2012; Petrie et al., Chem. A Eur. J. 21, 6780-6792 2015; Terrett et al., Chem. Commun. (Camb.) 50, 8-11 2014) or a 'dimer of dimers' model (Terrett et al. 2016). The questions that remain are how much each Mn ion contributes to the "g = 2 multiline" signal through its hyperfine interactions in the OEC, and how to differentiate between the 'high oxidation state (HOS)' and 'low oxidation state (LOS)' paradigms. This is revealed in part by the structure of the multiline (ML) signal studied in this project. Two possibilities have been proposed for the redox levels of the Mn ions within the catalytic cluster, the so-called 'HOS' and 'LOS' paradigms (Gatt et al., J. Photochem. Photobiol. B 104(1-2), 80-93 2011). The data analysis involves numerical simulations of the experimental spectra on relevant models of the OEC cluster. The simulations of the X-band CW-EPR multiline spectra revealed three manganese ions having hyperfine couplings with large anisotropy. These are most likely Mn(III) centres, which clearly supports the 'LOS' OEC paradigm, with a mean oxidation state of 3.25 in the S2 state. This is consistent with earlier data by Jin et al. (Phys. Chem. Chem. Phys. (PCCP) 16(17), 7799-812 2014), but the present results clearly indicate that heterogeneity in hyperfine couplings exists in samples as typically prepared.

  5. Does footprint depth correlate with foot motion and pressure?

    PubMed Central

    Bates, K. T.; Savage, R.; Pataky, T. C.; Morse, S. A.; Webster, E.; Falkingham, P. L.; Ren, L.; Qian, Z.; Collins, D.; Bennett, M. R.; McClymont, J.; Crompton, R. H.

    2013-01-01

    Footprints are the most direct source of evidence about locomotor biomechanics in extinct vertebrates. One of the principal suppositions underpinning biomechanical inferences is that footprint geometry correlates with dynamic foot pressure, which, in turn, is linked with overall limb motion of the trackmaker. In this study, we perform the first quantitative test of this long-standing assumption, using topological statistical analysis of plantar pressures and experimental and computer-simulated footprints. In computer-simulated footprints, the relative distribution of depth differed from the distribution of both peak pressure and pressure impulse in all simulations. Analysis of footprint samples with common loading inputs and similar depths reveals that only shallow footprints lack significant topological differences between depth and pressure distributions. Topological comparison of plantar pressures and experimental beach footprints demonstrates that geometry is highly dependent on overall print depth; deeper footprints are characterized by greater relative forefoot, and particularly toe, depth than shallow footprints. The highlighted difference between ‘shallow’ and ‘deep’ footprints clearly emphasizes the need to understand variation in foot mechanics across different degrees of substrate compliance. Overall, our results indicate that extreme caution is required when applying the ‘depth equals pressure’ paradigm to hominin footprints, and by extension, those of other extant and extinct tetrapods. PMID:23516064

  6. WarpEngine, a Flexible Platform for Distributed Computing Implemented in the VEGA Program and Specially Targeted for Virtual Screening Studies.

    PubMed

    Pedretti, Alessandro; Mazzolari, Angelica; Vistoli, Giulio

    2018-05-21

    The manuscript describes WarpEngine, a novel platform implemented within the VEGA ZZ suite of software for performing distributed simulations in both local and wide area networks. Despite being tailored for structure-based virtual screening campaigns, WarpEngine possesses the flexibility required to carry out distributed calculations using various pieces of software, which can be easily encapsulated within the platform without changing their source code. WarpEngine takes advantage of all the cheminformatics features implemented in the VEGA ZZ program, as well as of its largely customizable scripting architecture, thus allowing efficient distribution of various time-demanding simulations. As an example of WarpEngine's potential, the manuscript includes a set of virtual screening campaigns based on the ACE data set of the DUD-E collection, using PLANTS as the docking application. Benchmarking analyses revealed satisfactory linearity of WarpEngine's performance, with speed-up values roughly equal to the number of utilized cores. Moreover, the computed scalability values showed that the vast majority (i.e., >90%) of the performed simulations benefit from the distributed platform presented here. WarpEngine can be freely downloaded along with the VEGA ZZ program at www.vegazz.net.
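    Speed-up and scalability figures of the kind reported above are easy to derive from wall-clock timings. The sketch below uses purely hypothetical numbers (not figures from the paper) and estimates the parallelizable fraction by inverting Amdahl's law.

```python
# Hypothetical timings, not data from the WarpEngine paper: how speed-up and a
# scalability (parallel-fraction) estimate follow from wall-clock measurements.

def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def parallel_fraction(t_serial, t_parallel, n_cores):
    """Invert Amdahl's law: T_p = T_1 * ((1 - p) + p / N)  =>  solve for p."""
    s = t_parallel / t_serial
    return (1.0 - s) / (1.0 - 1.0 / n_cores)

t1, tp, cores = 100.0, 13.0, 8          # assumed timings (e.g. hours) on 8 cores
print(round(speedup(t1, tp), 2))        # prints 7.69: roughly the core count
print(round(parallel_fraction(t1, tp, cores), 3))  # prints 0.994: >90% parallel
```

    A parallel fraction close to 1 is what "satisfactory linearity" implies: the remaining serial portion is small enough that adding cores keeps paying off.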

  7. Unveiling Stability Criteria of DNA-Carbon Nanotubes Constructs by Scanning Tunneling Microscopy and Computational Modeling

    DOE PAGES

    Kilina, Svetlana; Yarotski, Dzmitry A.; Talin, A. Alec; ...

    2011-01-01

    We present a combined approach that relies on computational simulations and scanning tunneling microscopy (STM) measurements to reveal morphological properties and stability criteria of carbon nanotube-DNA (CNT-DNA) constructs. Application of STM allows direct observation of very stable CNT-DNA hybrid structures with the well-defined DNA wrapping angle of 63.4 ° and a coiling period of 3.3 nm. Using force field simulations, we determine how the DNA-CNT binding energy depends on the sequence and binding geometry of a single strand DNA. This dependence allows us to quantitatively characterize the stability of a hybrid structure with an optimal π-stacking between DNA nucleotides and themore » tube surface and better interpret STM data. Our simulations clearly demonstrate the existence of a very stable DNA binding geometry for (6,5) CNT as evidenced by the presence of a well-defined minimum in the binding energy as a function of an angle between DNA strand and the nanotube chiral vector. This novel approach demonstrates the feasibility of CNT-DNA geometry studies with subnanometer resolution and paves the way towards complete characterization of the structural and electronic properties of drug-delivering systems based on DNA-CNT hybrids as a function of DNA sequence and a nanotube chirality.« less

  8. All Roads Lead to Computing: Making, Participatory Simulations, and Social Computing as Pathways to Computer Science

    ERIC Educational Resources Information Center

    Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri

    2017-01-01

    Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…

  9. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    ERIC Educational Resources Information Center

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  10. Encapsulating model complexity and landscape-scale analyses of state-and-transition simulation models: an application of ecoinformatics and juniper encroachment in sagebrush steppe ecosystems

    USGS Publications Warehouse

    O'Donnell, Michael

    2015-01-01

    State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need for modeling larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher-spatial-resolution vegetation products, and more complex models. In a case study across five different computing environments, the best configuration (high-throughput computing versus serial computation) decreased computing time by approximately 96.6%. A single multicore compute node, the slowest parallel configuration, still decreased computing time by 81.8% relative to serial computation. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
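    Monte Carlo replicates of a state-and-transition model parallelize trivially because each replicate is an independent seeded simulation. The sketch below is a toy model (not SyncroSim); the two-state transition, cell counts, and probability are all illustrative assumptions.

```python
import random
from concurrent.futures import ProcessPoolExecutor

# Minimal sketch of embarrassingly parallel Monte Carlo replicates of a toy
# state-and-transition model: each replicate maps onto one worker process.

P_ENCROACH = 0.05  # assumed per-step sagebrush -> juniper probability (toy value)

def run_replicate(args):
    """Simulate one landscape for n_steps; return the fraction encroached."""
    seed, n_cells, n_steps = args
    rng = random.Random(seed)      # per-replicate seed -> reproducible results
    cells = [0] * n_cells          # 0 = sagebrush steppe, 1 = juniper encroached
    for _ in range(n_steps):
        for i in range(n_cells):
            if cells[i] == 0 and rng.random() < P_ENCROACH:
                cells[i] = 1       # one-way transition in this toy model
    return sum(cells) / n_cells

def run_replicates(n_replicates, n_cells=200, n_steps=50):
    tasks = [(seed, n_cells, n_steps) for seed in range(n_replicates)]
    with ProcessPoolExecutor() as pool:
        return list(pool.map(run_replicate, tasks))

if __name__ == "__main__":
    fractions = run_replicates(20)
    print(min(fractions), max(fractions))  # spread across Monte Carlo replicates
```

    Because replicates share no state, the same code distributes across a cluster by assigning seed ranges to nodes, which is the structure that lets these models scale in high-throughput environments.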

  11. Methods for Computationally Efficient Structured CFD Simulations of Complex Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    Herrick, Gregory P.; Chen, Jen-Ping

    2012-01-01

    This research presents more efficient computational methods for performing multi-block structured Computational Fluid Dynamics (CFD) simulations of turbomachinery, thus facilitating higher-fidelity solutions of complicated geometries and their associated flows. This computational framework offers flexibility in allocating resources to balance process count and wall-clock computation time, while facilitating research into simulating axial compressor stall inception with more complete gridding of the flow passages and rotor tip clearance regions than is typically practiced with structured codes. The paradigm presented herein facilitates CFD simulation of previously impractical geometries and flows. These methods are validated and demonstrate improved computational efficiency when applied to complicated geometries and flows.

  12. Longitudinal train dynamics: an overview

    NASA Astrophysics Data System (ADS)

    Wu, Qing; Spiryagin, Maksym; Cole, Colin

    2016-12-01

    This paper discusses the evolution of longitudinal train dynamics (LTD) simulations, which covers numerical solvers, vehicle connection systems, air brake systems, wagon dumper systems and locomotives, resistance forces and gravitational components, vehicle in-train instabilities, and computing schemes. A number of potential research topics are suggested, such as modelling of friction, polymer, and transition characteristics for vehicle connection simulations, studies of wagon dumping operations, proper modelling of vehicle in-train instabilities, and computing schemes for LTD simulations. Evidence shows that LTD simulations have evolved with computing capabilities. Currently, advanced component models that directly describe the working principles of the operation of air brake systems, vehicle connection systems, and traction systems are available. Parallel computing is a good solution to combine and simulate all these advanced models. Parallel computing can also be used to conduct three-dimensional long train dynamics simulations.

  13. Constructing Neuronal Network Models in Massively Parallel Environments.

    PubMed

    Ippen, Tammo; Eppler, Jochen M; Plesser, Hans E; Diesmann, Markus

    2017-01-01

    Recent advances in the development of data structures to represent spiking neuron network models enable us to exploit the complete memory of petascale computers for a single brain-scale network simulation. In this work, we investigate how well we can exploit the computing power of such supercomputers for the creation of neuronal networks. Using an established benchmark, we divide the runtime of simulation code into the phase of network construction and the phase during which the dynamical state is advanced in time. We find that on multi-core compute nodes network creation scales well with process-parallel code but exhibits a prohibitively large memory consumption. Thread-parallel network creation, in contrast, exhibits speedup only up to a small number of threads but has little overhead in terms of memory. We further observe that the algorithms creating instances of model neurons and their connections scale well for networks of ten thousand neurons, but do not show the same speedup for networks of millions of neurons. Our work uncovers that the lack of scaling of thread-parallel network creation is due to inadequate memory allocation strategies and demonstrates that thread-optimized memory allocators recover excellent scaling. An analysis of the loop order used for network construction reveals that more complex tests on the locality of operations significantly improve scaling and reduce runtime by allowing construction algorithms to step through large networks more efficiently than in existing code. The combination of these techniques increases performance by an order of magnitude and harnesses the increasingly parallel compute power of the compute nodes in high-performance clusters and supercomputers.

  15. Examining the microtexture evolution in a hole-edge punched into 780 MPa grade hot-rolled steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, J.H.; Kim, M.S.

The deformation behavior in the hole-edge of 780 MPa grade hot-rolled steel during the punching process was investigated via microstructure characterization and computational simulation. Microstructure characterization was conducted to observe the edges of punched holes through the thickness direction, and electron back-scattered diffraction (EBSD) was used to analyze the heterogeneity of the deformation. Finite element analysis (FEA) that could account for a ductile fracture criterion was conducted to simulate the deformation and fracture behaviors of 780 MPa grade hot-rolled steel during the punching process. Calculation of rotation rate fields at the edges of the punched holes during the punching process revealed that metastable orientations in Euler space were confined to specific orientation groups. Rotation rate fields effectively explained the stability of the initial texture components in the hole-edge region during the punching process. A visco-plastic self-consistent (VPSC) polycrystal model was used to calculate the microtexture evolution in the hole-edge region during the punching process. FEA revealed that the heterogeneous effective strain was closely related to the heterogeneity of the Kernel average misorientation (KAM) distribution in the hole-edge region. A simulation of the deformation microtexture evolution in the hole-edge region using a VPSC model was in good agreement with the experimental results. - Highlights: •We analyzed the microstructure in a hole-edge punched in HR 780HB steel. •Rotation rate fields revealed the stability of the initial texture components. •Heterogeneous effective strain was closely related to the KAM distribution. •VPSC model successfully simulated the deformation microtexture evolution.

  16. Simulation Framework for Intelligent Transportation Systems

    DOT National Transportation Integrated Search

    1996-10-01

A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System. The simulator is designed for running on parallel computers and distributed (networked) computer systems, but ca...

  17. Thermodynamic and transport properties of nitrogen fluid: Molecular theory and computer simulations

    NASA Astrophysics Data System (ADS)

    Eskandari Nasrabad, A.; Laghaei, R.

    2018-04-01

    Computer simulations and various theories are applied to compute the thermodynamic and transport properties of nitrogen fluid. To model the nitrogen interaction, an existing potential in the literature is modified to obtain a close agreement between the simulation results and experimental data for the orthobaric densities. We use the Generic van der Waals theory to calculate the mean free volume and apply the results within the modified Cohen-Turnbull relation to obtain the self-diffusion coefficient. Compared to experimental data, excellent results are obtained via computer simulations for the orthobaric densities, the vapor pressure, the equation of state, and the shear viscosity. We analyze the results of the theory and computer simulations for the various thermophysical properties.
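The modified Cohen-Turnbull relation referenced above links the self-diffusion coefficient to the mean free volume. A minimal sketch of that dependence is given below; the prefactor, overlap factor `gamma`, and critical volume `v_star` are illustrative assumptions, not fitted nitrogen data.

```python
import math

def cohen_turnbull_diffusion(v_free, v_star=1.0, gamma=0.6, prefactor=1.0e-8):
    """Free-volume form D = A * exp(-gamma * v_star / v_free).

    v_free   : mean free volume per molecule (e.g., from Generic vdW theory)
    v_star   : critical free volume needed for a diffusive jump (illustrative)
    gamma    : overlap factor, typically 0.5-1.0 (illustrative)
    prefactor: kinetic prefactor A, in illustrative units
    """
    return prefactor * math.exp(-gamma * v_star / v_free)

# Self-diffusion grows sharply as the free volume opens up on heating.
diffusivities = [cohen_turnbull_diffusion(v) for v in (0.2, 0.5, 1.0)]
```

In the paper's workflow the mean free volume at each state point would come from the Generic van der Waals theory; here it is simply an input.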

  18. Development of an Output-based Adaptive Method for Multi-Dimensional Euler and Navier-Stokes Simulations

    NASA Technical Reports Server (NTRS)

    Darmofal, David L.

    2003-01-01

The use of computational simulations in the prediction of complex aerodynamic flows is becoming increasingly prevalent in the design process within the aerospace industry. Continuing advancements in both computing technology and algorithmic development are ultimately leading to attempts at simulating ever-larger, more complex problems. However, by increasing the reliance on computational simulations in the design cycle, we must also increase the accuracy of these simulations in order to maintain or improve the reliability and safety of the resulting aircraft. At the same time, large-scale computational simulations must be made more affordable so that their potential benefits can be fully realized within the design cycle. Thus, a continuing need exists for increasing the accuracy and efficiency of computational algorithms such that computational fluid dynamics can become a viable tool in the design of more reliable, safer aircraft. The objective of this research was the development of an error estimation and grid adaptive strategy for reducing simulation errors in integral outputs (functionals) such as lift or drag from multi-dimensional Euler and Navier-Stokes simulations. In this final report, we summarize our work during this grant.

  19. GATE Monte Carlo simulation in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Rowedder, Blake Austin

The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high performance cluster. This study investigated a reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute simulation was reduced to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high-performance computing continuing to fall in price and grow in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will become increasingly attractive.
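The inverse power model of runtime versus cluster size mentioned above can be recovered from timing data with a log-log least-squares fit, t(n) = t1 · n^(−p). The timings below are hypothetical, generated only to roughly match the reported 53 min serial run and ~3 min on 20 nodes; they are not the study's measurements.

```python
import math

def fit_inverse_power(nodes, runtimes):
    """Fit t(n) = t1 * n**(-p) by linear regression in log-log space.
    Returns (t1, p)."""
    xs = [math.log(n) for n in nodes]
    ys = [math.log(t) for t in runtimes]
    m = len(xs)
    xbar, ybar = sum(xs) / m, sum(ys) / m
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return math.exp(intercept), -slope

# Hypothetical timings (minutes) following an assumed exponent of 0.95.
nodes = [1, 2, 4, 8, 16, 20]
times = [53.0 * n ** -0.95 for n in nodes]
t1, p = fit_inverse_power(nodes, times)
```

A fitted exponent p near 1 would indicate close-to-ideal parallel scaling of the embarrassingly parallel MC workload.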

  20. Simulating Laboratory Procedures.

    ERIC Educational Resources Information Center

    Baker, J. E.; And Others

    1986-01-01

    Describes the use of computer assisted instruction in a medical microbiology course. Presents examples of how computer assisted instruction can present case histories in which the laboratory procedures are simulated. Discusses an authoring system used to prepare computer simulations and provides one example of a case history dealing with fractured…

  1. Real-time simulation of a spiking neural network model of the basal ganglia circuitry using general purpose computing on graphics processing units.

    PubMed

    Igarashi, Jun; Shouno, Osamu; Fukai, Tomoki; Tsujino, Hiroshi

    2011-11-01

    Real-time simulation of a biologically realistic spiking neural network is necessary for evaluation of its capacity to interact with real environments. However, the real-time simulation of such a neural network is difficult due to its high computational costs that arise from two factors: (1) vast network size and (2) the complicated dynamics of biologically realistic neurons. In order to address these problems, mainly the latter, we chose to use general purpose computing on graphics processing units (GPGPUs) for simulation of such a neural network, taking advantage of the powerful computational capability of a graphics processing unit (GPU). As a target for real-time simulation, we used a model of the basal ganglia that has been developed according to electrophysiological and anatomical knowledge. The model consists of heterogeneous populations of 370 spiking model neurons, including computationally heavy conductance-based models, connected by 11,002 synapses. Simulation of the model has not yet been performed in real-time using a general computing server. By parallelization of the model on the NVIDIA Geforce GTX 280 GPU in data-parallel and task-parallel fashion, faster-than-real-time simulation was robustly realized with only one-third of the GPU's total computational resources. Furthermore, we used the GPU's full computational resources to perform faster-than-real-time simulation of three instances of the basal ganglia model; these instances consisted of 1100 neurons and 33,006 synapses and were synchronized at each calculation step. Finally, we developed software for simultaneous visualization of faster-than-real-time simulation output. These results suggest the potential power of GPGPU techniques in real-time simulation of realistic neural networks. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Inflight IFR procedures simulator

    NASA Technical Reports Server (NTRS)

    Parker, L. C. (Inventor)

    1984-01-01

An inflight IFR procedures simulator for generating signals and commands to conventional instruments provided in an airplane is described. The simulator includes a signal synthesizer which, upon being activated, generates predetermined simulated signals corresponding to signals normally received from remote sources. A computer is connected to the signal synthesizer and causes it to produce simulated signals responsive to programs fed into the computer. A switching network is connected to the signal synthesizer, the antenna of the aircraft, and the navigational instruments and communication devices for selectively connecting the instruments and devices to the synthesizer and disconnecting the antenna from the navigational instruments and communication devices. Pressure transducers are connected to the altimeter and speed indicator for supplying electrical signals to the computer indicating the altitude and speed of the aircraft. A compass is connected for supplying electrical signals to the computer indicating the heading of the airplane. The computer, upon receiving signals from the pressure transducers and compass, computes the signals that are fed to the signal synthesizer, which, in turn, generates simulated navigational signals.

  3. Modeling Effects of Annealing on Coal Char Reactivity to O2 and CO2, Based on Preparation Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Troy; Bhat, Sham; Marcy, Peter

Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive computational fluid dynamics (CFD) simulations are valuable tools in evaluating and deploying oxyfuel and other carbon capture technologies, either as retrofit technologies or for new construction. However, accurate predictive combustor simulations require physically realistic submodels with low computational requirements. A recent sensitivity analysis of a detailed char conversion model (Char Conversion Kinetics (CCK)) found thermal annealing to be an extremely sensitive submodel. In the present work, further analysis of the previous annealing model revealed significant disagreement with numerous datasets from experiments performed after that annealing model was developed. The annealing model was accordingly extended to reflect experimentally observed reactivity loss due to the thermal annealing of a variety of coals under diverse char preparation conditions. The model extension was informed by a Bayesian calibration analysis. In addition, since oxyfuel conditions include extraordinarily high levels of CO2, the development of a first-ever CO2 reactivity loss model due to annealing is presented.

  4. Modeling Effects of Annealing on Coal Char Reactivity to O2 and CO2, Based on Preparation Conditions

    DOE PAGES

    Holland, Troy; Bhat, Sham; Marcy, Peter; ...

    2017-08-25

Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive computational fluid dynamics (CFD) simulations are valuable tools in evaluating and deploying oxyfuel and other carbon capture technologies, either as retrofit technologies or for new construction. However, accurate predictive combustor simulations require physically realistic submodels with low computational requirements. A recent sensitivity analysis of a detailed char conversion model (Char Conversion Kinetics (CCK)) found thermal annealing to be an extremely sensitive submodel. In the present work, further analysis of the previous annealing model revealed significant disagreement with numerous datasets from experiments performed after that annealing model was developed. The annealing model was accordingly extended to reflect experimentally observed reactivity loss due to the thermal annealing of a variety of coals under diverse char preparation conditions. The model extension was informed by a Bayesian calibration analysis. In addition, since oxyfuel conditions include extraordinarily high levels of CO2, the development of a first-ever CO2 reactivity loss model due to annealing is presented.

  5. Competition of information channels in the spreading of innovations

    NASA Astrophysics Data System (ADS)

    Kocsis, Gergely; Kun, Ferenc

    2011-08-01

We study the spreading of information on technological developments in socioeconomic systems where the social contacts of agents are represented by a network of connections. In the model, agents get informed about the existence and advantages of new innovations through advertising activities of producers, which are then followed by an interagent information transfer. Computer simulations revealed that, as the strength of the external driving, the interagent coupling, and the topology of social contacts are varied, the model presents complex behavior with interesting novel features: On the macrolevel the system exhibits logistic behavior typical for the diffusion of innovations. The time evolution can be described analytically by an integral equation that captures the nucleation and growth of clusters of informed agents. On the microlevel, small clusters are found to be compact with a crossover to fractal structures with increasing size. The distribution of cluster sizes has a power-law behavior with a crossover to a higher exponent when long-range social contacts are present in the system. Based on computer simulations we construct an approximate phase diagram of the model on a regular square lattice of agents.
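The macrolevel logistic behavior, driven by advertising (external driving) plus interagent information transfer, can be illustrated with a mean-field, Bass-style rate equation dF/dt = (p + qF)(1 − F). This is a textbook stand-in, not the paper's integral equation, and the rates below are arbitrary.

```python
def adoption_curve(p=0.01, q=0.4, steps=2000, dt=0.05):
    """Euler integration of dF/dt = (p + q*F) * (1 - F), where F is the
    informed fraction, p the external (advertising) rate, and q the
    interagent transfer rate.  Illustrative rates, not calibrated values."""
    F, traj = 0.0, []
    for _ in range(steps):
        F += dt * (p + q * F) * (1.0 - F)   # logistic-type growth
        traj.append(F)
    return traj

curve = adoption_curve()
```

The resulting S-shaped curve saturates at full adoption, matching the logistic macrolevel behavior the simulations report.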

  6. Data Assimilation and Propagation of Uncertainty in Multiscale Cardiovascular Simulation

    NASA Astrophysics Data System (ADS)

    Schiavazzi, Daniele; Marsden, Alison

    2015-11-01

Cardiovascular modeling is the application of computational tools to predict hemodynamics. State-of-the-art techniques couple a 3D incompressible Navier-Stokes solver with a boundary circulation model and can predict local and peripheral hemodynamics, analyze the post-operative performance of surgical designs, and complement clinical data collection, minimizing invasive and risky measurement practices. The ability of these tools to make useful predictions is directly related to their accuracy in representing measured physiologies. Tuning of model parameters is therefore a topic of paramount importance and should include clinical data uncertainty, revealing how this uncertainty will affect the predictions. We propose a fully Bayesian, multi-level approach to data assimilation of uncertain clinical data in multiscale circulation models. To reduce the computational cost, we use a stable, condensed approximation of the 3D model built by linear sparse regression of the pressure/flow rate relationship at the outlets. Finally, we consider the problem of non-invasively propagating the uncertainty in model parameters to the resulting hemodynamics and compare Monte Carlo simulation with Stochastic Collocation approaches based on Polynomial or Multi-resolution Chaos expansions.

  7. Modulation of phase transition of thermosensitive liposomes with leucine zipper-structured lipopeptides.

    PubMed

    Xu, Xiejun; Xiao, Xingqing; Wang, Yiming; Xu, Shouhong; Liu, Honglai

    2018-06-13

    Targeted therapy for cancer requires thermosensitive components in drug carriers for controlled drug release against viral cells. The conformational transition characteristic of leucine zipper-structured lipopeptides is utilized in our lab to modulate the phase transition temperature of liposomes, thus achieving temperature-responsive control. In this study, we computationally examined the conformational transition behaviors of leucine zipper-structured lipopeptides that were modified at the N-terminus by distinct functional groups. The conformational transition temperatures of these lipopeptides were determined by structural analysis of the implicit-solvent replica exchange molecular dynamics simulation trajectories using the dihedral angle principal component analysis and the dictionary of protein secondary structure method. Our calculations revealed that the computed transition temperatures of the lipopeptides are in good agreement with the experimental measurements. The effect of hydrogen bonds on the conformational stability of the lipopeptide dimers was examined in conventional explicit-solvent molecular dynamics simulations. A quantitative correlation of the degree of structural dissociation of the dimers and their binding strength is well described by an exponential fit of the binding free energies to the conformation transition temperatures of the lipopeptides.

  8. Competition of information channels in the spreading of innovations.

    PubMed

    Kocsis, Gergely; Kun, Ferenc

    2011-08-01

We study the spreading of information on technological developments in socioeconomic systems where the social contacts of agents are represented by a network of connections. In the model, agents get informed about the existence and advantages of new innovations through advertising activities of producers, which are then followed by an interagent information transfer. Computer simulations revealed that, as the strength of the external driving, the interagent coupling, and the topology of social contacts are varied, the model presents complex behavior with interesting novel features: On the macrolevel the system exhibits logistic behavior typical for the diffusion of innovations. The time evolution can be described analytically by an integral equation that captures the nucleation and growth of clusters of informed agents. On the microlevel, small clusters are found to be compact with a crossover to fractal structures with increasing size. The distribution of cluster sizes has a power-law behavior with a crossover to a higher exponent when long-range social contacts are present in the system. Based on computer simulations we construct an approximate phase diagram of the model on a regular square lattice of agents.

  9. Mechanism of ion adsorption to aqueous interfaces: Graphene/water vs. air/water.

    PubMed

    McCaffrey, Debra L; Nguyen, Son C; Cox, Stephen J; Weller, Horst; Alivisatos, A Paul; Geissler, Phillip L; Saykally, Richard J

    2017-12-19

The adsorption of ions to aqueous interfaces is a phenomenon that profoundly influences vital processes in many areas of science, including biology, atmospheric chemistry, electrical energy storage, and water process engineering. Although classical electrostatics theory predicts that ions are repelled from water/hydrophobe (e.g., air/water) interfaces, both computer simulations and experiments have shown that chaotropic ions actually exhibit enhanced concentrations at the air/water interface. Although mechanistic pictures have been developed to explain this counterintuitive observation, their general applicability, particularly in the presence of material substrates, remains unclear. Here we investigate ion adsorption to the model interface formed by water and graphene. Deep UV second harmonic generation measurements of the SCN⁻ ion, a prototypical chaotrope, determined a free energy of adsorption within error of that for air/water. Unlike for the air/water interface, wherein repartitioning of the solvent energy drives ion adsorption, our computer simulations reveal that direct ion/graphene interactions dominate the favorable enthalpy change. Moreover, the graphene sheets dampen capillary waves such that rotational anisotropy of the solute, if present, is the dominant entropy contribution, in contrast to the air/water interface.

  10. On-the-fly scheduling as a manifestation of partial-order planning and dynamic task values.

    PubMed

    Hannah, Samuel D; Neal, Andrew

    2014-09-01

    The aim of this study was to develop a computational account of the spontaneous task ordering that occurs within jobs as work unfolds ("on-the-fly task scheduling"). Air traffic control is an example of work in which operators have to schedule their tasks as a partially predictable work flow emerges. To date, little attention has been paid to such on-the-fly scheduling situations. We present a series of discrete-event models fit to conflict resolution decision data collected from experienced controllers operating in a high-fidelity simulation. Our simulations reveal air traffic controllers' scheduling decisions as examples of the partial-order planning approach of Hayes-Roth and Hayes-Roth. The most successful model uses opportunistic first-come-first-served scheduling to select tasks from a queue. Tasks with short deadlines are executed immediately. Tasks with long deadlines are evaluated to assess whether they need to be executed immediately or deferred. On-the-fly task scheduling is computationally tractable despite its surface complexity and understandable as an example of both the partial-order planning strategy and the dynamic-value approach to prioritization.
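The most successful scheduling model described above, first-come-first-served with immediate execution of short-deadline tasks and deferral of long-deadline ones, can be sketched as a small discrete-event loop. The task format, unit service time, evaluation cost, and deadline threshold are all illustrative assumptions, not the paper's fitted parameters.

```python
from collections import deque

def schedule(tasks, now=0.0, imminent=2.0):
    """FCFS queue with deadline triage: a task whose deadline falls within
    `imminent` time units is executed at once; otherwise it is evaluated and
    deferred to the back of the queue."""
    queue = deque(tasks)              # (name, absolute_deadline), arrival order
    executed = []
    while queue:
        name, deadline = queue.popleft()
        if deadline - now <= imminent:
            executed.append(name)     # imminent: act on it immediately
            now += 1.0                # assumed unit service time
        else:
            queue.append((name, deadline))
            now += 0.1                # assumed cost of evaluating and deferring
    return executed

# Task A has a long deadline and is repeatedly deferred; B and C are imminent.
order = schedule([("A", 10.0), ("B", 1.0), ("C", 3.0)])
```

Here the long-deadline conflict ("A") keeps cycling to the back of the queue until its deadline becomes imminent, so the execution order is B, C, A.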

  11. Scalable Parameter Estimation for Genome-Scale Biochemical Reaction Networks

    PubMed Central

    Kaltenbacher, Barbara; Hasenauer, Jan

    2017-01-01

    Mechanistic mathematical modeling of biochemical reaction networks using ordinary differential equation (ODE) models has improved our understanding of small- and medium-scale biological processes. While the same should in principle hold for large- and genome-scale processes, the computational methods for the analysis of ODE models which describe hundreds or thousands of biochemical species and reactions are missing so far. While individual simulations are feasible, the inference of the model parameters from experimental data is computationally too intensive. In this manuscript, we evaluate adjoint sensitivity analysis for parameter estimation in large scale biochemical reaction networks. We present the approach for time-discrete measurement and compare it to state-of-the-art methods used in systems and computational biology. Our comparison reveals a significantly improved computational efficiency and a superior scalability of adjoint sensitivity analysis. The computational complexity is effectively independent of the number of parameters, enabling the analysis of large- and genome-scale models. Our study of a comprehensive kinetic model of ErbB signaling shows that parameter estimation using adjoint sensitivity analysis requires a fraction of the computation time of established methods. The proposed method will facilitate mechanistic modeling of genome-scale cellular processes, as required in the age of omics. PMID:28114351
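The key property claimed above, computational cost essentially independent of the number of parameters, comes from solving one backward (adjoint) ODE instead of one forward sensitivity ODE per parameter. A toy scalar sketch for dy/dt = −θy with terminal loss L = (y(T) − data)², checked against finite differences; the problem and all values are illustrative, far smaller than the genome-scale models discussed.

```python
def forward_loss(theta, y0=2.0, data=0.5, T=1.0, steps=10000):
    """Forward Euler integration of dy/dt = -theta*y, terminal squared loss."""
    dt, y = T / steps, y0
    for _ in range(steps):
        y += dt * (-theta * y)
    return (y - data) ** 2

def adjoint_gradient(theta, y0=2.0, data=0.5, T=1.0, steps=10000):
    """dL/dtheta via one forward sweep (storing y) and one backward adjoint
    sweep -- the cost does not grow with the number of parameters."""
    dt = T / steps
    ys = [y0]
    for _ in range(steps):                 # forward sweep, store trajectory
        ys.append(ys[-1] + dt * (-theta * ys[-1]))
    lam = 2.0 * (ys[-1] - data)            # lambda(T) = dL/dy(T)
    grad = 0.0
    for k in range(steps, 0, -1):          # backward sweep: dlam/dt = theta*lam
        grad += dt * lam * (-ys[k])        # accumulate lam * df/dtheta, df/dtheta = -y
        lam -= dt * theta * lam            # step lambda back from t_k to t_{k-1}
    return grad

g_adj = adjoint_gradient(1.0)
eps = 1e-5
g_fd = (forward_loss(1.0 + eps) - forward_loss(1.0 - eps)) / (2 * eps)
```

For many parameters the backward sweep is unchanged; only the cheap gradient accumulation gains one term per parameter, which is the scalability advantage the comparison in the abstract reports.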

  12. Computational structural mechanics engine structures computational simulator

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1989-01-01

    The Computational Structural Mechanics (CSM) program at Lewis encompasses: (1) fundamental aspects for formulating and solving structural mechanics problems, and (2) development of integrated software systems to computationally simulate the performance/durability/life of engine structures.

  13. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing

    PubMed Central

    Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui

    2017-01-01

Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered to be a comprehensive data-intensive and computing-intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration, and scheduling. The experimental results show that the cloud computing based algorithm achieves a 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy yields a further improvement of about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing-intensive and data-intensive issues in SAR raw data simulation, and is easily extended to large-scale computing to achieve higher acceleration. PMID:28075343
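The MapReduce decomposition described above, mapping raw-data computation over independent chunks and then reducing by accumulation, can be sketched in miniature. The "echo" arithmetic below is a toy stand-in for SAR pulse simulation; the chunk size, array length, and target list are illustrative, not the paper's workload.

```python
from functools import reduce

def map_chunk(chunk):
    """Map step: compute the partial echo contribution of one block of point
    targets (a toy stand-in for pulse-by-pulse raw-data computation)."""
    echo = [0.0] * 8                      # one coarse range line
    for pos, amplitude in chunk:
        echo[pos % 8] += amplitude        # irregular accumulation per target
    return echo

def reduce_echoes(a, b):
    """Reduce step: element-wise sum of partial raw-data arrays."""
    return [x + y for x, y in zip(a, b)]

# Split 32 unit-amplitude targets across 4 worker-sized chunks.
targets = [(i, 1.0) for i in range(32)]
chunks = [targets[i:i + 8] for i in range(0, 32, 8)]
raw = reduce(reduce_echoes, map(map_chunk, chunks))
```

In the actual system the map tasks would run on separate Hadoop workers with HDFS handling the chunk I/O; the reduction is associative, so partial sums can be combined in any order.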

  14. Soliton microdynamics of the generation of new-type nonlinear surface vibrations, dissociation, and surfing diffusion in diatomic crystals of the uranium nitride type

    NASA Astrophysics Data System (ADS)

    Dubovsky, O. A.; Semenov, V. A.; Orlov, A. V.; Sudarev, V. V.

    2014-09-01

    The microdynamics of large-amplitude nonlinear vibrations of uranium nitride diatomic lattices has been investigated using the computer simulation and neutron scattering methods at temperatures T = 600-2500°C near the thresholds of the dissociation and destruction of the reactor fuel materials. It has been found using the computer simulation that, in the spectral gap between the frequency bands of acoustic and optical phonons in crystals with an open surface, there are resonances of new-type harmonic surface vibrations and a gap-filling band of their genetic successors, i.e., nonlinear surface vibrations. Experimental measurements of the slow neutron scattering spectra of uranium nitride on the DIN-2PI neutron spectrometer have revealed resonances and bands of these surface vibrations in the spectral gap, as well as higher optical vibration overtones. It has been shown that the solitons and bisolitons initiate the formation and collapse of dynamic pores with the generation of surface vibrations at the boundaries of the cavities, evaporation of atoms and atomic clusters, formation of cracks, and destruction of the material. It has been demonstrated that the mass transfer of nitrogen in cracks and along grain boundaries can occur through the revealed microdynamics mechanism of the surfing diffusion of light nitrogen atoms at large-amplitude soliton waves propagating in the stabilizing sublattice of heavy uranium atoms and in the nitrogen sublattice.

  15. How Effective Is Instructional Support for Learning with Computer Simulations?

    ERIC Educational Resources Information Center

    Eckhardt, Marc; Urhahne, Detlef; Conrad, Olaf; Harms, Ute

    2013-01-01

    The study examined the effects of two different instructional interventions as support for scientific discovery learning using computer simulations. In two well-known categories of difficulty, data interpretation and self-regulation, instructional interventions for learning with computer simulations on the topic "ecosystem water" were developed…

  16. SIGMA--A Graphical Approach to Teaching Simulation.

    ERIC Educational Resources Information Center

    Schruben, Lee W.

    1992-01-01

    SIGMA (Simulation Graphical Modeling and Analysis) is a computer graphics environment for building, testing, and experimenting with discrete event simulation models on personal computers. It uses symbolic representations (computer animation) to depict the logic of large, complex discrete event systems for easier understanding and has proven itself…

  17. Application of technology developed for flight simulation at NASA. Langley Research Center

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1991-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations including mathematical model computation and data input/output to the simulators must be deterministic and be completed in as short a time as possible. Personnel at NASA's Langley Research Center are currently developing the use of supercomputers for simulation mathematical model computation for real-time simulation. This, coupled with the use of an open systems software architecture, will advance the state-of-the-art in real-time flight simulation.

  18. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  19. Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop: August 4-5, 2015, Washington, D.C.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael A.; Boldyrev, Stanislav; Fischer, Paul

    This report details the impact exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the DOE applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  20. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  1. Statistical Trajectory Estimation Program (STEP) implementation for BLDT post flight trajectory simulation

    NASA Technical Reports Server (NTRS)

    Shields, W. E.

    1973-01-01

    Tests were conducted to provide flight conditions for qualifying the Viking Decelerator System in a simulated Mars environment. A balloon launched decelerator test (BLDT) vehicle which has an external shape similar to the actual Mars Viking Lander Capsule was used so that the decelerator would be deployed in the wake of a blunt body. An effort was made to simulate the BLDT vehicle flights from the time they were dropped from the balloon, through decelerator deployment, until stable decelerator conditions were reached. The procedure used to simulate these flights using the Statistical Trajectory Estimation Program (STEP) is discussed. Using primarily ground-based position radar and vehicle onboard rate gyro and accelerometer data, the STEP produces a minimum variance solution of the vehicle trajectory and calculates vehicle attitude histories. Using film from cameras in the vehicle along with a computer program, attitude histories for portions of the flight before and after decelerator deployment were calculated independent of the STEP simulation. With the assumption that the vehicle motions derived from camera data are accurate, a comparison reveals that STEP was able to simulate vehicle motions for all flights both before and after decelerator deployment.
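    The "minimum variance solution" that STEP produces from radar, rate-gyro, and accelerometer data can be illustrated in heavily simplified form by inverse-variance weighting of two independent measurements of the same quantity. The sensor labels and numbers below are hypothetical, not taken from the report:

```python
def fuse(x1, var1, x2, var2):
    """Minimum-variance (inverse-variance weighted) combination of two
    independent, unbiased measurements of the same quantity."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    estimate = (w1 * x1 + w2 * x2) / (w1 + w2)
    variance = 1.0 / (w1 + w2)          # always <= min(var1, var2)
    return estimate, variance

# Hypothetical example: a radar position fix (variance 4 m^2) fused with
# a position propagated from accelerometer data (variance 16 m^2).
est, var = fuse(100.0, 4.0, 106.0, 16.0)
print(est, var)  # → 101.2 3.2
```

    The fused variance is smaller than either input variance, which is why combining ground-based radar with onboard inertial data improves the trajectory estimate.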

  2. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    ERIC Educational Resources Information Center

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  3. Development of an E-Prime Based Computer Simulation of an Interactive Human Rights Violation Negotiation Script (Developpement d’un Programme de Simulation par Ordinateur Fonde sur le Logiciel E Prime pour la Negociation Interactive en cas de Violation des Droits de la Personne)

    DTIC Science & Technology

    2010-12-01

    Base (CFB) Kingston. The computer simulation developed in this project is intended to be used for future research and as a possible training platform... DRDC Toronto No. CR 2010-055... This report describes the method of developing an E-Prime computer simulation of an interactive Human Rights Violation (HRV) negotiation.

  4. Outcomes from the DOE Workshop on Turbulent Flow Simulation at the Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael; Boldyrev, Stanislav; Chang, Choong-Seock

    This paper summarizes the outcomes from the Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop, which was held 4-5 August 2015, and was sponsored by the U.S. Department of Energy Office of Advanced Scientific Computing Research. The workshop objective was to define and describe the challenges and opportunities that computing at the exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the U.S. Department of Energy applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  5. Conformational Transition Pathways in Signaling and Enzyme Catalysis Explored by Computational Methods

    NASA Astrophysics Data System (ADS)

    Pachov, Dimitar V.

    Biomolecules are dynamic in nature and visit a number of states while performing their biological function. However, understanding how they interconvert between functional substates is a challenging task. In this thesis, we employ enhanced computational strategies to reveal, at atomistic resolution, transition states and molecular mechanisms along conformational pathways of the signaling protein Nitrogen Regulatory Protein C (NtrC) and the enzyme Adenylate Kinase (Adk). Targeted Molecular Dynamics (TMD) simulations and NMR experiments have previously found that the active/inactive interconversion of NtrC is stabilized by non-native transient contacts. To find where along the conformational pathway these contacts lie and to probe the existence of multiple intermediates, an extensive (more than 8 μs) mapping of the conformational landscape was performed with a multitude of straightforward MD simulations relaxed from the biased TMD pathway. A number of metastable states stabilized by local interactions was found to underlie the conformational pathway of NtrC. Two spontaneous transitions of the last stage of the active-to-inactive conversion were identified and used in path-sampling procedures to generate an ensemble of truly dynamic reactive pathways. The transition state ensemble (TSE) and mechanistic descriptors of this transition were revealed in atomic detail and verified by committor analysis. By analyzing how pressure affects the dynamics and function of two homologous Adk proteins (the P. profundum Adk, which survives at 700 atm in the deep sea, and the E. coli Adk, which lives at ambient pressure), we indirectly obtained atomic information about the TSE of the large-amplitude, rate-limiting conformational opening of the Adk lids. Guided by NMR experiments showing significantly decreased activation volumes of the piezophile compared to its mesophilic counterpart, TMD simulations revealed the formation of an extended hydrogen-bonded water network in the transition state of the piezophile that can explain the experimentally measured differences in activation volume. The transition state of the conformational change was proposed to lie close to the closed state. Additionally, a number of descriptors were used to characterize the free energy landscape of the mesophile. The features of the landscape were found to be highly sensitive to the binding of different ligands, their protonation states, and the presence of magnesium.

  6. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present an application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or if the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight into device performance and aids in topology and system optimization.
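    As a minimal sketch of the discrete-event paradigm the record describes (not the OMNEST model used in the study), the following simulates a single switch output port with Poisson packet arrivals and a fixed service time, processing events in time order from a priority queue:

```python
import heapq
import random

def mean_delay(n_packets=1000, arrival_rate=0.8, service_time=1.0, seed=1):
    """Minimal discrete-event sketch of a single-output-port switch:
    Poisson arrivals are queued and served FIFO at a fixed rate.
    Returns the mean time a packet spends in the system."""
    rng = random.Random(seed)
    events, t = [], 0.0
    for _ in range(n_packets):            # schedule all arrival events
        t += rng.expovariate(arrival_rate)
        heapq.heappush(events, t)
    busy_until, delays = 0.0, []
    while events:                         # process events in time order
        arrival = heapq.heappop(events)
        start = max(arrival, busy_until)  # wait while the port is busy
        busy_until = start + service_time # departure time of this packet
        delays.append(busy_until - arrival)
    return sum(delays) / len(delays)

d = mean_delay()
print(round(d, 2))
```

    With the arrival rate below the service rate the queue stays stable; pushing `arrival_rate` toward 1.0 makes the mean delay grow sharply, the kind of congestion effect such switch studies quantify.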

  7. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation into simulation host computer concepts by Southwest Research Institute (SwRI) are presented; the investigation is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  8. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    NASA Astrophysics Data System (ADS)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher-resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations, including the Madden-Julian oscillation (MJO), but only through case studies. Thanks to the big leap in computational performance of the K computer, we could greatly increase the number of MJO events simulated, in addition to extending integration times and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with that of the relatively coarser operational models currently in use. The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.

  9. A study of workstation computational performance for real-time flight simulation

    NASA Technical Reports Server (NTRS)

    Maddalon, Jeffrey M.; Cleveland, Jeff I., II

    1995-01-01

    With recent advances in microprocessor technology, some have suggested that modern workstations provide enough computational power to properly operate a real-time simulation. This paper presents the results of a computational benchmark, based on actual real-time flight simulation code used at Langley Research Center, which was executed on various workstation-class machines. The benchmark was executed on different machines from several companies including: CONVEX Computer Corporation, Cray Research, Digital Equipment Corporation, Hewlett-Packard, Intel, International Business Machines, Silicon Graphics, and Sun Microsystems. The machines are compared by their execution speed, computational accuracy, and porting effort. The results of this study show that the raw computational power needed for real-time simulation is now offered by workstations.

  10. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools, a complex software stack, and large, scalable compute and data-analysis resources, due to the large computational cost of Monte Carlo workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This creates a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
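    A minimal example of the kind of stochastic simulation such platforms distribute (illustrative only; PyURDME solves spatial reaction-diffusion models, which are more involved) is Gillespie's algorithm for a well-mixed birth-death process:

```python
import random

def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0, seed=42):
    """Gillespie stochastic simulation of a birth-death process:
    production at rate k_birth, degradation at rate k_death * x.
    Returns the copy number at t_end; the steady-state mean is
    k_birth / k_death."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    while True:
        a_birth, a_death = k_birth, k_death * x
        a_total = a_birth + a_death
        t += rng.expovariate(a_total)        # waiting time to next reaction
        if t > t_end:
            return x
        if rng.random() * a_total < a_birth: # pick a reaction by propensity
            x += 1                           # birth
        else:
            x -= 1                           # death

x_end = gillespie_birth_death()
print(x_end)
```

    Each run is one random realization, so many independent trajectories are needed to estimate distributions; that Monte Carlo character is exactly why distributed cloud resources pay off.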

  11. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    PubMed

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel Core (TM) 2 Quad Q6600 CPU and a Geforce 8800GT GPU, with software support from OpenMP and CUDA. It was tested in three parallelization setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus one CPU core, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setup (c). In the simulation with 1600 time steps, the speedup of the parallel computation compared to the serial computation was 3.9 in setup (a), 16.8 in setup (b), and 20.0 in setup (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
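    The reported speedups are roughly additive (3.9 + 16.8 ≈ 20.0), which is what an idealized concurrent CPU-plus-GPU work split predicts. The sketch below is my illustration of that model, not the authors' load-prediction scheduler:

```python
def combined_speedup(s_cpu, s_gpu):
    """Idealized model: CPU and GPU work concurrently on disjoint shares
    of the problem. With serial time T = 1, giving fraction f to the GPU
    takes max(f / s_gpu, (1 - f) / s_cpu); the optimum balances both
    sides, so the combined rate is simply additive."""
    f = s_gpu / (s_cpu + s_gpu)       # optimal GPU share of the work
    return f, s_cpu + s_gpu

f, s = combined_speedup(3.9, 16.8)    # speedups reported in the abstract
print(round(f, 3), round(s, 1))       # GPU gets ~81% of the work; ~20.7x
```

    The ideal 20.7x versus the measured 20.0x suggests the dynamic scheduler came close to a perfect static split, with a small residue of load-imbalance and transfer overhead.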

  12. Insights on surface spalling of rock

    NASA Astrophysics Data System (ADS)

    Tarokh, Ali; Kao, Chu-Shu; Fakhimi, Ali; Labuz, Joseph F.

    2016-07-01

    Surface spalling is a complex failure phenomenon that features crack propagation and detachment of thin pieces of rock near free surfaces, particularly in brittle rock around underground excavations when large in situ stresses are involved. A surface instability apparatus was used to study failure of rock close to a free surface, and damage evolution was monitored by digital image correlation (DIC). Lateral displacement at the free face was used as the feedback signal to control the post-peak response of the specimen. DIC was implemented in order to obtain the incremental displacement fields during the spalling process. Displacement fields were computed in the early stage of loading as well as close to the peak stress. Fracture from the spalling phenomenon was revealed by incremental lateral displacement contours. The axial and lateral displacements suggested that the displacement gradient was uniform in both directions at early loading stages; as the load increased, the free-face effect started to influence the displacements, especially the lateral displacement field. A numerical approach, based on the discrete element method, was developed and validated from element testing. Damage evolution and localization observed in numerical simulations were similar to those observed in experiments. By performing simulations in two and three dimensions, it was revealed that the intermediate principal stress and the platen-rock interfaces have important effects on the simulation of surface spalling.

  13. Inhibition of cytochrome P450 3A by acetoxylated analogues of resveratrol in in vitro and in silico models

    NASA Astrophysics Data System (ADS)

    Basheer, Loai; Schultz, Keren; Kerem, Zohar

    2016-08-01

    Many dietary compounds, including resveratrol, are potent inhibitors of CYP3A4. Here we examined the potential to predict the inhibition capacity of dietary polyphenolics using in silico and in vitro approaches and synthetic model compounds. Mono-, di-, and tri-acetoxy resveratrol were synthesized; a cell line of human intestinal origin and microsomes from rat liver were used to determine their in vitro inhibition of CYP3A4, which was compared to that of resveratrol. Docking simulations were used to predict the affinity of the synthetic model compounds for the enzyme. Modelling of the enzyme’s binding site revealed three types of interaction: hydrophobic, electrostatic, and H-bonding. The simulation revealed that each of the examined acetylations of resveratrol led to the loss of important interactions of all types. Tri-acetoxy resveratrol was the weakest inhibitor in vitro despite being the most lipophilic and having the highest affinity for the binding site. The simulation demonstrated exclusion of all interactions between tri-acetoxy resveratrol and the heme due to distal binding, highlighting the complexity of the CYP3A4 binding site, which may allow simultaneous accommodation of two molecules. Finally, computational modelling may serve as a quick predictive tool to identify potentially harmful interactions between dietary compounds and prescribed drugs.

  14. Comparison of validation methods for forming simulations

    NASA Astrophysics Data System (ADS)

    Schug, Alexander; Kapphan, Gabriel; Bardl, Georg; Hinterhölzl, Roland; Drechsler, Klaus

    2018-05-01

    The forming simulation of fibre-reinforced thermoplastics could reduce development time and improve forming results. To take advantage of the full potential of the simulations, however, it must be ensured that the predictions of material behaviour are correct. A thorough validation of the material model therefore has to be conducted after characterising the material. Relevant aspects for the validation of the simulation are, for example, the outer contour, the occurrence of defects, and the fibre paths. Various methods are available to measure these features. Most relevant, and also most difficult to measure, are the emerging fibre orientations, so the focus of this study was on measuring this feature. The aim was to give an overview of the properties of different measuring systems and to select the most promising systems for a comparison survey. Selected were an optical system, an eddy current system, and a computer-assisted tomography system, with the focus on measuring fibre orientations. Different formed 3D parts made of unidirectional glass-fibre and carbon-fibre reinforced thermoplastics were measured, revealing the advantages and disadvantages of the tested systems. Optical measurement systems are easy to use but are limited to the surface plies. With an eddy current system lower plies can also be measured, but it is only suitable for carbon fibres. Using a computer-assisted tomography system all plies can be measured, but the system is limited to small parts and challenging to evaluate.

  15. Perceptual control models of pursuit manual tracking demonstrate individual specificity and parameter consistency.

    PubMed

    Parker, Maximilian G; Tyson, Sarah F; Weightman, Andrew P; Abbott, Bruce; Emsley, Richard; Mansell, Warren

    2017-11-01

    Computational models that simulate individuals' movements in pursuit-tracking tasks have been used to elucidate mechanisms of human motor control. Whilst there is evidence that individuals demonstrate idiosyncratic control-tracking strategies, it remains unclear whether models can be sensitive to these idiosyncrasies. Perceptual control theory (PCT) provides a unique model architecture with an internally set reference value parameter, and can be optimized to fit an individual's tracking behavior. The current study investigated whether PCT models could show temporal stability and individual specificity over time. Twenty adults completed three blocks of fifteen 1-min pursuit-tracking trials. Two blocks (training and post-training) were completed in one session and the third was completed after 1 week (follow-up). The target moved in a one-dimensional, pseudorandom pattern. PCT models were optimized to the training data using a least-mean-squares algorithm, and validated with data from post-training and follow-up. We found significant inter-individual variability (partial η²: .464-.697) and intra-individual consistency (Cronbach's α: .880-.976) in parameter estimates. Polynomial regression revealed that all model parameters, including the reference value parameter, contribute to simulation accuracy. Participants' tracking performances were significantly more accurately simulated by models developed from their own tracking data than by models developed from other participants' data. We conclude that PCT models can be optimized to simulate the performance of an individual and that the test-retest reliability of individual models is a necessary criterion for evaluating computational models of human performance.
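    A toy version of a perceptual control loop (a simplified illustration, not the authors' optimized model) shows the core idea: the simulated participant acts to drive a perceived variable toward an internally set reference value. The gain, reference, and target signal below are all hypothetical:

```python
import math

def pct_track(target, gain=0.5, reference=0.0):
    """Minimal perceptual control loop for one-dimensional pursuit
    tracking: at each step the simulated participant perceives the
    cursor-target error and moves the cursor to push that perception
    toward the reference value (here 0, i.e. 'no error')."""
    cursor, trace = 0.0, []
    for x in target:
        perception = x - cursor                 # perceived tracking error
        cursor += gain * (perception - reference)
        trace.append(cursor)
    return trace

# Hypothetical smooth pseudorandom target (sum of two sinusoids)
target = [math.sin(0.1 * i) + 0.5 * math.sin(0.37 * i) for i in range(200)]
trace = pct_track(target)
err = sum(abs(t - c) for t, c in zip(target, trace)) / len(target)
print(round(err, 3))
```

    Fitting `gain` and `reference` to an individual's recorded trials by least squares is the kind of per-participant optimization the study performs with a richer model.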

  16. A case for spiking neural network simulation based on configurable multiple-FPGA systems.

    PubMed

    Yang, Shufan; Wu, Qiang; Li, Renfa

    2011-09-01

    Recent neuropsychological research has begun to reveal that neurons encode information in the timing of spikes. Spiking neural network simulations are a flexible and powerful method for investigating the behaviour of neuronal systems. Software simulation of spiking neural networks, however, cannot rapidly generate output spikes for large-scale networks. An alternative approach, hardware implementation of such systems, provides the possibility of generating independent spikes precisely and simultaneously outputting spike waves in real time, provided that the spiking neural network can take full advantage of the hardware's inherent parallelism. In this work we introduce a configurable FPGA-oriented hardware platform for spiking neural network simulation. We aim to use this platform to combine the speed of dedicated hardware with the programmability of software, so that it might allow neuroscientists to put together sophisticated computational experiments with their own models. A feed-forward hierarchical network is developed as a case study to describe the operation of biological neural systems (such as orientation selectivity in the visual cortex) and computational models of such systems. This model demonstrates how a feed-forward neural network constructs the circuitry required for orientation selectivity and provides a platform for reaching a deeper understanding of the primate visual system. In the future, larger-scale models based on this framework can be used to replicate the actual architecture of the visual cortex, leading to more detailed predictions and insights into visual perception phenomena.
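    The basic unit of such simulations, whether run in software or mapped onto an FPGA, is a spiking neuron model. A minimal leaky integrate-and-fire sketch (illustrative, not the platform's actual implementation; all parameters are hypothetical) looks like this:

```python
def lif_spike_times(i_input=1.5, v_thresh=1.0, tau=10.0, dt=0.1, t_end=100.0):
    """Leaky integrate-and-fire neuron: dv/dt = (-v + i_input) / tau,
    with a reset to 0 when v crosses threshold. Times are in ms and the
    membrane potential is in arbitrary units. Returns the spike times."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_end:
        v += dt * (-v + i_input) / tau     # forward-Euler leak integration
        if v >= v_thresh:
            spikes.append(round(t, 1))     # record the spike time
            v = 0.0                        # reset the membrane potential
        t += dt
    return spikes

spikes = lif_spike_times()
print(len(spikes), spikes[:3])
```

    An FPGA implementation evaluates thousands of such update equations in parallel every time step, which is where the real-time advantage over sequential software comes from.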

  17. Feed-forward and reciprocal inhibition for gain and phase timing control in a computational model of repetitive cough

    PubMed Central

    Morris, Kendall F.; Segers, Lauren S.; Poliacek, Ivan; Rose, Melanie J.; Lindsey, Bruce G.; Davenport, Paul W.; Howland, Dena R.; Bolser, Donald C.

    2016-01-01

    We investigated the hypothesis, motivated in part by a coordinated computational cough network model, that second-order neurons in the nucleus tractus solitarius (NTS) act as a filter and shape afferent input to the respiratory network during the production of cough. In vivo experiments were conducted on anesthetized spontaneously breathing cats. Cough was elicited by mechanical stimulation of the intrathoracic airways. Electromyograms of the parasternal (inspiratory) and rectus abdominis (expiratory) muscles and esophageal pressure were recorded. In vivo data revealed that expiratory motor drive during bouts of repetitive coughs is variable: peak expulsive amplitude increases from the first cough, peaks about the eighth or ninth cough, and then decreases through the remainder of the bout. Model simulations indicated that feed-forward inhibition of a single second-order neuron population is not sufficient to account for this dynamic feature of a repetitive cough bout. When a single second-order population was split into two subpopulations (inspiratory and expiratory), the resultant model produced simulated expiratory motor bursts that were comparable to in vivo data. However, expiratory phase durations during these simulations of repetitive coughing had less variance than those in vivo. Simulations in which reciprocal inhibitory processes between inspiratory-decrementing and expiratory-augmenting-late neurons were introduced exhibited increased variance in the expiratory phase durations. These results support the prediction that serial and parallel processing of airway afferent signals in the NTS play a role in generation of the motor pattern for cough. PMID:27283917

  19. A review on regional convection-permitting climate modeling: Demonstrations, prospects, and challenges

    DOE PAGES

    Prein, Andreas; Langhans, Wolfgang; Fosser, Giorgia; ...

    2015-05-27

    Regional climate modeling using convection-permitting models (CPMs) emerges as a promising framework to provide more reliable climate information on regional to local scales compared to traditionally used large-scale models (LSMs). CPMs do not use convection parameterization schemes, known as a major source of errors and uncertainties, and have more accurate surface and orography fields. The drawback of CPMs is their high demand on computational resources; for this reason, CPM climate simulations appeared only a decade ago. In this study we aim to provide a common basis for CPM climate simulations by giving a holistic review of the topic. The most important components in CPMs, such as physical parameterizations and dynamical formulations, are discussed, and an outlook on required future developments and computer architectures that would support the application of CPMs is given. Most importantly, this review presents the consolidated outcome of studies that addressed the added value of CPM climate simulations compared to LSMs. Most improvements are found for processes related to deep convection (e.g., precipitation during summer), for mountainous regions, and for the soil-vegetation-atmosphere interactions. The climate change signals of CPM simulations reveal increases in short and extreme rainfall events and an increased ratio of liquid precipitation at the surface (a decrease of hail), potentially leading to more frequent flash floods. In conclusion, CPMs are a very promising tool for future climate research. However, coordinated modeling programs are crucially needed to assess their full potential and support their development.

  20. Numerical Investigation of Plasma Detachment in Magnetic Nozzle Experiments

    NASA Technical Reports Server (NTRS)

    Sankaran, Kamesh; Polzin, Kurt A.

    2008-01-01

    At present there exists no generally accepted theoretical model that provides a consistent physical explanation of plasma detachment from an externally-imposed magnetic nozzle. To make progress towards that end, simulation of plasma flow in the magnetic nozzle of an arcjet experiment is performed using a multidimensional numerical simulation tool that includes theoretical models of the various dispersive and dissipative processes present in the plasma. This is an extension of the simulation tool employed in previous work by Sankaran et al. The aim is to compare the computational results with various proposed magnetic nozzle detachment theories to develop an understanding of the physical mechanisms that cause detachment. An applied magnetic field topology is obtained using a magnetostatic field solver (see Fig. I), and this field is superimposed on the time-dependent magnetic field induced in the plasma to provide a self-consistent field description. The applied magnetic field and model geometry match those found in experiments by Kuriki and Okada. This geometry is modeled because there is a substantial amount of experimental data that can be compared to the computational results, allowing for validation of the model. In addition, comparison of the simulation results with the experimentally obtained plasma parameters will provide insight into the mechanisms that lead to plasma detachment, revealing how they scale with different input parameters. Further studies will focus on modeling literature experiments both for the purpose of additional code validation and to extract physical insight regarding the mechanisms driving detachment.

  1. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.
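    One building block of the Eulerian grid approaches surveyed here is scalar diffusion. A deliberately simple explicit sketch (illustrative only, far from a production animation solver, which would also advect and project the velocity field) is:

```python
def diffuse(field, nu=0.1, steps=10):
    """Explicit diffusion of a scalar (e.g. smoke density) on a 2D grid
    with a 5-point Laplacian stencil. Stable for nu <= 0.25 with unit
    grid spacing and unit time step; boundary cells are held fixed."""
    h, w = len(field), len(field[0])
    for _ in range(steps):
        new = [row[:] for row in field]
        for i in range(1, h - 1):
            for j in range(1, w - 1):
                lap = (field[i - 1][j] + field[i + 1][j] +
                       field[i][j - 1] + field[i][j + 1] - 4 * field[i][j])
                new[i][j] = field[i][j] + nu * lap
        field = new
    return field

# A point of "smoke" in the middle of a small grid spreads outward
grid = [[0.0] * 9 for _ in range(9)]
grid[4][4] = 1.0
out = diffuse(grid)
print(round(out[4][4], 3), round(out[4][5], 3))
```

    Animation solvers favour unconditionally stable implicit or semi-Lagrangian variants of steps like this, since visual plausibility at large time steps matters more than strict accuracy.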

  2. Antimicrobial Peptide Simulations and the Influence of Force Field on the Free Energy for Pore Formation in Lipid Bilayers.

    PubMed

    Bennett, W F Drew; Hong, Chun Kit; Wang, Yi; Tieleman, D Peter

    2016-09-13

    Due to antimicrobial resistance, the development of new drugs to combat bacterial and fungal infections is an important area of research. Nature uses short, charged, and amphipathic peptides for antimicrobial defense, many of which disrupt the lipid membrane in addition to other possible targets inside the cell. Computer simulations have revealed atomistic details of the interactions of antimicrobial peptides and cell-penetrating peptides with lipid bilayers. Strong interactions between the polar interface and the charged peptides can induce bilayer deformations, including membrane rupture and peptide stabilization of a hydrophilic pore. Here, we performed microsecond-long simulations of the antimicrobial peptide CM15 in a POPC bilayer expecting to observe pore formation (based on previous molecular dynamics simulations). We show that caution is needed when interpreting results of equilibrium peptide-membrane simulations, because single trajectories can dwell in local energy minima for hundreds of nanoseconds to microseconds. While we did record significant membrane perturbations from the CM15 peptide, pores were not observed. We explain this discrepancy by computing the free energy for pore formation with different force fields. Our results show a large difference (ca. 40 kJ/mol) between the free energy barriers against pore formation predicted by the different force fields, which would result in orders-of-magnitude differences in the simulation time required to observe spontaneous pore formation. This explains why previous simulations using the Berger lipid parameters reported pores induced by charged peptides, while with CHARMM-based models pores were not observed in our long time-scale simulations. We reconcile some of the differences in the distance-dependent free energies by shifting the free energy profiles to account for thickness differences between force fields. The shifted curves show that all the models describe small defects in lipid bilayers in a consistent manner, suggesting a common physical basis.
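    The thickness correction described above amounts to shifting each profile's reaction coordinate before comparison. A minimal sketch of such an alignment (a hypothetical helper that shifts by half the thickness difference; the paper's exact procedure may differ):

```python
import numpy as np

def align_profiles(z, profiles, thicknesses, ref=0):
    """Shift distance-dependent free energy profiles G(z) so that bilayers of
    different thickness are compared at the same depth relative to the
    interface: each profile is offset by half its thickness difference to a
    reference force field, then re-interpolated onto the common z grid."""
    d_ref = thicknesses[ref]
    aligned = []
    for G, d in zip(profiles, thicknesses):
        shift = 0.5 * (d - d_ref)          # half-thickness offset per leaflet
        aligned.append(np.interp(z, z - shift, G))
    return aligned
```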

  3. Computer Simulation in Undergraduate Instruction: A Symposium.

    ERIC Educational Resources Information Center

    Street, Warren R.; And Others

    These symposium papers discuss the instructional use of computers in psychology, with emphasis on computer-produced simulations. The first, by Rich Edwards, briefly outlines LABSIM, a general purpose system of FORTRAN programs which simulate data collection in more than a dozen experimental models in psychology and are designed to train students…

  4. Overview of Computer Simulation Modeling Approaches and Methods

    Treesearch

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  5. New Pedagogies on Teaching Science with Computer Simulations

    ERIC Educational Resources Information Center

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

  6. Computer modeling and simulation of human movement. Applications in sport and rehabilitation.

    PubMed

    Neptune, R R

    2000-05-01

    Computer modeling and simulation of human movement plays an increasingly important role in sport and rehabilitation, with applications ranging from sport equipment design to understanding pathologic gait. The complex dynamic interactions within the musculoskeletal and neuromuscular systems make analyzing human movement with existing experimental techniques difficult but computer modeling and simulation allows for the identification of these complex interactions and causal relationships between input and output variables. This article provides an overview of computer modeling and simulation and presents an example application in the field of rehabilitation.

  7. Fiber Composite Sandwich Thermostructural Behavior: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Aiello, R. A.; Murthy, P. L. N.

    1986-01-01

    Several computational levels of progressive sophistication/simplification are described to computationally simulate composite sandwich hygral, thermal, and structural behavior. The computational levels of sophistication include: (1) three-dimensional detailed finite element modeling of the honeycomb, the adhesive and the composite faces; (2) three-dimensional finite element modeling of the honeycomb assumed to be an equivalent continuous, homogeneous medium, the adhesive and the composite faces; (3) laminate theory simulation where the honeycomb (metal or composite) is assumed to consist of plies with equivalent properties; and (4) derivations of approximate, simplified equations for thermal and mechanical properties by simulating the honeycomb as an equivalent homogeneous medium. The approximate equations are combined with composite hygrothermomechanical and laminate theories to provide a simple and effective computational procedure for simulating the thermomechanical/thermostructural behavior of fiber composite sandwich structures.

  8. Generalized dynamic engine simulation techniques for the digital computer

    NASA Technical Reports Server (NTRS)

    Sellers, J.; Teren, F.

    1974-01-01

    Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design-point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar all-digital programs on future engine simulation philosophy is also discussed.

  10. Generalized dynamic engine simulation techniques for the digital computers

    NASA Technical Reports Server (NTRS)

    Sellers, J.; Teren, F.

    1975-01-01

    Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar digital programs on future engine simulation philosophy is also discussed.

  11. Fast Learning for Immersive Engagement in Energy Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian W; Bugbee, Bruce; Gruchalla, Kenny M

    Fast computation is critical for immersive engagement with and learning from energy simulations, and would be furthered by a general method for creating rapidly computed, simplified versions of NREL's computation-intensive energy simulations. Created using machine learning techniques, these 'reduced form' simulations can provide statistically sound estimates of the results of the full simulations at a fraction of the computational cost, with response times (typically less than one minute of wall-clock time) suitable for real-time human-in-the-loop design and analysis. Additionally, uncertainty quantification techniques can document the accuracy of the approximate models and their domain of validity. Approximation methods are applicable to a wide range of computational models, including supply-chain models, electric power grid simulations, and building models. These reduced-form representations do not replace or re-implement existing simulations, but supplement them by enabling rapid scenario design and quality assurance for large sets of simulations. We present an overview of the framework and methods we have implemented for developing these reduced-form representations.
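    The reduced-form idea can be illustrated with a toy surrogate: sample an expensive simulator at a few design points, fit a cheap regression, and spot-check it against held-out runs. The "simulator" function and the polynomial response surface below are illustrative assumptions, not NREL's actual models or methods:

```python
import numpy as np

def expensive_simulation(x):
    """Stand-in for a computation-intensive energy model (toy assumption)."""
    return np.sin(2.0 * x) * np.exp(-0.3 * x)

# Sample the full model at a small number of design points.
x_train = np.linspace(0.0, 3.0, 25)
y_train = expensive_simulation(x_train)

# "Reduced form": a cheap polynomial response surface fit to the samples.
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=7))

# Quantify surrogate accuracy on held-out points; the full framework's
# uncertainty quantification is far more thorough than this spot check.
x_test = np.linspace(0.05, 2.95, 100)
err = np.max(np.abs(surrogate(x_test) - expensive_simulation(x_test)))
```

    Once fitted, evaluating the surrogate costs a handful of arithmetic operations per query, which is what makes real-time human-in-the-loop exploration feasible.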

  12. A Primer on Simulation and Gaming.

    ERIC Educational Resources Information Center

    Barton, Richard F.

    In a primer intended for the administrative professions, for the behavioral sciences, and for education, simulation and its various aspects are defined, illustrated, and explained. Man-model simulation, man-computer simulation, all-computer simulation, and analysis are discussed as techniques for studying object systems (parts of the "real…

  13. Experimental Investigation of Spatially-Periodic Scalar Patterns in an Inline Mixer

    NASA Astrophysics Data System (ADS)

    Baskan, Ozge; Speetjens, Michel F. M.; Clercx, Herman J. H.

    2015-11-01

    Spatially persisting patterns with exponentially decaying intensities form during the downstream evolution of passive scalars in three-dimensional (3D) spatially periodic flows due to the coupled effect of the chaotic nature of the flow and the diffusivity of the material. This has been investigated in many computational and theoretical studies on 3D spatially-periodic flow fields. However, in the limit of zero diffusivity, the evolution of the scalar fields results in more detailed structures that can only be captured by experiments due to limitations in the computational tools. Our study employs state-of-the-art experimental methods to analyze the evolution of a 3D advective scalar field in a representative inline mixer, the Quatro static mixer. The experimental setup consists of an optically accessible test section with transparent internal elements, accommodating a pressure-driven pipe flow and equipped with 3D Laser-Induced Fluorescence. The results reveal that the continuous process of stretching and folding of material creates finer structures as the flow progresses, an indicator of chaotic advection; the experiments outperform the simulations by revealing a far greater level of detail.

  14. Sum Frequency Generation Spectroscopy and Molecular Dynamics Simulations Reveal a Rotationally Fluid Adsorption State of α-Pinene on Silica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, Junming; Psciuk, Brian T.; Chase, Hilary M.

    2016-06-16

    A rotationally fluid state of α-pinene at fused silica/vapor interfaces is revealed by computational and experimental vibrational sum frequency generation (SFG) studies. We report the first assignment of the vibrational modes in the notoriously congested C-H stretching region of α-pinene and identify its bridge methylene group on the four-membered ring ("βCH2") as the origin of its dominant spectral feature. We find that the spectra are perfused with Fermi resonances that need to be accounted for explicitly in the computation of vibrational spectra of strained hydrocarbons such as α-pinene. The preferred orientations of α-pinene are consistent with optimization of van der Waals contacts with the silica surface, which results in a bimodal distribution of highly fluxional orientations in which the βCH2 group points "towards" or "away from" the surface. The reported findings are particularly relevant to the exposure of α-pinene to primary oxidants in heterogeneous catalytic pathways that exploit α-pinene as a sustainable feedstock for fine chemicals and polymers.

  15. Aerodynamics of Stardust Sample Return Capsule

    NASA Technical Reports Server (NTRS)

    Mitcheltree, R. A.; Wilmoth, R. G.; Cheatwood, F. M.; Brauckmann, G. J.; Greene, F. A.

    1997-01-01

    Successful return of interstellar dust and cometary material by the Stardust Sample Return Capsule requires an accurate description of the Earth entry vehicle's aerodynamics. This description must span the hypersonic-rarefied, hypersonic-continuum, supersonic, transonic, and subsonic flow regimes. Data from numerous sources are compiled to accomplish this objective. These include Direct Simulation Monte Carlo analyses, thermochemical nonequilibrium computational fluid dynamics, transonic computational fluid dynamics, existing wind tunnel data, and new wind tunnel data. Four observations are highlighted: 1) a static instability is revealed in the free-molecular and early transitional-flow regime due to the aft location of the vehicle's center of gravity, 2) the aerodynamics across the hypersonic regime are compared with the Newtonian flow approximation and a correlation between the accuracy of the Newtonian flow assumption and the sonic line position is noted, 3) the primary effect of shape change due to ablation is shown to be a reduction in drag, and 4) a subsonic dynamic instability is revealed which will necessitate either a change in the vehicle's center-of-gravity location or the use of a stabilizing drogue parachute.

  16. Nasal conchae function as aerodynamic baffles: Experimental computational fluid dynamic analysis in a turkey nose (Aves: Galliformes).

    PubMed

    Bourke, Jason M; Witmer, Lawrence M

    2016-12-01

    We tested the aerodynamic function of nasal conchae in birds using CT data from an adult male wild turkey (Meleagris gallopavo) to construct 3D models of its nasal passage. A series of digital "turbinectomies" were performed on these models, and computational fluid dynamic analyses were performed to simulate resting inspiration. Models with turbinates removed were compared to the original, unmodified control airway. Results revealed that the four conchae found in turkeys, along with the crista nasalis, alter the flow of inspired air in ways that can be considered baffle-like. However, these baffle-like functions were remarkably limited in their areal extent, indicating that avian conchae are more functionally independent than originally hypothesized. Our analysis revealed that the conchae of birds are efficient baffles that, along with potential heat and moisture transfer, serve to efficiently move air to specific regions of the nasal passage. This alternate function of conchae has implications for their evolution in birds and other amniotes. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    NASA Astrophysics Data System (ADS)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.
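    The master/worker pattern described above, in which independent batches of Monte Carlo histories run on separate nodes and the master aggregates the results, can be sketched with Python's `multiprocessing` standing in for the cloud cluster and MPI, and a trivial π estimate standing in for EGS5 transport (both are assumptions for illustration only):

```python
import random
from multiprocessing import Pool

def worker(args):
    """Run one independent batch of Monte Carlo histories with its own seed,
    as each worker node would. Here each 'history' is a dart thrown at the
    unit square; hits inside the unit circle estimate pi/4."""
    seed, n = args
    rng = random.Random(seed)
    return sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 < 1.0)

if __name__ == "__main__":
    n_nodes, histories = 4, 50_000
    jobs = [(seed, histories // n_nodes) for seed in range(n_nodes)]
    with Pool(n_nodes) as pool:
        partial = pool.map(worker, jobs)      # distribute, like MPI ranks
    pi_est = 4.0 * sum(partial) / histories   # master aggregates results
    print(f"pi ~ {pi_est:.3f}")
```

    Because the batches are statistically independent, the aggregate is identical in distribution to a single long run, which mirrors the paper's observation that cloud output matches the single-threaded implementation while wall-clock time scales inversely with node count.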

  18. A scalable parallel black oil simulator on distributed memory parallel computers

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Liu, Hui; Chen, Zhangxin

    2015-11-01

    This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.

  19. Effect of computer game playing on baseline laparoscopic simulator skills.

    PubMed

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and a relationship between computer game playing and baseline performance on laparoscopic simulators has not been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. The study was conducted at a local high school in Norway; forty-eight students from 2 high school classes volunteered to participate. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and for prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.

  20. A Computer Simulation of Bacterial Growth During Food-Processing

    DTIC Science & Technology

    1974-11-01

    Technical report by Edward W. Ross, Jr., Army Natick Laboratories, Natick, Massachusetts, November 1974; approved for public release.

  1. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  2. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  3. Computational Design of High-χ Block Oligomers for Accessing 1 nm Domains.

    PubMed

    Chen, Qile P; Barreda, Leonel; Oquendo, Luis E; Hillmyer, Marc A; Lodge, Timothy P; Siepmann, J Ilja

    2018-05-22

    Molecular dynamics simulations are used to design a series of high-χ block oligomers (HCBOs) that can self-assemble into a variety of mesophases with domain sizes as small as 1 nm. The exploration of these oligomers with various chain lengths, volume fractions, and chain architectures at multiple temperatures reveals the presence of ordered lamellae, perforated lamellae, and hexagonally packed cylinders. The achieved periods are as small as 3.0 and 2.1 nm for lamellae and cylinders, respectively, which correspond to polar domains of approximately 1 nm. Interestingly, the detailed phase behavior of these oligomers is distinct from that of either solvent-free surfactants or block polymers. The simulations reveal that the behavior of these HCBOs is a product of an interplay between both "surfactant factors" (headgroup interactions, chain flexibility, and interfacial curvature) and "block polymer factors" (χ, chain length N, and volume fraction f). This insight promotes the understanding of molecular features pivotal for mesophase formation at the sub-5 nm length scale, which facilitates the design of HCBOs tailored toward particular desired morphologies.

  4. Statistical physics approaches to Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Peng, Shouyong

    Alzheimer's disease (AD) is the most common cause of late life dementia. In the brain of an AD patient, neurons are lost and spatial neuronal organizations (microcolumns) are disrupted. An adequate quantitative analysis of microcolumns requires that we automate the neuron recognition stage in the analysis of microscopic images of human brain tissue. We propose a recognition method based on statistical physics. Specifically, Monte Carlo simulations of an inhomogeneous Potts model are applied for image segmentation. Unlike most traditional methods, this method improves the recognition of overlapped neurons, and thus improves the overall recognition percentage. Although the exact causes of AD are unknown, as experimental advances have revealed the molecular origin of AD, they have continued to support the amyloid cascade hypothesis, which states that early stages of aggregation of amyloid beta (Abeta) peptides lead to neurodegeneration and death. X-ray diffraction studies reveal the common cross-beta structural features of the final stable aggregates-amyloid fibrils. Solid-state NMR studies also reveal structural features for some well-ordered fibrils. But currently there is no feasible experimental technique that can reveal the exact structure or the precise dynamics of assembly and thus help us understand the aggregation mechanism. Computer simulation offers a way to understand the aggregation mechanism on the molecular level. Because traditional all-atom continuous molecular dynamics simulations are not fast enough to investigate the whole aggregation process, we apply coarse-grained models and discrete molecular dynamics methods to increase the simulation speed. First we use a coarse-grained two-bead (two beads per amino acid) model. Simulations show that peptides can aggregate into multilayer beta-sheet structures, which agree with X-ray diffraction experiments. 
To better represent the secondary structure transition happening during aggregation, we refine the model to four beads per amino acid. Typical essential interactions, such as backbone hydrogen bond, hydrophobic and electrostatic interactions, are incorporated into our model. We study the aggregation of Abeta16-22, a peptide that can aggregate into a well-ordered fibrillar structure in experiments. Our results show that randomly-oriented monomers can aggregate into fibrillar subunits, which agree not only with X-ray diffraction experiments but also with solid-state NMR studies. Our findings demonstrate that coarse-grained models and discrete molecular dynamics simulations can help researchers understand the aggregation mechanism of amyloid peptides.
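    The Potts-model image segmentation used in the neuron-recognition stage above can be illustrated with a small sketch. For simplicity this uses fixed class mean intensities and a zero-temperature (greedy) update, which never increases the energy, rather than the finite-temperature Metropolis sampling of the thesis:

```python
import numpy as np

def total_energy(labels, image, mu, J=1.0, h=2.0):
    """Inhomogeneous Potts energy: a data term tying each pixel's label to a
    class mean intensity, plus a -J reward for each like-labeled neighbor pair."""
    e = h * np.sum((image - np.asarray(mu)[labels]) ** 2)
    e -= J * np.sum(labels[1:, :] == labels[:-1, :])
    e -= J * np.sum(labels[:, 1:] == labels[:, :-1])
    return e

def local_energy(labels, image, mu, i, j, s, J=1.0, h=2.0):
    """Energy terms of the total that involve site (i, j) if its label were s."""
    n, m = labels.shape
    e = h * (image[i, j] - mu[s]) ** 2
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        a, b = i + di, j + dj
        if 0 <= a < n and 0 <= b < m and labels[a, b] == s:
            e -= J
    return e

def greedy_sweep(labels, image, mu, J=1.0, h=2.0):
    """One zero-temperature sweep: each site adopts the label minimizing its
    local energy, so the total energy is non-increasing."""
    n, m = labels.shape
    for i in range(n):
        for j in range(m):
            labels[i, j] = min(range(len(mu)),
                               key=lambda s: local_energy(labels, image, mu,
                                                          i, j, s, J, h))
    return labels
```

    The neighbor coupling is what lets the model separate overlapping neurons better than purely per-pixel thresholding: an ambiguous pixel is pulled toward the label of its surroundings.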

  5. Filters for Improvement of Multiscale Data from Atomistic Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, David J.; Reynolds, Daniel R.

    Multiscale computational models strive to produce accurate and efficient numerical simulations of systems involving interactions across multiple spatial and temporal scales that typically differ by several orders of magnitude. Some such models utilize a hybrid continuum-atomistic approach combining continuum approximations with first-principles-based atomistic models to capture multiscale behavior. By following the heterogeneous multiscale method framework for developing multiscale computational models, unknown continuum scale data can be computed from an atomistic model. Concurrently coupling the two models requires performing numerous atomistic simulations, which can dominate the computational cost of the method. Furthermore, when the resulting continuum data is noisy due to sampling error, stochasticity in the model, or randomness in the initial conditions, filtering can result in significant accuracy gains in the computed multiscale data without increasing the size or duration of the atomistic simulations. In this work, we demonstrate the effectiveness of spectral filtering for increasing the accuracy of noisy multiscale data obtained from atomistic simulations. Moreover, we present a robust and automatic method for closely approximating the optimum level of filtering in the case of additive white noise. Improving the accuracy of the filtered simulation data yields dramatic computational savings by allowing shorter and smaller atomistic simulations to achieve the same desired multiscale simulation precision.
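    The spectral-filtering idea is easy to demonstrate on a toy signal: data that is smooth at the continuum scale lives in a few low Fourier modes, while additive white noise spreads evenly over all modes, so discarding high modes removes most of the noise and little of the signal. The cutoff below is hand-picked; the paper's contribution is an automatic, near-optimal choice of that filtering level:

```python
import numpy as np

def spectral_filter(signal, keep):
    """Low-pass spectral filter: zero every real-FFT mode at index >= keep."""
    modes = np.fft.rfft(signal)
    modes[keep:] = 0.0
    return np.fft.irfft(modes, n=len(signal))

rng = np.random.default_rng(42)
t = np.linspace(0.0, 1.0, 256, endpoint=False)
truth = np.sin(2 * np.pi * 3 * t) + 0.4 * np.cos(2 * np.pi * 7 * t)
noisy = truth + 0.3 * rng.standard_normal(t.size)   # additive white noise
filtered = spectral_filter(noisy, keep=10)

def rms(x):
    return float(np.sqrt(np.mean(x ** 2)))
```

    Here only 10 of 129 real-FFT modes are kept, so roughly 93% of the noise power is discarded while the two signal harmonics (modes 3 and 7) pass through untouched.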

  6. Filters for Improvement of Multiscale Data from Atomistic Simulations

    DOE PAGES

    Gardner, David J.; Reynolds, Daniel R.

    2017-01-05

    Multiscale computational models strive to produce accurate and efficient numerical simulations of systems involving interactions across multiple spatial and temporal scales that typically differ by several orders of magnitude. Some such models utilize a hybrid continuum-atomistic approach combining continuum approximations with first-principles-based atomistic models to capture multiscale behavior. By following the heterogeneous multiscale method framework for developing multiscale computational models, unknown continuum scale data can be computed from an atomistic model. Concurrently coupling the two models requires performing numerous atomistic simulations, which can dominate the computational cost of the method. Furthermore, when the resulting continuum data is noisy due to sampling error, stochasticity in the model, or randomness in the initial conditions, filtering can result in significant accuracy gains in the computed multiscale data without increasing the size or duration of the atomistic simulations. In this work, we demonstrate the effectiveness of spectral filtering for increasing the accuracy of noisy multiscale data obtained from atomistic simulations. Moreover, we present a robust and automatic method for closely approximating the optimum level of filtering in the case of additive white noise. Improving the accuracy of the filtered simulation data leads to dramatic computational savings by allowing shorter and smaller atomistic simulations to achieve the same desired multiscale simulation precision.
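The spectral-filtering idea in the two records above can be illustrated with a toy low-pass filter. The sketch below is not the authors' automatic optimum-filter method; it simply adds white noise to a smooth signal, discards all but the lowest Fourier modes, and checks that the error against the clean signal drops (the signal shape, mode count, and noise level are illustrative assumptions):

```python
import numpy as np

def spectral_filter(signal, keep_modes):
    """Low-pass spectral filter: keep only the lowest-frequency Fourier modes."""
    coeffs = np.fft.rfft(signal)
    coeffs[keep_modes:] = 0.0              # discard high-frequency content
    return np.fft.irfft(coeffs, n=len(signal))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 256, endpoint=False)
clean = np.sin(2 * np.pi * 3 * t)                   # smooth "continuum" signal
noisy = clean + 0.3 * rng.standard_normal(t.size)   # additive white noise

filtered = spectral_filter(noisy, keep_modes=8)

err_noisy = np.sqrt(np.mean((noisy - clean) ** 2))
err_filtered = np.sqrt(np.mean((filtered - clean) ** 2))
print(err_filtered < err_noisy)   # filtering brings the data closer to clean
```

Here the cutoff `keep_modes` plays the role of the "level of filtering" whose optimum the record says can be approximated automatically in the additive-white-noise case.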

  7. Signature modelling and radiometric rendering equations in infrared scene simulation systems

    NASA Astrophysics Data System (ADS)

    Willers, Cornelius J.; Willers, Maria S.; Lapierre, Fabian

    2011-11-01

    The development and optimisation of modern infrared systems necessitates the use of simulation systems to create radiometrically realistic representations (e.g. images) of infrared scenes. Such simulation systems are used in signature prediction, the development of surveillance and missile sensors, signal/image processing algorithm development and aircraft self-protection countermeasure system development and evaluation. Even the most cursory investigation reveals a multitude of factors affecting the infrared signatures of real-world objects. Factors such as spectral emissivity, spatial/volumetric radiance distribution, specular reflection, reflected direct sunlight, reflected ambient light, atmospheric degradation and more, all affect the presentation of an object's instantaneous signature. The signature is furthermore dynamically varying as a result of internal and external influences on the object, resulting from the heat balance comprising insolation, internal heat sources, aerodynamic heating (airborne objects), conduction, convection and radiation. In order to accurately render the object's signature in a computer simulation, the rendering equations must therefore account for all the elements of the signature. In this overview paper, the signature models, rendering equations and application frameworks of three infrared simulation systems are reviewed and compared. The paper first considers the problem of infrared scene simulation in a framework for simulation validation. This approach provides concise definitions and a convenient context for considering signature models and subsequent computer implementation. The primary radiometric requirements for an infrared scene simulator are presented next. The signature models and rendering equations implemented in OSMOSIS (Belgian Royal Military Academy), DIRSIG (Rochester Institute of Technology) and OSSIM (CSIR & Denel Dynamics) are reviewed.
In spite of these three simulation systems' different application focus areas, their underlying physics-based approach is similar. The commonalities and differences between the different systems are investigated, in the context of their somewhat different application areas. The application of an infrared scene simulation system towards the development of imaging missiles and missile countermeasures is briefly described. Flowing from the review of the available models and equations, recommendations are made to further enhance and improve the signature models and rendering equations in infrared scene simulators.

  8. An Investigation of Computer-based Simulations for School Crises Management.

    ERIC Educational Resources Information Center

    Degnan, Edward; Bozeman, William

    2001-01-01

    Describes development of a computer-based simulation program for training school personnel in crisis management. Addresses the data collection and analysis involved in developing a simulated event, the systems requirements for simulation, and a case study of application and use of the completed simulation. (Contains 21 references.) (Authors/PKP)

  9. Problem-Solving in the Pre-Clinical Curriculum: The Uses of Computer Simulations.

    ERIC Educational Resources Information Center

    Michael, Joel A.; Rovick, Allen A.

    1986-01-01

    Promotes the use of computer-based simulations in the pre-clinical medical curriculum as a means of providing students with opportunities for problem solving. Describes simple simulations of skeletal muscle loads, complex simulations of major organ systems and comprehensive simulation models of the entire human body. (TW)

  10. Computer considerations for real time simulation of a generalized rotor model

    NASA Technical Reports Server (NTRS)

    Howe, R. M.; Fogarty, L. E.

    1977-01-01

    Scaled equations were developed to meet requirements for real-time computer simulation of the rotor systems research aircraft. These equations form the basis for consideration of both digital and hybrid mechanization for real-time simulation. For all-digital simulation, estimates of the required speed in terms of equivalent operations per second are developed based on the complexity of the equations and the required integration frame rates. For both conventional hybrid simulation and hybrid simulation using time-shared analog elements, the amount of required equipment is estimated, along with a consideration of the dynamic errors. Conventional hybrid mechanization using analog simulation of those rotor equations which involve rotor-spin frequencies (these constitute the bulk of the equations) requires too much analog equipment. Hybrid simulation using time-sharing techniques for the analog elements appears possible with a reasonable amount of analog equipment. All-digital simulation with affordable general-purpose computers is not possible because of speed limitations, but specially configured digital computers do have the required speed and constitute the recommended approach.

  11. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Kerner, H.; Weatherbee, J. E.; Taylor, D. S.; Hodges, B.

    1973-01-01

    A deterministic simulator is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Its use as a tool to study and determine the minimum computer system configuration necessary to satisfy the on-board computational requirements of a typical mission is presented. The paper describes how the computer system configuration is determined in order to satisfy the data processing demand of the various shuttle booster subsystems. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources.

  12. Black Hole Mergers, Gravitational Waves, and Multi-Messenger Astronomy

    NASA Technical Reports Server (NTRS)

    Centrella, Joan M.

    2010-01-01

    The final merger of two black holes is expected to be the strongest source of gravitational waves for ground-based detectors such as LIGO and VIRGO, as well as for the space-based LISA. Since the merger takes place in the regime of strong dynamical gravity, computing the resulting gravitational waveforms requires solving the full Einstein equations of general relativity on a computer. Although numerical codes designed to simulate black hole mergers were plagued for many years by a host of instabilities, recent breakthroughs have conquered these problems and opened up this field dramatically. This talk will focus on the resulting gold rush of new results that is revealing the dynamics and waveforms of binary black hole mergers, and their applications in gravitational wave detection, astrophysics, and testing general relativity.

  13. Wormlike Chain Theory and Bending of Short DNA

    NASA Astrophysics Data System (ADS)

    Mazur, Alexey K.

    2007-05-01

    The probability distributions for bending angles in double helical DNA obtained in all-atom molecular dynamics simulations are compared with theoretical predictions. The computed distributions remarkably agree with the wormlike chain theory and qualitatively differ from predictions of the subelastic chain model. The computed data exhibit only small anomalies in the apparent flexibility of short DNA and cannot account for the recently reported AFM data. It is possible that the current atomistic DNA models miss some essential mechanisms of DNA bending on intermediate length scales. Analysis of bent DNA structures reveals, however, that the bending motion is structurally heterogeneous and directionally anisotropic on the length scales where the experimental anomalies were detected. These effects are essential for interpretation of the experimental data and they also can be responsible for the apparent discrepancy.
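For a short segment of contour length L and persistence length l_p, the wormlike-chain theory referenced above gives, in the harmonic (small-angle) approximation, a 3D bending-angle weight P(θ) ∝ sin θ · exp(−l_p θ² / 2L). A minimal numerical check of this form (illustrative DNA-like values, not the paper's simulation data):

```python
import numpy as np

lp = 50.0   # persistence length (nm), a typical DNA value
L = 5.0     # contour length of a short subsegment (nm)

theta = np.linspace(1e-6, np.pi, 20000)
# Harmonic wormlike-chain bending weight in 3D
w = np.sin(theta) * np.exp(-lp * theta**2 / (2.0 * L))
p = w / np.trapz(w, theta)                # normalized distribution P(theta)

mean_sq = np.trapz(theta**2 * p, theta)   # <theta^2>
print(mean_sq, 2.0 * L / lp)              # close to the small-angle result 2L/lp
```

The few-percent gap between the two printed numbers comes from the sin θ factor; it shrinks as L/l_p decreases, i.e. for stiffer or shorter segments.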

  14. Comparison between measured and computed magnetic flux density distribution of simulated transformer core joints assembled from grain-oriented and non-oriented electrical steel

    NASA Astrophysics Data System (ADS)

    Shahrouzi, Hamid; Moses, Anthony J.; Anderson, Philip I.; Li, Guobao; Hu, Zhuochao

    2018-04-01

    The flux distribution in an overlapped linear joint constructed in the central region of an Epstein Square was studied experimentally and results compared with those obtained using a computational magnetic field solver. High permeability grain-oriented (GO) and low permeability non-oriented (NO) electrical steels were compared at a nominal core flux density of 1.60 T at 50 Hz. It was found that experimental and computed results agreed well only at flux densities at which the reluctances of the different flux paths are similar. It was also revealed that the flux becomes more uniform when the working point of the electrical steel is close to the knee of its B-H curve.

  15. Frequency-selective near-field radiative heat transfer between photonic crystal slabs: a computational approach for arbitrary geometries and materials.

    PubMed

    Rodriguez, Alejandro W; Ilic, Ognjen; Bermel, Peter; Celanovic, Ivan; Joannopoulos, John D; Soljačić, Marin; Johnson, Steven G

    2011-09-09

    We demonstrate the possibility of achieving enhanced frequency-selective near-field radiative heat transfer between patterned (photonic-crystal) slabs at designable frequencies and separations, exploiting a general numerical approach for computing heat transfer in arbitrary geometries and materials based on the finite-difference time-domain method. Our simulations reveal a tradeoff between selectivity and near-field enhancement as the slab-slab separation decreases, with the patterned heat transfer eventually reducing to the unpatterned result multiplied by a fill factor (described by a standard proximity approximation). We also find that heat transfer can be further enhanced at selective frequencies when the slabs are brought into a glide-symmetric configuration, a consequence of the degeneracies associated with the nonsymmorphic symmetry group.

  16. Simulation Applications in Educational Leadership.

    ERIC Educational Resources Information Center

    Bozeman, William; Wright, Robert H.

    1995-01-01

    Explores the use of computer-based simulations using multimedia materials for a graduate course in school administration. Highlights include simulation applications in military and in business; educational simulations; the use of computers and other technology; production requirements and costs; and time required. (LRW)

  17. On the "Exchangeability" of Hands-On and Computer-Simulated Science Performance Assessments. CSE Technical Report.

    ERIC Educational Resources Information Center

    Rosenquist, Anders; Shavelson, Richard J.; Ruiz-Primo, Maria Araceli

    Inconsistencies in scores from computer-simulated and "hands-on" science performance assessments have led to questions about the exchangeability of these two methods in spite of the highly touted potential of computer-simulated performance assessment. This investigation considered possible explanations for students' inconsistent performances: (1)…

  18. Using PC Software To Enhance the Student's Ability To Learn the Exporting Process.

    ERIC Educational Resources Information Center

    Buckles, Tom A.; Lange, Irene

    This paper describes the advantages of using computer simulations in the classroom or managerial environment and the major premise and principal components of Export to Win!, a computer simulation used in international marketing seminars. A rationale for using computer simulations argues that they improve the quality of teaching by building…

  19. Unpacking Students' Conceptualizations through Haptic Feedback

    ERIC Educational Resources Information Center

    Magana, A. J.; Balachandran, S.

    2017-01-01

    While it is clear that the use of computer simulations has a beneficial effect on learning when compared to instruction without computer simulations, there is still room for improvement to fully realize their benefits for learning. Haptic technologies can fulfill the educational potential of computer simulations by adding the sense of touch.…

  20. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    NASA Astrophysics Data System (ADS)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.

  1. QCE: A Simulator for Quantum Computer Hardware

    NASA Astrophysics Data System (ADS)

    Michielsen, Kristel; de Raedt, Hans

    2003-09-01

    The Quantum Computer Emulator (QCE) described in this paper consists of a simulator of a generic, general purpose quantum computer and a graphical user interface. The latter is used to control the simulator, to define the hardware of the quantum computer and to debug and execute quantum algorithms. QCE runs in a Windows 98/NT/2000/ME/XP environment. It can be used to validate designs of physically realizable quantum processors and as an interactive educational tool to learn about quantum computers and quantum algorithms. A detailed exposition is given of the implementation of the CNOT and the Toffoli gate, the quantum Fourier transform, Grover's database search algorithm, an order finding algorithm, Shor's algorithm, a three-input adder and a number partitioning algorithm. We also review the results of simulations of an NMR-like quantum computer.
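State-vector emulation of the gates listed above is straightforward; a minimal NumPy sketch (plain linear algebra, not the QCE implementation) that applies a Hadamard and then a CNOT to produce a Bell state:

```python
import numpy as np

# Computational-basis ordering |q1 q0>: |00>, |01>, |10>, |11>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)   # control = left qubit

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I2 = np.eye(2, dtype=complex)

state = np.zeros(4, dtype=complex)
state[0] = 1.0                   # start in |00>
state = np.kron(H, I2) @ state   # Hadamard on the control (left) qubit
state = CNOT @ state             # entangles the pair into a Bell state

probs = np.abs(state) ** 2
print(np.round(probs, 3))        # 0.5 each for |00> and |11>, 0 elsewhere
```

A full emulator like QCE composes many such unitaries (and models hardware imperfections), but every circuit it runs reduces to matrix-vector products of this kind.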

  2. Realistic Modeling of Interaction of Quiet-Sun Magnetic Fields with the Chromosphere

    NASA Technical Reports Server (NTRS)

    Kitiashvili, I. N.; Kosovichev, A. G.; Mansour, N. N.; Wray, A. A.

    2017-01-01

    High-resolution observations and 3D MHD simulations reveal intense interaction between the convection zone dynamics and the solar atmosphere on subarcsecond scales. To investigate processes of the dynamical coupling and energy exchange between the subsurface layers and the chromosphere, we perform 3D radiative MHD modeling for a computational domain that includes the upper convection zone and the chromosphere, and investigate the structure and dynamics for different intensities of the photospheric magnetic flux. For comparison with observations, the simulation models have been used to calculate synthetic Stokes profiles of various spectral lines. The results show intense energy exchange through small-scale magnetized vortex tubes rooted below the photosphere, which provide extra heating of the chromosphere and initiate shock waves and small-scale eruptions.

  3. [The research on bidirectional reflectance computer simulation of forest canopy at pixel scale].

    PubMed

    Song, Jin-Ling; Wang, Jin-Di; Shuai, Yan-Min; Xiao, Zhi-Qiang

    2009-08-01

    Computer simulation is based on computer graphics to generate a realistic 3D structural scene of vegetation and to simulate the canopy radiation regime using the radiosity method. In the present paper, the authors extend the computer simulation model to simulate forest canopy bidirectional reflectance at pixel scale. Usually, however, trees are complex structures, tall and with many branches, so hundreds of thousands or even millions of facets are needed to build a realistic structural scene for a forest. It is difficult for the radiosity method to compute so many facets. In order to enable the radiosity method to simulate the forest scene at pixel scale, the authors propose simplifying the structure of the forest crowns by abstracting the crowns as ellipsoids. Based on the optical characteristics of the tree components and on the internal energy transport of photons in a real crown, the authors assigned optical properties to the ellipsoid surface facets. In the computer simulation of the forest, following the idea of geometrical-optics models, a gap model is used to obtain the forest canopy bidirectional reflectance at pixel scale. The computer simulation results agree with both the GOMS model simulations and Multi-angle Imaging SpectroRadiometer (MISR) multi-angle remote sensing data (MISR BRF), although some problems remain to be solved. The authors conclude that the study has important value for the application of multi-angle remote sensing and the inversion of vegetation canopy structure parameters.

  4. Learning from avatars: Learning assistants practice physics pedagogy in a classroom simulator

    NASA Astrophysics Data System (ADS)

    Chini, Jacquelyn J.; Straub, Carrie L.; Thomas, Kevin H.

    2016-06-01

    [This paper is part of the Focused Collection on Preparing and Supporting University Physics Educators.] Undergraduate students are increasingly being used to support course transformations that incorporate research-based instructional strategies. While such students are typically selected based on strong content knowledge and possible interest in teaching, they often do not have previous pedagogical training. The current training models make use of real students or classmates role playing as students as the test subjects. We present a new environment for facilitating the practice of physics pedagogy skills, a highly immersive mixed-reality classroom simulator, and assess its effectiveness for undergraduate physics learning assistants (LAs). LAs prepared, taught, and reflected on a lesson about motion graphs for five highly interactive computer generated student avatars in the mixed-reality classroom simulator. To assess the effectiveness of the simulator for this population, we analyzed the pedagogical skills LAs intended to practice and exhibited during their lessons and explored LAs' descriptions of their experiences with the simulator. Our results indicate that the classroom simulator created a safe, effective environment for LAs to practice a variety of skills, such as questioning styles and wait time. Additionally, our analysis revealed areas for improvement in our preparation of LAs and use of the simulator. We conclude with a summary of research questions this environment could facilitate.

  5. Organometallic chemical vapor deposition of silicon nitride films enhanced by atomic nitrogen generated from surface-wave plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okada, H.; Kato, M.; Ishimaru, T.

    2014-02-20

    Organometallic chemical vapor deposition of silicon nitride films enhanced by atomic nitrogen generated from surface-wave plasma is investigated. The feasibility of the precursors triethylsilane (TES) and bis(dimethylamino)dimethylsilane (BDMADMS) is discussed based on bond energies calculated by computer simulation. Refractive indices of 1.81 and 1.71 are obtained for films deposited with TES and BDMADMS, respectively. X-ray photoelectron spectroscopy (XPS) analysis of the deposited films revealed that the composition of the TES-based film coincides with that of stoichiometric thermal silicon nitride.

  6. Computational analysis of heat transfer, thermal stress and dislocation density during resistively Czochralski growth of germanium single crystal

    NASA Astrophysics Data System (ADS)

    Tavakoli, Mohammad Hossein; Renani, Elahe Kabiri; Honarmandnia, Mohtaram; Ezheiyan, Mahdi

    2018-02-01

    In this paper, a set of numerical simulations of fluid flow, temperature gradient, thermal stress and dislocation density for a Czochralski setup used to grow IR optical-grade Ge single crystal have been done for different stages of the growth process. A two-dimensional steady state finite element method has been applied for all calculations. The obtained numerical results reveal that the thermal field, thermal stress and dislocation structure are mainly dependent on the crystal height, heat radiation and gas flow in the growth system.

  7. Dynamics of Oxidation of Aluminum Nanoclusters using Variable Charge Molecular-Dynamics Simulations on Parallel Computers

    NASA Astrophysics Data System (ADS)

    Campbell, Timothy; Kalia, Rajiv K.; Nakano, Aiichiro; Vashishta, Priya; Ogata, Shuji; Rodgers, Stephen

    1999-06-01

    Oxidation of aluminum nanoclusters is investigated with a parallel molecular-dynamics approach based on dynamic charge transfer among atoms. Structural and dynamic correlations reveal that significant charge transfer gives rise to large negative pressure in the oxide which dominates the positive pressure due to steric forces. As a result, aluminum moves outward and oxygen moves towards the interior of the cluster with the aluminum diffusivity 60% higher than that of oxygen. A stable 40 Å thick amorphous oxide is formed; this is in excellent agreement with experiments.

  8. Comparison of Building Loads Analysis and System Thermodynamics (BLAST) Computer Program Simulations and Measured Energy Use for Army Buildings.

    DTIC Science & Technology

    1980-05-01

    Comparison of Building Loads Analysis and System Thermodynamics (BLAST) Computer Program...Building Loads Analysis and System Thermodynamics (BLAST) computer program. A dental clinic and a battalion headquarters and classroom building were...Building and HVAC System Data; Computer Simulation; Comparison of Actual and Simulated Results; Analysis and Findings

  9. The Ghost of Computers Past, Present, and Future: Computer Use for Preservice/Inservice Reading Programs.

    ERIC Educational Resources Information Center

    Prince, Amber T.

    Computer assisted instruction, and especially computer simulations, can help to ensure that preservice and inservice teachers learn from the right experiences. In the past, colleges of education used large mainframe computer systems to store student registration, provide simulation lessons on diagnosing reading difficulties, construct informal…

  10. In silico modelling of drug–polymer interactions for pharmaceutical formulations

    PubMed Central

    Ahmad, Samina; Johnston, Blair F.; Mackay, Simon P.; Schatzlein, Andreas G.; Gellert, Paul; Sengupta, Durba; Uchegbu, Ijeoma F.

    2010-01-01

    Selecting polymers for drug encapsulation in pharmaceutical formulations is usually made after extensive trial and error experiments. To speed up excipient choice procedures, we have explored coarse-grained computer simulations (dissipative particle dynamics (DPD) and coarse-grained molecular dynamics using the MARTINI force field) of polymer–drug interactions to study the encapsulation of prednisolone (log p = 1.6), paracetamol (log p = 0.3) and isoniazid (log p = −1.1) in poly(l-lactic acid) (PLA) controlled release microspheres, as well as the encapsulation of propofol (log p = 4.1) in bioavailability enhancing quaternary ammonium palmitoyl glycol chitosan (GCPQ) micelles. Simulations have been compared with experimental data. DPD simulations, in good correlation with experimental data, correctly revealed that hydrophobic drugs (prednisolone and paracetamol) could be encapsulated within PLA microspheres and predicted the experimentally observed paracetamol encapsulation levels (5–8% of the initial drug level) in 50 mg ml−1 PLA microspheres, but only when initial paracetamol levels exceeded 5 mg ml−1. However, the mesoscale technique was unable to model the hydrophilic drug (isoniazid) encapsulation (4–9% of the initial drug level) which was observed in experiments. Molecular dynamics simulations using the MARTINI force field indicated that the self-assembly of GCPQ is rapid, with propofol residing at the interface between micellar hydrophobic and hydrophilic groups, and that there is a heterogeneous distribution of propofol within the GCPQ micelle population. GCPQ–propofol experiments also revealed a population of relatively empty and drug-filled GCPQ particles. PMID:20519214

  11. Empirical evidence for musical syntax processing? Computer simulations reveal the contribution of auditory short-term memory

    PubMed Central

    Bigand, Emmanuel; Delbé, Charles; Poulin-Charronnat, Bénédicte; Leman, Marc; Tillmann, Barbara

    2014-01-01

    During the last decade, it has been argued that (1) music processing involves syntactic representations similar to those observed in language, and (2) that music and language share similar syntactic-like processes and neural resources. This claim is important for understanding the origin of music and language abilities and, furthermore, it has clinical implications. The Western musical system, however, is rooted in psychoacoustic properties of sound, and this is not the case for linguistic syntax. Accordingly, musical syntax processing could be parsimoniously understood as an emergent property of auditory memory rather than a property of abstract processing similar to linguistic processing. To support this view, we simulated numerous empirical studies that investigated the processing of harmonic structures, using a model based on the accumulation of sensory information in auditory memory. The simulations revealed that most of the musical syntax manipulations used with behavioral and neurophysiological methods as well as with developmental and cross-cultural approaches can be accounted for by the auditory memory model. This led us to question whether current research on musical syntax can really be compared with linguistic processing. Our simulation also raises methodological and theoretical challenges to study musical syntax while disentangling the confounded low-level sensory influences. In order to investigate syntactic abilities in music comparable to language, research should preferentially use musical material with structures that circumvent the tonal effect exerted by psychoacoustic properties of sounds. PMID:24936174

  12. Parallel computing method for simulating hydrological processesof large rivers under climate change

    NASA Astrophysics Data System (ADS)

    Wang, H.; Chen, Y.

    2016-12-01

    Climate change is one of the most widely recognized global environmental problems. It has altered the distribution of watershed hydrological processes in time and space, especially in the world's large rivers. Watershed hydrological simulation based on physically based distributed hydrological models can give better results than lumped models. However, such simulation involves a very large amount of calculation, especially for large rivers, and therefore requires huge computing resources that may not be steadily available to researchers, or only at high expense; this has seriously restricted research and application. Current parallel methods mostly parallelize over the space and time dimensions: based on a distributed hydrological model, they compute the natural features in order, grid by grid (unit or sub-basin), from upstream to downstream. This article proposes a high-performance computing method for hydrological process simulation with a high speedup ratio and high parallel efficiency. It combines the runoff characteristics of the distributed hydrological model in time and space with distributed data storage, an in-memory database, distributed computing, and parallel computing based on computing power units. The method is highly adaptable and extensible: it makes full use of the available computing and storage resources even when those resources are limited, and its computing efficiency improves linearly as computing resources are added. The method can satisfy the parallel computing requirements of hydrological process simulation in small, medium, and large rivers.
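The upstream-to-downstream decomposition described above can be sketched generically. The toy below uses a hypothetical five-basin network and a thread pool standing in for a distributed cluster (none of this is the authors' implementation): at each step, all basins whose upstream results are ready are computed in parallel, then their outflow is routed downstream.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical river network: each sub-basin lists its direct upstream basins.
upstream = {"A": [], "B": [], "C": ["A", "B"], "D": [], "E": ["C", "D"]}

def local_runoff(basin):
    """Stand-in for an expensive per-basin hydrological computation."""
    return {"A": 2.0, "B": 3.0, "C": 1.0, "D": 4.0, "E": 0.5}[basin]

def simulate(network):
    """Process basins level by level: every basin whose upstream results are
    ready is computed in parallel, then its outflow is routed downstream."""
    outflow, done = {}, set()
    with ThreadPoolExecutor() as pool:
        while len(done) < len(network):
            ready = [b for b in network
                     if b not in done and all(u in done for u in network[b])]
            for b, local in zip(ready, pool.map(local_runoff, ready)):
                outflow[b] = local + sum(outflow[u] for u in network[b])
            done.update(ready)
    return outflow

print(simulate(upstream)["E"])   # total runoff at the outlet basin
```

The level-by-level scheduling is what limits the speedup ratio: basins on the same "level" run concurrently, while the longest upstream-to-downstream chain stays sequential.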

  13. Computer simulation of space charge

    NASA Astrophysics Data System (ADS)

    Yu, K. W.; Chung, W. K.; Mak, S. S.

    1991-05-01

    Using the particle-mesh (PM) method, a one-dimensional simulation of the well-known Langmuir-Child law is performed on an INTEL 80386-based personal computer system. The program is coded in Turbo Basic (trademark of Borland International, Inc.). The numerical results obtained were in excellent agreement with theoretical predictions, and the computational time required is quite modest. This simulation exercise demonstrates that some simple computer simulations using particles may be implemented successfully on the PCs available today, and hopefully this will provide the necessary incentive for newcomers to the field who wish to acquire a flavor of the elementary aspects of the practice.
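The Langmuir-Child law that the PM simulation reproduces has the closed planar-diode form J = (4 ε0 / 9) √(2e/m) V^(3/2) / d². A short sketch (standard physical constants; the voltages and gap are illustrative) that verifies the V^(3/2) scaling rather than re-running the particle simulation:

```python
import math

EPS0 = 8.8541878128e-12     # vacuum permittivity (F/m)
E_CHARGE = 1.602176634e-19  # elementary charge (C)
M_E = 9.1093837015e-31      # electron mass (kg)

def child_langmuir_j(voltage, gap):
    """Space-charge-limited current density (A/m^2) for a planar vacuum diode."""
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * E_CHARGE / M_E) \
        * voltage ** 1.5 / gap ** 2

j1 = child_langmuir_j(100.0, 1e-3)   # 100 V across a 1 mm gap
j2 = child_langmuir_j(200.0, 1e-3)   # doubled voltage
print(j2 / j1)                        # 2**1.5, i.e. about 2.83
```

In a PM run like the one described, this J is the steady-state current at which the injected space charge depresses the potential gradient to zero at the cathode.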

  14. Computer simulation results of attitude estimation of earth orbiting satellites

    NASA Technical Reports Server (NTRS)

    Kou, S. R.

    1976-01-01

    Computer simulation results of attitude estimation of Earth-orbiting satellites (including Space Telescope) subjected to environmental disturbances and noises are presented. A decomposed linear recursive filter and a Kalman filter were used as estimation tools. Six programs were developed for this simulation; all were written in BASIC and were run on HP 9830A and HP 9866A computers. Simulation results show that the decomposed linear recursive filter is accurate in estimation and fast in response time. Furthermore, for higher order systems, this filter has computational advantages (i.e., less integration error and roundoff error) over a Kalman filter.
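The recursive estimation compared in this record can be illustrated with the simplest possible case: a scalar Kalman filter tracking a random-walk state from noisy measurements. The sketch below uses hypothetical noise parameters, not the paper's satellite attitude model, and checks that the filtered error beats the raw-measurement error:

```python
import numpy as np

rng = np.random.default_rng(1)
q, r = 1e-4, 0.04            # process and measurement noise variances (assumed)
x_true, x_est, p = 0.0, 0.0, 1.0

errs = []
for _ in range(500):
    x_true += rng.normal(0.0, np.sqrt(q))      # state propagation (random walk)
    z = x_true + rng.normal(0.0, np.sqrt(r))   # noisy measurement
    p += q                                     # predict: covariance grows
    k = p / (p + r)                            # Kalman gain
    x_est += k * (z - x_est)                   # measurement update
    p *= (1.0 - k)                             # posterior covariance
    errs.append((x_est - x_true) ** 2)

rms_est = np.sqrt(np.mean(errs))
print(rms_est < np.sqrt(r))   # filtered RMS error beats raw measurements
```

The decomposed linear recursive filter the paper favors replaces the full covariance recursion with decoupled lower-order recursions, which is where its claimed speed and roundoff advantages for higher-order systems come from.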

  15. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,240 Intel Itanium processors) system. The simulation assessed the performance of the cooling system, identified deficiencies, and recommended modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can easily be extended to provide a general capability for air flow analyses of any modern computer room.

  16. Modeling complex and multi-component food systems in molecular dynamics simulations on the example of chocolate conching.

    PubMed

    Greiner, Maximilian; Sonnleitner, Bettina; Mailänder, Markus; Briesen, Heiko

    2014-02-01

    Additional benefits of foods are an increasingly important factor in consumers' purchasing decisions. To produce foods with the properties consumers demand, understanding the micro- and nanostructure is becoming more important in food research today. We present molecular dynamics (MD) simulations as a tool to study complex and multi-component food systems, using chocolate conching as an example. The conching process is chosen because of the interesting challenges it provides: the components (fats, emulsifiers, and carbohydrates) contain diverse functional groups, fluctuate naturally in their chemical composition, and have a high number of internal degrees of freedom. Further, slow diffusion in the non-aqueous medium is expected. All of these challenges are typical of food systems in general. Simulation results show the suitability of present force fields to correctly model the liquid and crystal density of cocoa butter and sucrose, respectively. Amphiphilic properties of emulsifiers are observed through micelle formation in water. For non-aqueous media, pulling simulations reveal high energy barriers for motion in the viscous cocoa butter. The work of detachment of an emulsifier from the sucrose crystal is calculated and matched with detachment of the head and tail groups separately. Hydrogen bonding is shown to be the dominant interaction between the emulsifier and the crystal surface. Thus, MD simulations are suited to model the interaction between the emulsifier and sugar crystal interface in non-aqueous media, revealing detailed information about structuring and interactions at the molecular level. With interaction parameters available for a wide variety of chemical groups, MD simulations are a valuable tool to understand complex and multi-component food systems in general. MD simulations provide a substantial benefit to researchers seeking to verify their hypotheses in dynamic simulations with atomistic resolution. The rapid growth of computational resources steadily increases the complexity and size of the systems that can be studied.
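In a pulling simulation, the work of detachment is obtained by integrating the pull force over the pull distance. A minimal sketch of that post-processing step (the force-distance profile below is made up for illustration, not data from the study):

```python
# Minimal sketch: detachment work as the trapezoidal integral of the pull
# force over the pull distance. The profile below is illustrative only.
def detachment_work(distances_nm, forces_pn):
    """Trapezoidal integral of force (pN) over distance (nm) -> work in pN*nm."""
    work = 0.0
    for i in range(1, len(distances_nm)):
        dx = distances_nm[i] - distances_nm[i - 1]
        work += 0.5 * (forces_pn[i] + forces_pn[i - 1]) * dx
    return work

# a force peak followed by decay as the emulsifier detaches (made-up profile)
d = [0.0, 0.2, 0.4, 0.6, 0.8]
f = [0.0, 80.0, 120.0, 40.0, 0.0]
```

Running the same integral separately over head-group and tail-group force traces is how a total detachment work can be matched against its contributions, as the abstract describes.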

  17. Computer Simulation of the Circulation Subsystem of a Library

    ERIC Educational Resources Information Center

    Shaw, W. M., Jr.

    1975-01-01

    When circulation data are used as input parameters for a computer simulation of a library's circulation subsystem, the results of the simulation provide information on book availability and delays. The model may be used to simulate alternative loan policies. (Author/LS)
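The kind of availability estimate such a circulation model produces can be sketched with a small Monte Carlo simulation: one copy of a title, random request arrivals, and a fixed loan period (all parameters below are hypothetical, not the article's):

```python
# Minimal sketch: Monte Carlo estimate of book availability for a single
# copy with Poisson request arrivals and a fixed loan period.
# Parameters are illustrative, not from the article.
import random

def availability(requests_per_day, loan_days, sim_days, seed=1):
    random.seed(seed)
    t, due, satisfied, total = 0.0, 0.0, 0, 0
    while t < sim_days:
        t += random.expovariate(requests_per_day)   # next request arrival
        total += 1
        if t >= due:              # book is on the shelf
            satisfied += 1
            due = t + loan_days   # checked out for the loan period
    return satisfied / total
```

Comparing runs with different `loan_days` is exactly the "simulate alternative loan policies" use the abstract mentions: shortening the loan period raises the fraction of requests that find the book available.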

  18. Progress in Unsteady Turbopump Flow Simulations

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin C.; Chan, William; Kwak, Dochan; Williams, Robert

    2002-01-01

    This viewgraph presentation discusses unsteady flow simulations for a turbopump intended for a reusable launch vehicle (RLV). The simulation process makes use of computational grids and parallel processing. The architecture of the parallel computers used is discussed, as is the scripting of turbopump simulations.

  19. Numerical simulation of steady and unsteady viscous flow in turbomachinery using pressure based algorithm

    NASA Astrophysics Data System (ADS)

    Lakshminarayana, B.; Ho, Y.; Basson, A.

    1993-07-01

    The objective of this research is to simulate steady and unsteady viscous flows, including rotor/stator interaction and tip clearance effects, in turbomachinery. The numerical formulation for steady flow developed here includes an efficient grid generation scheme, particularly suited to computational grids for the analysis of turbulent turbomachinery flows and tip clearance flows, and a semi-implicit, pressure-based computational fluid dynamics scheme that directly includes artificial dissipation and is applicable to both viscous and inviscid flows. The values of this artificial dissipation are optimized to achieve accuracy and convergence in the solution. The numerical model is used to investigate the structure of tip clearance flows in a turbine nozzle. The structure of the leakage flow is captured accurately, including blade-to-blade variation of all three velocity components, pitch and yaw angles, losses, and blade static pressures in the tip clearance region. The simulation also includes evaluation of such quantities as leakage mass flow, vortex strength, losses, dominant leakage flow regions, and the spanwise extent affected by the leakage flow. It is demonstrated, through optimization of grid size and artificial dissipation, that the tip clearance flow field can be captured accurately. The above numerical formulation was modified to incorporate time-accurate solutions. An inner-loop iteration scheme is used at each time step to account for the nonlinear effects. The computation of unsteady flow through a flat-plate cascade subjected to a transverse gust reveals that the choice of grid spacing and the amount of artificial dissipation are critical for accurate prediction of unsteady phenomena. The rotor-stator interaction problem is simulated by starting the computation upstream of the stator, with the upstream rotor wake specified from experimental data. The results show that the stator potential effects have an appreciable influence on the upstream rotor wake. The predicted unsteady wake profiles are compared with the available experimental data and the agreement is good. The numerical results are interpreted to draw conclusions on the unsteady wake transport mechanism in the blade passage.

  20. Physics Computing '92: Proceedings of the 4th International Conference

    NASA Astrophysics Data System (ADS)

    de Groot, Robert A.; Nadrchal, Jaroslav

    1993-04-01

    The Table of Contents for the book is as follows: * Preface * INVITED PAPERS * Ab Initio Theoretical Approaches to the Structural, Electronic and Vibrational Properties of Small Clusters and Fullerenes: The State of the Art * Neural Multigrid Methods for Gauge Theories and Other Disordered Systems * Multicanonical Monte Carlo Simulations * On the Use of the Symbolic Language Maple in Physics and Chemistry: Several Examples * Nonequilibrium Phase Transitions in Catalysis and Population Models * Computer Algebra, Symmetry Analysis and Integrability of Nonlinear Evolution Equations * The Path-Integral Quantum Simulation of Hydrogen in Metals * Digital Optical Computing: A New Approach of Systolic Arrays Based on Coherence Modulation of Light and Integrated Optics Technology * Molecular Dynamics Simulations of Granular Materials * Numerical Implementation of a K.A.M. Algorithm * Quasi-Monte Carlo, Quasi-Random Numbers and Quasi-Error Estimates * What Can We Learn from QMC Simulations * Physics of Fluctuating Membranes * Plato, Apollonius, and Klein: Playing with Spheres * Steady States in Nonequilibrium Lattice Systems * CONVODE: A REDUCE Package for Differential Equations * Chaos in Coupled Rotators * Symplectic Numerical Methods for Hamiltonian Problems * Computer Simulations of Surfactant Self Assembly * High-dimensional and Very Large Cellular Automata for Immunological Shape Space * A Review of the Lattice Boltzmann Method * Electronic Structure of Solids in the Self-interaction Corrected Local-spin-density Approximation * Dedicated Computers for Lattice Gauge Theory Simulations * Physics Education: A Survey of Problems and Possible Solutions * Parallel Computing and Electronic-Structure Theory * High Precision Simulation Techniques for Lattice Field Theory * CONTRIBUTED PAPERS * Case Study of Microscale Hydrodynamics Using Molecular Dynamics and Lattice Gas Methods * Computer Modelling of the Structural and Electronic Properties of the Supported Metal 
Catalysis * Ordered Particle Simulations for Serial and MIMD Parallel Computers * "NOLP" -- Program Package for Laser Plasma Nonlinear Optics * Algorithms to Solve Nonlinear Least Square Problems * Distribution of Hydrogen Atoms in Pd-H Computed by Molecular Dynamics * A Ray Tracing of Optical System for Protein Crystallography Beamline at Storage Ring-SIBERIA-2 * Vibrational Properties of a Pseudobinary Linear Chain with Correlated Substitutional Disorder * Application of the Software Package Mathematica in Generalized Master Equation Method * Linelist: An Interactive Program for Analysing Beam-foil Spectra * GROMACS: A Parallel Computer for Molecular Dynamics Simulations * GROMACS Method of Virial Calculation Using a Single Sum * The Interactive Program for the Solution of the Laplace Equation with the Elimination of Singularities for Boundary Functions * Random-Number Generators: Testing Procedures and Comparison of RNG Algorithms * Micro-TOPIC: A Tokamak Plasma Impurities Code * Rotational Molecular Scattering Calculations * Orthonormal Polynomial Method for Calibrating of Cryogenic Temperature Sensors * Frame-based System Representing Basis of Physics * The Role of Massively Data-parallel Computers in Large Scale Molecular Dynamics Simulations * Short-range Molecular Dynamics on a Network of Processors and Workstations * An Algorithm for Higher-order Perturbation Theory in Radiative Transfer Computations * Hydrostochastics: The Master Equation Formulation of Fluid Dynamics * HPP Lattice Gas on Transputers and Networked Workstations * Study on the Hysteresis Cycle Simulation Using Modeling with Different Functions on Intervals * Refined Pruning Techniques for Feed-forward Neural Networks * Random Walk Simulation of the Motion of Transient Charges in Photoconductors * The Optical Hysteresis in Hydrogenated Amorphous Silicon * Diffusion Monte Carlo Analysis of Modern Interatomic Potentials for He * A Parallel Strategy for Molecular Dynamics Simulations of Polar 
Liquids on Transputer Arrays * Distribution of Ions Reflected on Rough Surfaces * The Study of Step Density Distribution During Molecular Beam Epitaxy Growth: Monte Carlo Computer Simulation * Towards a Formal Approach to the Construction of Large-scale Scientific Applications Software * Correlated Random Walk and Discrete Modelling of Propagation through Inhomogeneous Media * Teaching Plasma Physics Simulation * A Theoretical Determination of the Au-Ni Phase Diagram * Boson and Fermion Kinetics in One-dimensional Lattices * Computational Physics Course on the Technical University * Symbolic Computations in Simulation Code Development and Femtosecond-pulse Laser-plasma Interaction Studies * Computer Algebra and Integrated Computing Systems in Education of Physical Sciences * Coordinated System of Programs for Undergraduate Physics Instruction * Program Package MIRIAM and Atomic Physics of Extreme Systems * High Energy Physics Simulation on the T_Node * The Chapman-Kolmogorov Equation as Representation of Huygens' Principle and the Monolithic Self-consistent Numerical Modelling of Lasers * Authoring System for Simulation Developments * Molecular Dynamics Study of Ion Charge Effects in the Structure of Ionic Crystals * A Computational Physics Introductory Course * Computer Calculation of Substrate Temperature Field in MBE System * Multimagnetical Simulation of the Ising Model in Two and Three Dimensions * Failure of the CTRW Treatment of the Quasicoherent Excitation Transfer * Implementation of a Parallel Conjugate Gradient Method for Simulation of Elastic Light Scattering * Algorithms for Study of Thin Film Growth * Algorithms and Programs for Physics Teaching in Romanian Technical Universities * Multicanonical Simulation of 1st order Transitions: Interface Tension of the 2D 7-State Potts Model * Two Numerical Methods for the Calculation of Periodic Orbits in Hamiltonian Systems * Chaotic Behavior in a Probabilistic Cellular Automata? 
* Wave Optics Computing by a Networked-based Vector Wave Automaton * Tensor Manipulation Package in REDUCE * Propagation of Electromagnetic Pulses in Stratified Media * The Simple Molecular Dynamics Model for the Study of Thermalization of the Hot Nucleon Gas * Electron Spin Polarization in PdCo Alloys Calculated by KKR-CPA-LSD Method * Simulation Studies of Microscopic Droplet Spreading * A Vectorizable Algorithm for the Multicolor Successive Overrelaxation Method * Tetragonality of the CuAu I Lattice and Its Relation to Electronic Specific Heat and Spin Susceptibility * Computer Simulation of the Formation of Metallic Aggregates Produced by Chemical Reactions in Aqueous Solution * Scaling in Growth Models with Diffusion: A Monte Carlo Study * The Nucleus as the Mesoscopic System * Neural Network Computation as Dynamic System Simulation * First-principles Theory of Surface Segregation in Binary Alloys * Data Smooth Approximation Algorithm for Estimating the Temperature Dependence of the Ice Nucleation Rate * Genetic Algorithms in Optical Design * Application of 2D-FFT in the Study of Molecular Exchange Processes by NMR * Advanced Mobility Model for Electron Transport in P-Si Inversion Layers * Computer Simulation for Film Surfaces and its Fractal Dimension * Parallel Computation Techniques and the Structure of Catalyst Surfaces * Educational SW to Teach Digital Electronics and the Corresponding Text Book * Primitive Trinomials (Mod 2) Whose Degree is a Mersenne Exponent * Stochastic Modelisation and Parallel Computing * Remarks on the Hybrid Monte Carlo Algorithm for the φ4 Model * An Experimental Computer Assisted Workbench for Physics Teaching * A Fully Implicit Code to Model Tokamak Plasma Edge Transport * EXPFIT: An Interactive Program for Automatic Beam-foil Decay Curve Analysis * Mapping Technique for Solving General, 1-D Hamiltonian Systems * Freeway Traffic, Cellular Automata, and Some (Self-Organizing) Criticality * Photonuclear Yield Analysis by Dynamic 
Programming * Incremental Representation of the Simply Connected Planar Curves * Self-convergence in Monte Carlo Methods * Adaptive Mesh Technique for Shock Wave Propagation * Simulation of Supersonic Coronal Streams and Their Interaction with the Solar Wind * The Nature of Chaos in Two Systems of Ordinary Nonlinear Differential Equations * Considerations of a Window-shopper * Interpretation of Data Obtained by RTP 4-Channel Pulsed Radar Reflectometer Using a Multi Layer Perceptron * Statistics of Lattice Bosons for Finite Systems * Fractal Based Image Compression with Affine Transformations * Algorithmic Studies on Simulation Codes for Heavy-ion Reactions * An Energy-Wise Computer Simulation of DNA-Ion-Water Interactions Explains the Abnormal Structure of Poly[d(A)]:Poly[d(T)] * Computer Simulation Study of Kosterlitz-Thouless-Like Transitions * Problem-oriented Software Package GUN-EBT for Computer Simulation of Beam Formation and Transport in Technological Electron-Optical Systems * Parallelization of a Boundary Value Solver and its Application in Nonlinear Dynamics * The Symbolic Classification of Real Four-dimensional Lie Algebras * Short, Singular Pulses Generation by a Dye Laser at Two Wavelengths Simultaneously * Quantum Monte Carlo Simulations of the Apex-Oxygen-Model * Approximation Procedures for the Axial Symmetric Static Einstein-Maxwell-Higgs Theory * Crystallization on a Sphere: Parallel Simulation on a Transputer Network * FAMULUS: A Software Product (also) for Physics Education * MathCAD vs. 
FAMULUS -- A Brief Comparison * First-principles Dynamics Used to Study Dissociative Chemisorption * A Computer Controlled System for Crystal Growth from Melt * A Time Resolved Spectroscopic Method for Short Pulsed Particle Emission * Green's Function Computation in Radiative Transfer Theory * Random Search Optimization Technique for One-criteria and Multi-criteria Problems * Hartley Transform Applications to Thermal Drift Elimination in Scanning Tunneling Microscopy * Algorithms of Measuring, Processing and Interpretation of Experimental Data Obtained with Scanning Tunneling Microscope * Time-dependent Atom-surface Interactions * Local and Global Minima on Molecular Potential Energy Surfaces: An Example of N3 Radical * Computation of Bifurcation Surfaces * Symbolic Computations in Quantum Mechanics: Energies in Next-to-solvable Systems * A Tool for RTP Reactor and Lamp Field Design * Modelling of Particle Spectra for the Analysis of Solid State Surface * List of Participants
