Sample records for computer simulations suggest

  1. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

    ERIC Educational Resources Information Center

    Bronson, Richard

    1986-01-01

    Suggests the use of computer simulation of continuous systems as a problem-solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)

  2. Modeling Education on the Real World.

    ERIC Educational Resources Information Center

    Hunter, Beverly

    1983-01-01

    Offers and discusses three suggestions to capitalize on two developments related to system dynamics modeling and simulation. These developments are a junior/senior high textbook called "Introduction to Computer Simulation" and Micro-DYNAMO, a computer simulation language for microcomputers. (Author/JN)

  3. Massively parallel quantum computer simulator

    NASA Astrophysics Data System (ADS)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray X1E, an SGI Altix 3700, and clusters of PCs running Windows XP. We study the performance of the software by simulating quantum computers containing up to 36 qubits, using up to 4096 processors and up to 1 TB of memory. Our results demonstrate that the simulator exhibits nearly ideal scaling as a function of the number of processors and suggest that the simulation software described in this paper may also serve as a benchmark for testing high-end parallel computers.
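
    For orientation, the memory figure in this abstract follows directly from state-vector simulation: 36 qubits require 2^36 complex amplitudes, i.e. 2^36 x 16 bytes = 1 TB. Below is a minimal NumPy sketch of the core state-vector update such simulators perform (an illustration of the technique, not the authors' code):

      import numpy as np

      def apply_single_qubit_gate(state, gate, target, n_qubits):
          # View the 2**n amplitude vector as an n-axis tensor so the target
          # qubit becomes its own axis, contract it with the 2x2 gate, then
          # restore the original axis order.
          psi = state.reshape([2] * n_qubits)
          psi = np.tensordot(gate, psi, axes=([1], [target]))
          psi = np.moveaxis(psi, 0, target)
          return psi.reshape(-1)

      # Example: Hadamard on qubit 0 of a 3-qubit register initialized to |000>.
      n = 3
      psi = np.zeros(2**n, dtype=complex)
      psi[0] = 1.0
      H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
      psi = apply_single_qubit_gate(psi, H, target=0, n_qubits=n)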

  4. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education categorized by three different learning frameworks and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of reviewed literature, I proposed effective approaches to integrate computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). These research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit their revised answers electronically. Students in the TRAD group were not granted access to the CLCS material and followed their normal classroom routine. At the end of the study, both the CLCS and TRAD students took a post-test. Questions on the post-test were divided into "what" questions, "how" questions, and an open-response question. Analysis of students' post-test performance showed mixed results. While the TRAD students scored higher on the "what" questions, the CLCS students scored higher on the "how" questions and the one open-response question. This result suggested that more TRAD students knew what kinds of conditions may or may not cause electromagnetic induction without understanding how electromagnetic induction works. Analysis of the CLCS students' learning also suggested that frequent disruption and technical trouble might pose threats to the effectiveness of the CLCS learning framework. Beyond the mixed post-test results, the study also revealed some limitations of the CLCS learning framework in promoting conceptual understanding in physics. Improvement can be made by providing students with background knowledge necessary to understand model reasoning and by incorporating the CLCS learning framework with other learning frameworks to promote integration of various physics concepts. In addition, the reflective questions in the CLCS learning framework may be refined to better address students' difficulties. Limitations of the study, as well as suggestions for future research, are also presented.

  5. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  6. A study of workstation computational performance for real-time flight simulation

    NASA Technical Reports Server (NTRS)

    Maddalon, Jeffrey M.; Cleveland, Jeff I., II

    1995-01-01

    With recent advances in microprocessor technology, some have suggested that modern workstations provide enough computational power to properly operate a real-time simulation. This paper presents the results of a computational benchmark, based on actual real-time flight simulation code used at Langley Research Center, which was executed on various workstation-class machines. The benchmark was executed on different machines from several companies including: CONVEX Computer Corporation, Cray Research, Digital Equipment Corporation, Hewlett-Packard, Intel, International Business Machines, Silicon Graphics, and Sun Microsystems. The machines are compared by their execution speed, computational accuracy, and porting effort. The results of this study show that the raw computational power needed for real-time simulation is now offered by workstations.

  7. Longitudinal train dynamics: an overview

    NASA Astrophysics Data System (ADS)

    Wu, Qing; Spiryagin, Maksym; Cole, Colin

    2016-12-01

    This paper discusses the evolution of longitudinal train dynamics (LTD) simulations, which covers numerical solvers, vehicle connection systems, air brake systems, wagon dumper systems and locomotives, resistance forces and gravitational components, vehicle in-train instabilities, and computing schemes. A number of potential research topics are suggested, such as modelling of friction, polymer, and transition characteristics for vehicle connection simulations, studies of wagon dumping operations, proper modelling of vehicle in-train instabilities, and computing schemes for LTD simulations. Evidence shows that LTD simulations have evolved with computing capabilities. Currently, advanced component models that directly describe the working principles of the operation of air brake systems, vehicle connection systems, and traction systems are available. Parallel computing is a good solution to combine and simulate all these advanced models. Parallel computing can also be used to conduct three-dimensional long train dynamics simulations.

  8. Reverse logistics system planning for recycling computers hardware: A case study

    NASA Astrophysics Data System (ADS)

    Januri, Siti Sarah; Zulkipli, Faridah; Zahari, Siti Meriam; Shamsuri, Siti Hajar

    2014-09-01

    This paper describes modeling and simulation of reverse logistics networks for the collection of used computers at a company in Selangor. The study focuses on the design of a reverse logistics network for a used-computer recycling operation. The simulation model presented in this work allows the user to analyze the future performance of the network and to understand the complex relationship between the parties involved. The findings from the simulation suggest that the model calculates processing time and resource utilization in a predictable manner. In this study, the simulation model was developed using the Arena simulation package.

  9. Quantum chemistry simulation on quantum computers: theories and experiments.

    PubMed

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, with polynomially scaling resources. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with exponential growth in required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theories and experiments. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, surpassing classical computations.

  10. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present an application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or if the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly large and complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight into device performance and aids in topology and system optimization.
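
    As a concrete illustration of the event-scheduling paradigm this abstract builds on (a toy sketch only; OMNEST supplies this machinery at production scale, and the infinite-capacity "switch" here is an assumption made for brevity):

      import heapq, random

      def arrival(state, fel, t):
          # A packet arrives: schedule its departure from the switch and the
          # next arrival (Poisson arrivals, exponential switching delay).
          heapq.heappush(fel, (t + random.expovariate(1.0), "departure"))
          heapq.heappush(fel, (t + random.expovariate(0.8), "arrival"))

      def departure(state, fel, t):
          state["switched"] += 1

      def run(t_end=1000.0):
          random.seed(1)
          state = {"switched": 0}
          fel = [(0.0, "arrival")]            # future event list, keyed by time
          handlers = {"arrival": arrival, "departure": departure}
          while fel:
              t, kind = heapq.heappop(fel)    # always advance to the earliest event
              if t > t_end:
                  break
              handlers[kind](state, fel, t)
          return state["switched"]

      print(run())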

  11. Social Choice in a Computer-Assisted Simulation

    ERIC Educational Resources Information Center

    Thavikulwat, Precha

    2009-01-01

    Pursuing a line of inquiry suggested by Crookall, Martin, Saunders, and Coote, the author applied, within the framework of design science, an optimal-design approach to incorporate into a computer-assisted simulation two innovative social choice processes: the multiple period double auction and continuous voting. Expectations that the…

  12. Computer Simulation in Tomorrow's Schools.

    ERIC Educational Resources Information Center

    Foster, David

    1984-01-01

    Suggests use of simulation as an educational strategy has promise for the school of the future; discusses specific advantages of simulations over alternative educational methods, role of microcomputers in educational simulation, and past obstacles and future promise of microcomputer simulations; and presents a literature review on effectiveness of…

  13. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    NASA Astrophysics Data System (ADS)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  14. A Study of the Efficacy of Project-Based Learning Integrated with Computer-Based Simulation--STELLA

    ERIC Educational Resources Information Center

    Eskrootchi, Rogheyeh; Oskrochi, G. Reza

    2010-01-01

    Incorporating computer-simulation modelling into project-based learning may be effective but requires careful planning and implementation. Teachers, especially, need pedagogical content knowledge which refers to knowledge about how students learn from materials infused with technology. This study suggests that students learn best by actively…

  15. ENZVU--An Enzyme Kinetics Computer Simulation Based upon a Conceptual Model of Enzyme Action.

    ERIC Educational Resources Information Center

    Graham, Ian

    1985-01-01

    Discusses a simulation on enzyme kinetics based upon the ability of computers to generate random numbers. The program includes: (1) enzyme catalysis in a restricted two-dimensional grid; (2) visual representation of catalysis; and (3) storage and manipulation of data. Suggested applications and conclusions are also discussed. (DH)

  16. How Children Solve Environmental Problems: Using Computer Simulations To Investigate Systems Thinking.

    ERIC Educational Resources Information Center

    Sheehy, N. P.; Wylie, J. W.; McGuinness, C.; Orchard, G.

    2000-01-01

    Describes the development and use of two computer simulations for investigating systems thinking and environmental problem-solving in children (n=92). Finds that older children outperformed younger children, who tended to exhibit magical thinking. Suggests that seemingly isomorphic environmental problems may not be interpreted as such by children.…

  17. Two-step simulation of velocity and passive scalar mixing at high Schmidt number in turbulent jets

    NASA Astrophysics Data System (ADS)

    Rah, K. Jeff; Blanquart, Guillaume

    2016-11-01

    Simulation of passive scalar in the high Schmidt number turbulent mixing process requires higher computational cost than that of velocity fields, because the scalar is associated with smaller length scales than velocity. Thus, full simulation of both velocity and passive scalar with high Sc for a practical configuration is difficult to perform. In this work, a new approach to simulate velocity and passive scalar mixing at high Sc is suggested to reduce the computational cost. First, the velocity fields are resolved by Large Eddy Simulation (LES). Then, by extracting the velocity information from LES, the scalar inside a moving fluid blob is simulated by Direct Numerical Simulation (DNS). This two-step simulation method is applied to a turbulent jet and provides a new way to examine a scalar mixing process in a practical application with smaller computational cost. NSF, Samsung Scholarship.

  18. Finally, a Good Way to Teach City Government! A Review of the Computer Simulation Game "SimCity."

    ERIC Educational Resources Information Center

    Pahl, Ronald H.

    1991-01-01

    Offers an evaluation of the computer simulation game "SimCity." Suggests possible uses for the game at different age and experience levels. Recommends the program as challenging, humorous, and an excellent aid in teaching about the problems and solutions facing city government. Explains that students serve as public officials. (DK)

  19. A Simple Method for Automated Equilibration Detection in Molecular Simulations.

    PubMed

    Chodera, John D

    2016-04-12

    Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure and demonstrate its utility on typical molecular simulation data.

  20. A simple method for automated equilibration detection in molecular simulations

    PubMed Central

    Chodera, John D.

    2016-01-01

    Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest, in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure, and demonstrate its utility on typical molecular simulation data. PMID:26771390
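
    The procedure described in these two records is concrete enough to sketch. The paper ships its own Python reference implementation; the re-derivation below follows only the abstract, with a deliberately crude autocorrelation-based estimate of the statistical inefficiency g, and is illustrative rather than authoritative:

      import numpy as np

      def statistical_inefficiency(a):
          # Crude estimate: g = 1 + 2 * (sum of positive autocorrelations).
          a = np.asarray(a, dtype=float) - np.mean(a)
          n, var = len(a), np.var(a)
          if var == 0:
              return 1.0
          g = 1.0
          for lag in range(1, n // 2):
              c = np.dot(a[:-lag], a[lag:]) / ((n - lag) * var)
              if c <= 0:
                  break
              g += 2.0 * c
          return max(g, 1.0)

      def detect_equilibration(a, n_candidates=100):
          # Choose the equilibration time t0 that maximizes the number of
          # effectively uncorrelated samples N_eff = (T - t0) / g(t0).
          T = len(a)
          best = (0, 1.0, 0.0)                # (t0, g, N_eff)
          for t0 in np.linspace(0, T - 2, n_candidates, dtype=int):
              g = statistical_inefficiency(a[t0:])
              n_eff = (T - t0) / g
              if n_eff > best[2]:
                  best = (int(t0), g, n_eff)
          return best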

  1. Core Binding Site of a Thioflavin-T-Derived Imaging Probe on Amyloid β Fibrils Predicted by Computational Methods.

    PubMed

    Kawai, Ryoko; Araki, Mitsugu; Yoshimura, Masashi; Kamiya, Narutoshi; Ono, Masahiro; Saji, Hideo; Okuno, Yasushi

    2018-05-16

    Development of new diagnostic imaging probes for Alzheimer's disease, such as positron emission tomography (PET) and single photon emission computed tomography (SPECT) probes, has been strongly desired. In this study, we investigated the most accessible amyloid β (Aβ) binding site of [123I]IMPY, a Thioflavin-T-derived SPECT probe, using experimental and computational methods. First, we performed a competitive inhibition assay with Orange-G, which recognizes the KLVFFA region in Aβ fibrils, suggesting that IMPY and Orange-G bind to different sites in Aβ fibrils. Next, we precisely predicted the IMPY binding site on a multiple-protofilament Aβ fibril model using computational approaches, consisting of molecular dynamics and docking simulations. We generated possible IMPY-binding structures using docking simulations to identify candidates for probe-binding sites. The binding free energy of IMPY with the Aβ fibril was calculated by a free energy simulation method, MP-CAFEE. These computational results suggest that IMPY preferentially binds to an interfacial pocket located between two protofilaments and is stabilized mainly through hydrophobic interactions. Finally, our computational approach was validated by comparing it with the experimental results. The present study demonstrates the possibility of computational approaches to screen new PET/SPECT probes for Aβ imaging.

  2. Interacting with a Computer-Simulated Pet: Factors Influencing Children's Humane Attitudes and Empathy

    ERIC Educational Resources Information Center

    Tsai, Yueh-Feng; Kaufman, David

    2014-01-01

    Previous research by Tsai and Kaufman (2010a, 2010b) has suggested that computer-simulated virtual pet dogs can be used as a potential medium to enhance children's development of empathy and humane attitudes toward animals. To gain a deeper understanding of how and why interacting with a virtual pet dog might influence children's social and…

  3. GATE Monte Carlo simulation in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Rowedder, Blake Austin

    The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high-performance cluster. This study investigated reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size, and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high-performance computing continuing to fall in price and grow in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will continue to become more attractive.
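
    Reasoning only from the two runtimes quoted in the abstract (53 min on one node, 3.11 min on twenty), the inverse power model T(n) = a * n**(-b) can be pinned down by a two-point solve; this is a back-of-the-envelope check, not the study's actual fit:

      import numpy as np

      a = 53.0                                  # T(1) fixes the prefactor, in minutes
      b = np.log(53.0 / 3.11) / np.log(20.0)    # ~0.95, i.e. near-ideal scaling
      print(f"b = {b:.2f}; predicted T(10) = {a * 10**(-b):.1f} min")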

  4. Real-time simulation of a spiking neural network model of the basal ganglia circuitry using general purpose computing on graphics processing units.

    PubMed

    Igarashi, Jun; Shouno, Osamu; Fukai, Tomoki; Tsujino, Hiroshi

    2011-11-01

    Real-time simulation of a biologically realistic spiking neural network is necessary for evaluation of its capacity to interact with real environments. However, the real-time simulation of such a neural network is difficult due to its high computational costs that arise from two factors: (1) vast network size and (2) the complicated dynamics of biologically realistic neurons. In order to address these problems, mainly the latter, we chose to use general purpose computing on graphics processing units (GPGPUs) for simulation of such a neural network, taking advantage of the powerful computational capability of a graphics processing unit (GPU). As a target for real-time simulation, we used a model of the basal ganglia that has been developed according to electrophysiological and anatomical knowledge. The model consists of heterogeneous populations of 370 spiking model neurons, including computationally heavy conductance-based models, connected by 11,002 synapses. Simulation of the model has not yet been performed in real-time using a general computing server. By parallelization of the model on the NVIDIA Geforce GTX 280 GPU in data-parallel and task-parallel fashion, faster-than-real-time simulation was robustly realized with only one-third of the GPU's total computational resources. Furthermore, we used the GPU's full computational resources to perform faster-than-real-time simulation of three instances of the basal ganglia model; these instances consisted of 1110 neurons and 33,006 synapses and were synchronized at each calculation step. Finally, we developed software for simultaneous visualization of faster-than-real-time simulation output. These results suggest the potential power of GPGPU techniques in real-time simulation of realistic neural networks. Copyright © 2011 Elsevier Ltd. All rights reserved.
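
    A hedged sketch of why this workload maps well onto a GPU: every neuron advances by the same arithmetic on its own state, the data-parallel pattern GPUs execute efficiently. The leaky integrate-and-fire update below (NumPy vectorization standing in for GPU threads) is far simpler than the paper's conductance-based models, and every constant except the neuron count is invented:

      import numpy as np

      rng = np.random.default_rng(0)
      n, dt, tau, v_th, v_rest = 370, 0.1, 20.0, -50.0, -65.0  # 370 neurons, as in the model
      v = np.full(n, v_rest)
      for _ in range(1000):
          i_syn = rng.normal(1.5, 0.5, size=n)            # stand-in synaptic drive
          v += dt / tau * (-(v - v_rest) + 30.0 * i_syn)  # same instruction, all neurons
          spiked = v >= v_th                              # threshold crossing
          v[spiked] = v_rest                              # reset spiking neurons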

  5. Fractals: To Know, to Do, to Simulate.

    ERIC Educational Resources Information Center

    Talanquer, Vicente; Irazoque, Glinda

    1993-01-01

    Discusses the development of fractal theory and suggests fractal aggregates as an attractive alternative for introducing fractal concepts. Describes methods for producing metallic fractals and a computer simulation for drawing fractals. (MVL)

  6. Instructional support and implementation structure during elementary teachers' science education simulation use

    NASA Astrophysics Data System (ADS)

    Gonczi, Amanda L.; Chiu, Jennifer L.; Maeng, Jennifer L.; Bell, Randy L.

    2016-07-01

    This investigation sought to identify patterns in elementary science teachers' computer simulation use, particularly implementation structures and instructional supports commonly employed by teachers. Data included video-recorded science lessons of 96 elementary teachers who used computer simulations in one or more science lessons. Results indicated teachers used a one-to-one student-to-computer ratio most often either during class-wide individual computer use or during a rotating station structure. Worksheets, general support, and peer collaboration were the most common forms of instructional support. The least common instructional support forms included lesson pacing, initial play, and a closure discussion. Students' simulation use was supported in the fewest ways during a rotating station structure. Results suggest that simulation professional development with elementary teachers needs to explicitly focus on implementation structures and instructional support to enhance participants' pedagogical knowledge and improve instructional simulation use. In addition, research is needed to provide theoretical explanations for the observed patterns that should subsequently be addressed in supporting teachers' instructional simulation use during professional development or in teacher preparation programs.

  7. A Cloud-Based Simulation Architecture for Pandemic Influenza Simulation

    PubMed Central

    Eriksson, Henrik; Raciti, Massimiliano; Basile, Maurizio; Cunsolo, Alessandro; Fröberg, Anders; Leifler, Ola; Ekberg, Joakim; Timpka, Toomas

    2011-01-01

    High-fidelity simulations of pandemic outbreaks are resource consuming. Cluster-based solutions have been suggested for executing such complex computations. We present a cloud-based simulation architecture that utilizes computing resources both locally available and dynamically rented online. The approach uses the Condor framework for job distribution and management of the Amazon Elastic Computing Cloud (EC2) as well as local resources. The architecture has a web-based user interface that allows users to monitor and control simulation execution. In a benchmark test, the best cost-adjusted performance was recorded for the EC2 H-CPU Medium instance, while a field trial showed that the job configuration had significant influence on the execution time and that the network capacity of the master node could become a bottleneck. We conclude that it is possible to develop a scalable simulation environment that uses cloud-based solutions, while providing an easy-to-use graphical user interface. PMID:22195089

  8. STEPS: A Simulated, Tutorable Physics Student.

    ERIC Educational Resources Information Center

    Ur, Sigalit; VanLehn, Kurt

    1995-01-01

    Describes a simulated student that learns by interacting with a human tutor. Tests suggest that simulated students, when developed past the prototype stage, could be valuable for training human tutors. Provides a computational cognitive task analysis of the skill of learning from a tutor that is useful for designing intelligent tutoring systems.…

  9. Computer Simulation for Pain Management Education: A Pilot Study.

    PubMed

    Allred, Kelly; Gerardi, Nicole

    2017-10-01

    Effective pain management is an elusive concept in acute care. Inadequate knowledge has been identified as a barrier to providing optimal pain management. This study aimed to determine student perceptions of an interactive computer simulation as a potential method for learning pain management, as a motivator to read and learn more about pain management, and as a preference over traditional lecture, along with its potential to change nursing practice. A post-computer simulation survey with a mixed-methods descriptive design was used. The setting was a college of nursing in a large metropolitan university in the Southeast United States; participants were a convenience sample of 30 nursing students in a Bachelor of Science nursing program. An interactive computer simulation was developed as a potential alternative method of teaching pain management to nursing students, and increases in educational gain as well as its potential to change practice were explored. Each participant was asked to complete a survey consisting of 10 standard 5-point Likert scale items and 5 open-ended questions. The survey was used to evaluate the students' perception of the simulation, specifically related to educational benefit, preference compared with traditional teaching methods, and perceived potential to change nursing practice. Data provided descriptive statistics for an initial evaluation of the computer simulation. The responses on the survey suggest nursing students perceive the computer simulation to be entertaining, fun, educational, occasionally preferred over regular lecture, and with potential to change practice. Preliminary data support the use of computer simulation in educating nursing students about pain management. Copyright © 2017 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  10. Integrating Computational Science Tools into a Thermodynamics Course

    NASA Astrophysics Data System (ADS)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired with computer simulations to implement these modules has a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  11. Simulation of a master-slave event set processor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Comfort, J.C.

    1984-03-01

    Event set manipulation may consume a considerable amount of the computation time spent in performing a discrete-event simulation. One way of minimizing this time is to allow event set processing to proceed in parallel with the remainder of the simulation computation. The paper describes a multiprocessor simulation computer, in which all non-event-set processing is performed by the principal processor (called the host). Event set processing is coordinated by a front-end processor (the master) and actually performed by several other functionally identical processors (the slaves). A trace-driven simulation program modeling this system was constructed and was run with trace output taken from two different simulation programs. Output from this simulation suggests that a significant reduction in run time may be realized by this approach. Sensitivity analysis was performed on the significant parameters of the system (number of slave processors, relative processor speeds, and interprocessor communication times). A comparison between actual and simulated run times for a one-processor system was used to assist in the validation of the simulation. 7 references.
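
    A minimal sketch of the host/event-set split the abstract describes, with Python threads standing in for the paper's dedicated processors (hypothetical code, not Comfort's implementation):

      import heapq, queue, threading

      requests = queue.Queue()    # host -> event-set processor
      replies = queue.Queue()     # event-set processor -> host

      def event_set_processor():
          heap = []
          while True:
              op, payload = requests.get()
              if op == "insert":
                  heapq.heappush(heap, payload)
              elif op == "next":
                  replies.put(heapq.heappop(heap) if heap else None)
              else:                # "stop"
                  return

      threading.Thread(target=event_set_processor, daemon=True).start()
      requests.put(("insert", (0.5, "event-A")))
      requests.put(("insert", (0.2, "event-B")))
      requests.put(("next", None))
      print(replies.get())         # (0.2, 'event-B'): the earliest event
      requests.put(("stop", None))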

  12. Computer model to simulate testing at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Owens, Lewis R., Jr.; Wahls, Richard A.; Hannon, Judith A.

    1995-01-01

    A computer model has been developed to simulate the processes involved in the operation of the National Transonic Facility (NTF), a large cryogenic wind tunnel at the Langley Research Center. The simulation was verified by comparing the simulated results with previously acquired data from three experimental wind tunnel test programs in the NTF. The comparisons suggest that the computer model simulates reasonably well the processes that determine the liquid nitrogen (LN2) consumption, electrical consumption, fan-on time, and the test time required to complete a test plan at the NTF. From these limited comparisons, it appears that the results from the simulation model are generally within about 10 percent of the actual NTF test results. The use of actual data acquisition times in the simulation produced better estimates of the LN2 usage, as expected. Additional comparisons are needed to refine the model constants. The model will typically produce optimistic results since the times and rates included in the model are typically the optimum values. Any deviation from the optimum values will lead to longer times or increased LN2 and electrical consumption for the proposed test plan. Computer code operating instructions and listings of sample input and output files have been included.

  13. Making Water Pollution a Problem in the Classroom Through Computer Assisted Instruction.

    ERIC Educational Resources Information Center

    Flowers, John D.

    Alternative means for dealing with water pollution control are presented for students and teachers. One computer oriented program is described in terms of teaching wastewater treatment and pollution concepts to middle and secondary school students. Suggestions are given to help teachers use a computer simulation program in their classrooms.…

  14. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    NASA Astrophysics Data System (ADS)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  15. Man-in-the-control-loop simulation of manipulators

    NASA Technical Reports Server (NTRS)

    Chang, J. L.; Lin, Tsung-Chieh; Yae, K. Harold

    1989-01-01

    A method to achieve man-in-the-control-loop simulation is presented. Emerging real-time dynamics simulation suggests a potential for creating an interactive design workstation with a human operator in the control loop. The recursive formulation for multibody dynamics simulation is studied to determine requirements for man-in-the-control-loop simulation. High-speed computer graphics techniques provide realistic visual cues for the simulator. Backhoe and robot arm simulations are implemented to demonstrate the capability of man-in-the-control-loop simulation.

  16. Computer-simulated laboratory explorations for middle school life, earth, and physical Science

    NASA Astrophysics Data System (ADS)

    von Blum, Ruth

    1992-06-01

    Explorations in Middle School Science is a set of 72 computer-simulated laboratory lessons in life, earth, and physical science for grades 6-9, developed by Jostens Learning Corporation with grants from the California State Department of Education and the National Science Foundation. At the heart of each lesson is a computer-simulated laboratory that actively involves students in doing science, improving their: (1) understanding of science concepts by applying critical thinking to solve real problems; (2) skills in scientific processes and communications; and (3) attitudes about science. Students use on-line tools (notebook, calculator, word processor) to undertake in-depth investigations of phenomena (like motion in outer space, disease transmission, volcanic eruptions, or the structure of the atom) that would be too difficult, dangerous, or outright impossible to do in a “live” laboratory. Suggested extension activities lead students to hands-on investigations, away from the computer. This article presents the underlying rationale, instructional model, and process by which Explorations was designed and developed. It also describes the general courseware structure and three lessons in detail, as well as presenting preliminary data from the evaluation. Finally, it suggests a model for incorporating technology into the science classroom.

  17. Alpha absolute power measurement in panic disorder with agoraphobia patients.

    PubMed

    de Carvalho, Marcele Regine; Velasques, Bruna Brandão; Freire, Rafael C; Cagy, Maurício; Marques, Juliana Bittencourt; Teixeira, Silmar; Rangé, Bernard P; Piedade, Roberto; Ribeiro, Pedro; Nardi, Antonio Egidio; Akiskal, Hagop Souren

    2013-10-01

    Panic attacks are thought to result from a dysfunctional coordination of cortical and brainstem sensory information, leading to heightened amygdala activity with subsequent neuroendocrine, autonomic, and behavioral activation. Prefrontal areas may be responsible for inhibitory top-down control processes, and alpha synchronization seems to reflect this modulation. The objective of this study was to measure frontal absolute alpha-power with qEEG in 24 subjects with panic disorder and agoraphobia (PDA) compared to 21 healthy controls. qEEG data were acquired while participants watched a computer simulation consisting of moments classified as "high anxiety" (HAM) and "low anxiety" (LAM). qEEG data were also acquired during two rest conditions, before and after the computer simulation display. We observed higher absolute alpha-power in controls when compared to the PDA patients while watching the computer simulation. The main finding was an interaction between the moment and group factors on the frontal cortex. Our findings suggest that the decreased alpha-power in the frontal cortex for the PDA group may reflect a state of high excitability. Our results suggest a possible deficiency in top-down control processes of anxiety, reflected by low absolute alpha-power in the PDA group while watching the computer simulation, and they highlight that prefrontal regions and the frontal region near the temporal area are recruited during exposure to anxiogenic stimuli. © 2013 Elsevier B.V. All rights reserved.

  18. Demonstrating Newton's Third Law: Changing Aristotelian Viewpoints.

    ERIC Educational Resources Information Center

    Roach, Linda E.

    1992-01-01

    Suggests techniques to help eliminate students' misconceptions involving Newton's Third Law. Approaches suggested include teaching physics from a historical perspective, using computer programs with simulations, rewording the law, drawing free-body diagrams, and using demonstrations and examples. (PR)

  19. Using computer simulation to improve high order thinking skills of physics teacher candidate students in Compton effect

    NASA Astrophysics Data System (ADS)

    Supurwoko; Cari; Sarwanto; Sukarmin; Fauzi, Ahmad; Faradilla, Lisa; Summa Dewi, Tiarasita

    2017-11-01

    The process of learning and teaching in physics is often confronted with abstract concepts, which makes them difficult for students to understand and for teachers to teach. One such material with an abstract concept is the Compton effect. The purpose of this research was to evaluate a computer simulation model of the Compton effect used to improve the higher-order thinking skills of physics teacher candidate students. This research is a case study. The subjects were physics education students who had attended Modern Physics lectures. Data were obtained through an essay test measuring students' higher-order thinking skills and questionnaires measuring students' responses. The results indicate that the computer simulation model can improve students' higher-order thinking skills and their responses. Based on this result, it is suggested that audiences use simulation media in learning.
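
    For reference, the relation at the heart of any Compton-effect simulation is the wavelength shift of a photon scattered by an electron through angle theta (standard physics, not specific to this study):

      \lambda' - \lambda = \frac{h}{m_e c}\,(1 - \cos\theta)

    where h / (m_e c) is the electron Compton wavelength, about 2.43 pm.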

  20. Simulation of the photodetachment spectrum of HHfO- using coupled-cluster calculations

    NASA Astrophysics Data System (ADS)

    Mok, Daniel K. W.; Dyke, John M.; Lee, Edmond P. F.

    2016-12-01

    The photodetachment spectrum of HHfO- was simulated using restricted-spin coupled-cluster single-double plus perturbative triple {RCCSD(T)} calculations performed on the ground electronic states of HHfO and HHfO-, employing basis sets of up to quintuple-zeta quality. The computed RCCSD(T) electron affinity of 1.67 ± 0.02 eV at the complete basis set limit, including Hf 5s25p6 core correlation and zero-point energy corrections, agrees well with the experimental value of 1.70 ± 0.05 eV from a recent photodetachment study [X. Li et al., J. Chem. Phys. 136, 154306 (2012)]. For the simulation, Franck-Condon factors were computed which included allowances for anharmonicity and Duschinsky rotation. Comparisons between simulated and experimental spectra confirm the assignments of the molecular carrier and electronic states involved but suggest that the experimental vibrational structure has suffered from poor signal-to-noise ratio. An alternative assignment of the vibrational structure to that suggested in the experimental work is presented.

  1. Differences in simulated fire spread over Askervein Hill using two advanced wind models and a traditional uniform wind field

    Treesearch

    Jason Forthofer; Bret Butler

    2007-01-01

    A computational fluid dynamics (CFD) model and a mass-consistent model were used to simulate winds, and the resulting wind fields were used to drive simulated fire spread over a simple, low hill. The results suggest that the CFD wind field could significantly change simulated fire spread compared to traditional uniform winds. The CFD fire spread case may match reality better because the winds used in the fire...

  2. A computable expression of closure to efficient causation.

    PubMed

    Mossio, Matteo; Longo, Giuseppe; Stewart, John

    2009-04-07

    In this paper, we propose a mathematical expression of closure to efficient causation in terms of lambda-calculus; we argue that this opens up the perspective of developing principled computer simulations of systems closed to efficient causation in an appropriate programming language. An important implication of our formulation is that, by exhibiting an expression in lambda-calculus, which is a paradigmatic formalism for computability and programming, we show that there are no conceptual or principled problems in realizing a computer simulation or model of closure to efficient causation. We conclude with a brief discussion of whether closure to efficient causation captures all relevant properties of living systems. We suggest that it might not, and that more complex definitions could indeed create some crucial obstacles to computability.
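
    One way to see why self-referential ("closed") definitions pose no computability problem, illustrated with a fixed-point combinator (a generic lambda-calculus device, not the authors' construction; Python is used here merely as a strict-evaluation host):

      # The Z combinator builds recursion without any function naming itself.
      Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

      fact = Z(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
      print(fact(5))  # 120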

  3. Computational Materials: Modeling and Simulation of Nanostructured Materials and Systems

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Hinkley, Jeffrey A.

    2003-01-01

    The paper provides details on the structure and implementation of the Computational Materials program at the NASA Langley Research Center. Examples are given that illustrate the suggested approaches to predicting the behavior and influencing the design of nanostructured materials such as high-performance polymers, composites, and nanotube-reinforced polymers. Primary simulation and measurement methods applicable to multi-scale modeling are outlined. Key challenges including verification and validation of models are highlighted and discussed within the context of NASA's broad mission objectives.

  4. Combining neural networks and signed particles to simulate quantum systems more efficiently

    NASA Astrophysics Data System (ADS)

    Sellier, Jean Michel

    2018-04-01

    Recently a new formulation of quantum mechanics has been suggested which describes systems by means of ensembles of classical particles provided with a sign. This novel approach mainly consists of two steps: the computation of the Wigner kernel, a multi-dimensional function describing the effects of the potential over the system, and the field-less evolution of the particles, which eventually create new signed particles in the process. Although this method has proved to be extremely advantageous in terms of computational resources - as a matter of fact, it is able to simulate many-body systems in a time-dependent fashion on relatively small machines - the Wigner kernel can represent the bottleneck of simulations of certain systems. Moreover, storing the kernel can be another issue, as the amount of memory needed is cursed by the dimensionality of the system. In this work, we introduce a new technique which drastically reduces the computation time and memory requirement to simulate time-dependent quantum systems, based on the use of an appropriately tailored neural network combined with the signed particle formalism. In particular, the suggested neural network is able to compute the Wigner kernel efficiently and reliably without any training, as its entire set of weights and biases is specified by analytical formulas. As a consequence, the amount of memory needed for quantum simulations drops radically, since the kernel no longer has to be stored: it is computed by the neural network itself, only on the cells of the (discretized) phase-space which are occupied by particles. As is clearly shown in the final part of this paper, not only does this novel approach drastically reduce the computational time, it also remains accurate. The author believes this work opens the way towards effective design of quantum devices, with incredible practical implications.

  5. Facilitating higher-fidelity simulations of axial compressor instability and other turbomachinery flow conditions

    NASA Astrophysics Data System (ADS)

    Herrick, Gregory Paul

    The quest to accurately capture flow phenomena with length-scales both short and long and to accurately represent complex flow phenomena within disparately sized geometry inspires a need for an efficient, high-fidelity, multi-block structured computational fluid dynamics (CFD) parallel computational scheme. This research presents and demonstrates a more efficient computational method by which to perform multi-block structured CFD parallel computational simulations, thus facilitating higher-fidelity solutions of complicated geometries (due to the inclusion of grids for "small" flow areas which are often merely modeled) and their associated flows. This computational framework offers greater flexibility and user-control in allocating the resource balance between process count and wall-clock computation time. The principal modifications implemented in this revision consist of a "multiple grid block per processing core" software infrastructure and an analytic computation of viscous flux Jacobians. The development of this scheme is largely motivated by the desire to simulate axial compressor stall inception with more complete gridding of the flow passages (including rotor tip clearance regions) than has been previously done while maintaining high computational efficiency (i.e., minimal consumption of computational resources), and thus this paradigm shall be demonstrated with an examination of instability in a transonic axial compressor. However, the paradigm presented herein facilitates CFD simulation of myriad previously impractical geometries and flows and is not limited to detailed analyses of axial compressor flows. While the simulations presented herein were technically possible under the previous structure of the subject software, they were much less computationally efficient and thus not pragmatically feasible; the previous research using this software to perform three-dimensional, full-annulus, time-accurate, unsteady, full-stage (with sliding-interface) simulations of rotating stall inception in axial compressors utilized tip clearance periodic models, while the scheme here is demonstrated by a simulation of axial compressor stall inception utilizing gridded rotor tip clearance regions. As will be discussed, much previous research (experimental, theoretical, and computational) has suggested that understanding clearance flow behavior is critical to understanding stall inception, and previous computational research efforts which have used tip clearance models have begged the question, "What about the clearance flows?"

  6. Parallel discrete-event simulation of FCFS stochastic queueing networks

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

    Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments) which has proven itself to be effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead. Lookahead is a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. We show how lookahead can be computed for FCFS queueing network simulations, give performance data demonstrating the method's effectiveness under moderate to heavy loads, and discuss performance tradeoffs between the quality of lookahead and the cost of computing lookahead.
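
    A sketch, under one reading of the abstract, of the lookahead an FCFS server affords once service times are presampled (the function name and numbers are invented for illustration):

      import random

      def fcfs_lookahead(clock, remaining_service, queued_service_times):
          # FCFS guarantee: a job that has not yet arrived cannot depart before
          # the server clears the work already committed, so the logical process
          # can promise ("appoint") its neighbors that no new-arrival message
          # will be sent earlier than this time.
          return clock + remaining_service + sum(queued_service_times)

      random.seed(0)
      queued = [random.expovariate(1.0) for _ in range(3)]  # presampled service times
      print(fcfs_lookahead(clock=10.0, remaining_service=0.7, queued_service_times=queued))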

  7. Computer Simulation Is an Undervalued Tool for Genetic Analysis: A Historical View and Presentation of SHIMSHON – A Web-Based Genetic Simulation Package

    PubMed Central

    Greenberg, David A.

    2011-01-01

    Computer simulation methods are under-used tools in genetic analysis because simulation approaches have been portrayed as inferior to analytic methods. Even when simulation is used, its advantages are not fully exploited. Here, I present SHIMSHON, our package of genetic simulation programs that have been developed, tested, used for research, and used to generate data for Genetic Analysis Workshops (GAW). These simulation programs, now web-accessible, can be used by anyone to answer questions about designing and analyzing genetic disease studies for locus identification. This work has three foci: (1) The historical context of SHIMSHON's development, suggesting why simulation has not been more widely used so far. (2) Advantages of simulation: computer simulation helps us to understand how genetic analysis methods work. It has advantages for understanding disease inheritance and methods for gene searches. Furthermore, simulation methods can be used to answer fundamental questions that either cannot be answered by analytical approaches or cannot even be defined until the problems are identified and studied, using simulation. (3) I argue that, because simulation was not accepted, there was a failure to grasp the meaning of some simulation-based studies of linkage. This may have contributed to perceived weaknesses in linkage analysis; weaknesses that did not, in fact, exist. PMID:22189467

  8. A Special Topic From Nuclear Reactor Dynamics for the Undergraduate Physics Curriculum

    ERIC Educational Resources Information Center

    Sevenich, R. A.

    1977-01-01

    Presents an intuitive derivation of the point reactor equations followed by formulation of equations for inverse and direct kinetics which are readily programmed on a digital computer. Suggests several computer simulations involving the effect of control rod motion on reactor power. (MLH)

  9. Computation, prediction, and experimental tests of fitness for bacteriophage T7 mutants with permuted genomes

    NASA Astrophysics Data System (ADS)

    Endy, Drew; You, Lingchong; Yin, John; Molineux, Ian J.

    2000-05-01

    We created a simulation based on experimental data from bacteriophage T7 that computes the developmental cycle of the wild-type phage and also of mutants that have an altered genome order. We used the simulation to compute the fitness of more than 10^5 mutants. We tested these computations by constructing and experimentally characterizing T7 mutants in which we repositioned gene 1, coding for T7 RNA polymerase. Computed protein synthesis rates for ectopic gene 1 strains were in moderate agreement with observed rates. Computed phage-doubling rates were close to observations for two of four strains, but significantly overestimated those of the other two. Computations indicate that the genome organization of wild-type T7 is nearly optimal for growth: only 2.8% of random genome permutations were computed to grow faster, the highest 31% faster, than wild type. Specific discrepancies between computations and observations suggest that a better understanding of the translation efficiency of individual mRNAs and the functions of qualitatively "nonessential" genes will be needed to improve the T7 simulation. In silico representations of biological systems can serve to assess and advance our understanding of the underlying biology. Iteration between computation, prediction, and observation should increase the rate at which biological hypotheses are formulated and tested.

  10. On the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods

    PubMed Central

    Lee, Anthony; Yau, Christopher; Giles, Michael B.; Doucet, Arnaud; Holmes, Christopher C.

    2011-01-01

    We present a case-study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multi-core processors that they are cheap, easily accessible, easy to maintain, easy to code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups from 35 to 500 fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modelling into complex data rich domains through the availability of cheap and accessible many-core computation. We believe the speedup we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. PMID:22003276
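
    To illustrate the embarrassing parallelism that such population-based methods expose, here is a toy sketch in which NumPy array lanes stand in for GPU threads, each lane running one independent Metropolis chain on a standard normal target (our illustration, not the authors' code):

        import numpy as np

        def parallel_mc_mean(n_chains=4096, n_steps=500, seed=0):
            """Run many independent Metropolis chains side by side; every
            array operation updates all chains at once, the way a GPU
            assigns one chain per core."""
            rng = np.random.default_rng(seed)
            x = rng.normal(size=n_chains)
            for _ in range(n_steps):
                prop = x + rng.normal(size=n_chains)
                log_alpha = 0.5 * (x**2 - prop**2)  # N(0,1) log-density ratio
                accept = np.log(rng.random(n_chains)) < log_alpha
                x = np.where(accept, prop, x)
            return x.mean(), x.std()

        print(parallel_mc_mean())  # close to (0, 1)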

  11. Application of artificial neural networks to gaming

    NASA Astrophysics Data System (ADS)

    Baba, Norio; Kita, Tomio; Oda, Kazuhiro

    1995-04-01

    Recently, neural network technology has been applied to various real-world problems, and it has succeeded in producing a large number of intelligent systems. In this article, we suggest that it could also be applied to the field of gaming. In particular, we suggest that the neural network model could be used to mimic players' characters. Several computer simulation results, obtained using a gaming system that is a modified version of the COMMONS GAME, confirm our idea.

  12. Computer Simulations of Resonant Coherent Excitation of Heavy Hydrogen-Like Ions Under Planar Channeling

    NASA Astrophysics Data System (ADS)

    Babaev, A. A.; Pivovarov, Yu L.

    2010-04-01

    Resonant coherent excitation (RCE) of relativistic hydrogen-like ions is investigated by computer simulation methods. The suggested theoretical model is applied to simulations of recent experiments on RCE of 390 MeV/u Ar17+ ions under (220) planar channeling in a Si crystal, performed by T. Azuma et al. at HIMAC (Tokyo). Theoretical results are in good agreement with these experimental data and clearly show the appearance of the doublet structure of RCE peaks. The simulations are also extended to greater ion energies in order to predict new RCE features at the future accelerator facility FAIR (GSI); as an example, RCE of 11 GeV/u U91+ ions is considered in detail.

  13. Education Calls for a New Philosophy.

    ERIC Educational Resources Information Center

    Scheidlinger, Zygmunt

    1999-01-01

    Highlights changes brought on by computers and technological advancement and notes that only those with a vision of the future can direct and participate in the evolution of education. Suggests that virtual reality, simulation, animation and other computer-based features will render traditional class learning futile and that computerized education…

  14. Application of Psychological Theories in Agent-Based Modeling: The Case of the Theory of Planned Behavior.

    PubMed

    Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo

    2018-01-01

    It is likely that computer simulations will assume a greater role in the near future in investigating and understanding reality (Rand & Rust, 2011). In particular, agent-based models (ABMs) represent a method of investigation of social phenomena that blends the knowledge of the social sciences with the advantages of virtual simulations. Within this context, the development of algorithms able to recreate the reasoning engine of autonomous virtual agents represents one of the most fragile aspects, and it is crucial to ground such models in well-supported psychological theoretical frameworks. For this reason, the present work discusses the application of the theory of planned behavior (TPB; Ajzen, 1991) in the context of agent-based modeling: it is argued that this framework might be more helpful than others in developing a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to the application of the model proposed by the TPB inside computer simulations and suggests potential solutions, in the hope of helping to shorten the distance between the fields of psychology and computer science.
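
    As a flavor of what such a translation can look like, here is a minimal hypothetical sketch (the weights and threshold are invented for illustration and are not the authors' model) in which a TPB-style agent computes its intention as a weighted sum of attitude, subjective norm, and perceived behavioral control:

        import random
        from dataclasses import dataclass

        @dataclass
        class TPBAgent:
            """Toy agent whose intention follows the theory of planned
            behavior: a weighted combination of attitude, subjective norm,
            and perceived behavioral control (all scaled to [0, 1])."""
            attitude: float
            subjective_norm: float
            perceived_control: float
            # Hypothetical regression-style weights; in a real model these
            # would be estimated from survey data.
            w_att: float = 0.4
            w_sn: float = 0.3
            w_pbc: float = 0.3

            def intention(self) -> float:
                return (self.w_att * self.attitude
                        + self.w_sn * self.subjective_norm
                        + self.w_pbc * self.perceived_control)

            def acts(self, threshold=0.5) -> bool:
                # Actual control also gates the intention-behavior link.
                return (self.intention() > threshold
                        and random.random() < self.perceived_control)

        agents = [TPBAgent(random.random(), random.random(), random.random())
                  for _ in range(1000)]
        print(sum(a.acts() for a in agents), "of 1000 agents acted")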

  15. The Promise of Quantum Simulation.

    PubMed

    Muller, Richard P; Blume-Kohout, Robin

    2015-08-25

    Quantum simulations promise to be one of the primary applications of quantum computers, should one be constructed. This article briefly summarizes the history of quantum simulation in light of the recent result of Wang and co-workers, demonstrating calculation of the ground and excited states for a HeH(+) molecule, and concludes with a discussion of why this and other recent progress in the field suggest that quantum simulations of quantum chemistry have a bright future.

  16. Experimental measurement of microwave ablation heating pattern and comparison to computer simulations.

    PubMed

    Deshazer, Garron; Prakash, Punit; Merck, Derek; Haemmerich, Dieter

    2017-02-01

    For computational models of microwave ablation (MWA), knowledge of the antenna design is necessary, but the proprietary design of clinical applicators is often unknown. We characterised the specific absorption rate (SAR) during MWA experimentally and compared it to a multi-physics simulation. An infrared (IR) camera was used to measure SAR during MWA within a split ex vivo liver model. Perseon Medical's short-tip (ST) or long-tip (LT) MWA antenna was placed on top of a tissue sample (n = 6), and microwave power (15 W) was applied for 6 min, while intermittently interrupting power. Tissue surface temperature was recorded via IR camera (3.3 fps, 320 × 240 resolution). SAR was calculated intermittently based on the temperature slope before and after power interruption. Temperature and SAR data were compared to simulation results. Experimentally measured SAR changed considerably once tissue temperatures exceeded 100 °C, contrary to simulation results. The ablation zone diameters were 1.28 cm and 1.30 ± 0.03 cm (transverse), and 2.10 cm and 2.66 ± 0.22 cm (axial), for simulation and experiment, respectively. The average differences in temperature between simulation and experiment were 5.6 °C (ST) and 6.2 °C (LT). Dice coefficients for the 1000 W/kg SAR iso-contour were 0.74 ± 0.01 (ST) and 0.77 ± 0.03 (LT), suggesting good agreement of SAR contours. We experimentally demonstrated changes in SAR during MWA, which were not present in simulation, suggesting inaccuracies in dielectric properties. The measured SAR may be used in simplified computer simulations to predict tissue temperature when the antenna geometry is unknown.
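
    The slope-based SAR estimate is straightforward to reproduce. Below is a hypothetical single-pixel sketch (the function name, specific-heat value and synthetic data are ours, not the authors'): subtracting the temperature slope just after shutoff from the slope just before cancels the conduction and perfusion terms common to both fits.

        import numpy as np

        def sar_from_interruption(t, T, t_off, c_p=3600.0, window=2.0):
            """Estimate SAR (W/kg) at one pixel from the change in the
            temperature slope when power is interrupted at time t_off:
            SAR ~= c_p * (dT/dt before shutoff - dT/dt after shutoff).
            c_p is an assumed nominal tissue specific heat, J/(kg K)."""
            before = (t >= t_off - window) & (t < t_off)
            after = (t >= t_off) & (t < t_off + window)
            slope_on = np.polyfit(t[before], T[before], 1)[0]
            slope_off = np.polyfit(t[after], T[after], 1)[0]
            return c_p * (slope_on - slope_off)

        # Synthetic frames at ~3.3 fps: heating at 0.4 K/s, then conductive
        # cooling at 0.1 K/s after shutoff at t = 5 s.
        t = np.linspace(0.0, 10.0, 34)
        T = np.where(t < 5.0, 37.0 + 0.4 * t, 39.0 - 0.1 * (t - 5.0))
        print(sar_from_interruption(t, T, t_off=5.0))  # ~= 3600 * 0.5 = 1800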

  17. Removing systematic errors in interionic potentials of mean force computed in molecular simulations using reaction-field-based electrostatics

    PubMed Central

    Baumketner, Andrij

    2009-01-01

    The performance of reaction-field methods to treat electrostatic interactions is tested in simulations of ions solvated in water. The potential of mean force between a sodium chloride ion pair and between the side chains of lysine and aspartate is computed using umbrella sampling and molecular dynamics simulations. It is found that in comparison with lattice sum calculations, the charge-group-based approaches to reaction-field treatments produce a large error in the association energy of the ions that exhibits strong systematic dependence on the size of the simulation box. The atom-based implementation of the reaction field is seen to (i) improve the overall quality of the potential of mean force and (ii) remove the dependence on the size of the simulation box. It is suggested that the atom-based truncation be used in reaction-field simulations of mixed media. PMID:19292522
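
    For context, the truncation schemes in question modify the pairwise Coulomb term so that it vanishes smoothly at the cutoff. A small sketch of the standard Tironi-style reaction-field pair energy used in many MD codes (the cutoff and dielectric values below are illustrative, not those of the paper):

        import numpy as np

        def reaction_field_energy(qi, qj, r, r_c=1.2, eps_rf=78.0):
            """Pairwise Coulomb energy with a reaction-field correction:
            V(r) = f*qi*qj*(1/r + k_rf*r**2 - c_rf) for r < r_c, else 0.
            Units: charges in e, distances in nm, energy in kJ/mol."""
            f = 138.935  # 1/(4*pi*eps0) in kJ mol^-1 nm e^-2
            k_rf = (eps_rf - 1.0) / ((2.0 * eps_rf + 1.0) * r_c**3)
            c_rf = 1.0 / r_c + k_rf * r_c**2  # makes V(r_c) = 0
            return np.where(r < r_c,
                            f * qi * qj * (1.0 / r + k_rf * r**2 - c_rf),
                            0.0)

        r = np.linspace(0.25, 1.5, 6)
        print(reaction_field_energy(+1, -1, r))  # Na+/Cl-: smooth decay to 0 at r_c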

  18. Using Palm Technology in Participatory Simulations of Complex Systems: A New Take on Ubiquitous and Accessible Mobile Computing

    NASA Astrophysics Data System (ADS)

    Klopfer, Eric; Yoon, Susan; Perry, Judy

    2005-09-01

    This paper reports on teachers' perceptions of the educational affordances of a handheld application called Participatory Simulations. It presents evidence from five cases representing each of the populations who work with these computational tools. Evidence across multiple data sources yield similar results to previous research evaluations of handheld activities with respect to enhancing motivation, engagement and self-directed learning. Three additional themes are discussed that provide insight into understanding curricular applicability of Participatory Simulations that suggest a new take on ubiquitous and accessible mobile computing. These themes generally point to the multiple layers of social and cognitive flexibility intrinsic to their design: ease of adaptation to subject-matter content knowledge and curricular integration; facility in attending to teacher-individualized goals; and encouraging the adoption of learner-centered strategies.

  19. The effectiveness of using computer simulated experiments on junior high students' understanding of the volume displacement concept

    NASA Astrophysics Data System (ADS)

    Choi, Byung-Soon; Gennaro, Eugene

    Several researchers have suggested that the computer holds much promise as a tool for science teachers for use in their classrooms (Bork, 1979; Lunetta & Hofstein, 1981). It also has been said that there needs to be more research in determining the effectiveness of computer software (Tinker, 1983). This study compared the effectiveness of microcomputer-simulated experiences with that of parallel instruction involving hands-on laboratory experiences for teaching the concept of volume displacement to junior high school students. This study also assessed the differential effect on students' understanding of the volume displacement concept using sex of the students as another independent variable. In addition, it compared the degree of retention, after 45 days, of both treatment groups. It was found that computer-simulated experiences were as effective as hands-on laboratory experiences, and that males, having had hands-on laboratory experiences, performed better on the posttest than females having had the hands-on laboratory experiences. There were no significant differences in performance when comparing males with females using the computer simulation in the learning of the displacement concept. This study also showed that there were no significant differences in the retention levels when the retention scores of the computer simulation groups were compared to those that had the hands-on laboratory experiences. However, an ANOVA of the retention test scores revealed that males in both treatment conditions retained knowledge of volume displacement better than females.

  20. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  1. The promise of quantum simulation

    DOE PAGES

    Muller, Richard P.; Blume-Kohout, Robin

    2015-07-21

    Quantum simulations promise to be one of the primary applications of quantum computers, should one be constructed. This article briefly summarizes the history of quantum simulation in light of the recent result of Wang and co-workers, demonstrating calculation of the ground and excited states for a HeH+ molecule, and concludes with a discussion of why this and other recent progress in the field suggest that quantum simulations of quantum chemistry have a bright future.

  2. Effects of Geometric Details on Slat Noise Generation and Propagation

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Lockard, David P.

    2009-01-01

    The relevance of geometric details to the generation and propagation of noise from leading-edge slats is considered. Typically, such details are omitted in computational simulations and model-scale experiments, thereby creating ambiguities in comparisons with acoustic results from flight tests. The current study uses two-dimensional computational simulations in conjunction with a Ffowcs Williams-Hawkings (FW-H) solver to investigate the effects of previously neglected slat "bulb" and "blade" seals on the local flow field and the associated acoustic radiation. The computations show that the presence of the "blade" seal at the cusp in the simulated geometry significantly changes the slat cove flow dynamics, reduces the amplitudes of the radiated sound, and to a lesser extent, alters the directivity beneath the airfoil. Furthermore, the computations suggest that a modest extension of the baseline "blade" seal further enhances the suppression of slat noise. As a side issue, the utility and equivalence of the FW-H methodology for calculating far-field noise, as opposed to a more direct approach, is examined and demonstrated.

  3. Dynamic Mesh CFD Simulations of Orion Parachute Pendulum Motion During Atmospheric Entry

    NASA Technical Reports Server (NTRS)

    Halstrom, Logan D.; Schwing, Alan M.; Robinson, Stephen K.

    2016-01-01

    This paper demonstrates the usage of computational fluid dynamics to study the effects of the pendulum motion dynamics of NASA's Orion Multi-Purpose Crew Vehicle parachute system on the stability of the vehicle's atmospheric entry and descent. Significant computational fluid dynamics testing has already been performed at NASA's Johnson Space Center, but this study sought to investigate the effect of bulk motion of the parachute, such as pitching, on the induced aerodynamic forces. Simulations were performed with a moving grid geometry oscillating according to the parameters observed in flight tests. As with the previous simulations, the OVERFLOW computational fluid dynamics tool is used with the assumption of rigid, non-permeable geometry. Comparison to parachute wind tunnel tests is included for a preliminary validation of the dynamic mesh model. Results show qualitative differences in the flow fields of the static and dynamic simulations and quantitative differences in the induced aerodynamic forces, suggesting that dynamic mesh modeling of the parachute pendulum motion may uncover additional dynamic effects.

  4. In vitro protease cleavage and computer simulations reveal the HIV-1 capsid maturation pathway

    NASA Astrophysics Data System (ADS)

    Ning, Jiying; Erdemci-Tandogan, Gonca; Yufenyuy, Ernest L.; Wagner, Jef; Himes, Benjamin A.; Zhao, Gongpu; Aiken, Christopher; Zandi, Roya; Zhang, Peijun

    2016-12-01

    HIV-1 virions assemble as immature particles containing Gag polyproteins that are processed by the viral protease into individual components, resulting in the formation of mature infectious particles. There are two competing models for the process of forming the mature HIV-1 core: the disassembly and de novo reassembly model and the non-diffusional displacive model. To study the maturation pathway, we simulate HIV-1 maturation in vitro by digesting immature particles and assembled virus-like particles with recombinant HIV-1 protease and monitor the process with biochemical assays and cryoEM structural analysis in parallel. Processing of Gag in vitro is accurate and efficient and results in both soluble capsid protein and conical or tubular capsid assemblies, seemingly converted from immature Gag particles. Computer simulations further reveal probable assembly pathways of HIV-1 capsid formation. Combining the experimental data and computer simulations, our results suggest a sequential combination of both displacive and disassembly/reassembly processes for HIV-1 maturation.

  5. Predictive simulation of gait at low gravity reveals skipping as the preferred locomotion strategy

    PubMed Central

    Ackermann, Marko; van den Bogert, Antonie J.

    2012-01-01

    The investigation of gait strategies in low-gravity environments gained momentum recently as manned missions to the Moon and to Mars are reconsidered. Although reports by astronauts of the Apollo missions indicate alternative gait strategies might be favored on the Moon, computational simulations and experimental investigations have been almost exclusively limited to the study of either walking or running, the locomotion modes preferred under Earth's gravity. In order to investigate the gait strategies likely to be favored at low gravity, a series of predictive, computational simulations of gait are performed using a physiological model of the musculoskeletal system, without assuming any particular type of gait. A computationally efficient optimization strategy is utilized, allowing for multiple simulations. The results reveal skipping as more efficient and less fatiguing than walking or running and suggest the existence of a walk-skip rather than a walk-run transition at low gravity. The results are expected to serve as a background to the design of experimental investigations of gait under simulated low gravity. PMID:22365845

  6. Predictive simulation of gait at low gravity reveals skipping as the preferred locomotion strategy.

    PubMed

    Ackermann, Marko; van den Bogert, Antonie J

    2012-04-30

    The investigation of gait strategies in low-gravity environments gained momentum recently as manned missions to the Moon and to Mars are reconsidered. Although reports by astronauts of the Apollo missions indicate alternative gait strategies might be favored on the Moon, computational simulations and experimental investigations have been almost exclusively limited to the study of either walking or running, the locomotion modes preferred under Earth's gravity. In order to investigate the gait strategies likely to be favored at low gravity, a series of predictive, computational simulations of gait are performed using a physiological model of the musculoskeletal system, without assuming any particular type of gait. A computationally efficient optimization strategy is utilized, allowing for multiple simulations. The results reveal skipping as more efficient and less fatiguing than walking or running and suggest the existence of a walk-skip rather than a walk-run transition at low gravity. The results are expected to serve as a background to the design of experimental investigations of gait under simulated low gravity. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Asthma management simulation for children: translating theory, methods, and strategies to effect behavior change.

    PubMed

    Shegog, Ross; Bartholomew, L Kay; Gold, Robert S; Pierrel, Elaine; Parcel, Guy S; Sockrider, Marianna M; Czyzewski, Danita I; Fernandez, Maria E; Berlin, Nina J; Abramson, Stuart

    2006-01-01

    Translating behavioral theories, models, and strategies to guide the development and structure of computer-based health applications is well recognized, although a continued challenge for program developers. A stepped approach to translate behavioral theory in the design of simulations to teach chronic disease management to children is described. This includes the translation steps to: 1) define target behaviors and their determinants, 2) identify theoretical methods to optimize behavioral change, and 3) choose educational strategies to effectively apply these methods and combine these into a cohesive computer-based simulation for health education. Asthma is used to exemplify a chronic health management problem and a computer-based asthma management simulation (Watch, Discover, Think and Act) that has been evaluated and shown to effect asthma self-management in children is used to exemplify the application of theory to practice. Impact and outcome evaluation studies have indicated the effectiveness of these steps in providing increased rigor and accountability, suggesting their utility for educators and developers seeking to apply simulations to enhance self-management behaviors in patients.

  8. Reactive multiphase flow simulation workshop summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VanderHeyden, W.B.

    1995-09-01

    A workshop on computer simulation of reactive multiphase flow was held on May 18 and 19, 1995 in the Computational Testbed for Industry at Los Alamos National Laboratory (LANL), Los Alamos, New Mexico. Approximately 35 to 40 people attended the workshop. This included 21 participants from 12 companies representing the petroleum, chemical, environmental and consumer products industries, two representatives from the DOE Office of Industrial Technologies and several from Los Alamos. The dialog at the meeting suggested that reactive multiphase flow simulation represents an excellent candidate for government/industry/academia collaborative research. A white paper on a potential consortium for reactive multiphase flow with input from workshop participants will be issued separately.

  9. An investigation of the use of microcomputer-based laboratory simulations in promoting conceptual understanding in secondary physics instruction

    NASA Astrophysics Data System (ADS)

    Tomshaw, Stephen G.

    Physics education research has shown that students bring alternate conceptions to the classroom which can be quite resistant to traditional instruction methods (Clement, 1982; Halloun & Hestenes, 1985; McDermott, 1991). Microcomputer-based laboratory (MBL) experiments that employ an active-engagement strategy have been shown to improve student conceptual understanding in high school and introductory university physics courses (Thornton & Sokoloff, 1998). These MBL experiments require a specialized computer interface, type-specific sensors (e.g. motion detectors, force probes, accelerometers), and specialized software in addition to the standard physics experimental apparatus. Tao and Gunstone (1997) have shown that computer simulations used in an active engagement environment can also lead to conceptual change. This study investigated 69 secondary physics students' use of computer simulations of MBL activities in place of the hands-on MBL activities. The average normalized gain in students' conceptual understanding was measured using the Force and Motion Conceptual Evaluation (FMCE). Student attitudes towards physics and computers were probed using the Views About Science Survey (VASS) and the Computer Attitude Scale (CAS). While it may be possible to obtain an equivalent level of conceptual understanding using computer simulations in combination with an active-engagement environment, this study found no significant gains in students' conceptual understanding (g = -0.02) after they completed a series of nine simulated experiments from the Tools for Scientific Thinking curriculum (Thornton & Sokoloff, 1990). The absence of gains in conceptual understanding may indicate that either the simulations were ineffective in promoting conceptual change or that problems with the implementation of the treatment inhibited its effectiveness. There was a positive shift in students' attitudes towards physics in the VASS dimensions of structure and reflective thinking, while there was a negative shift in students' attitudes towards computers in the CAS subscales of anxiety and usefulness. The negative shift in attitudes towards computers may be due to the additional time and work required by the students to perform the simulation experiments with no apparent reward in terms of their physics grade. Suggestions for future research include a qualitative element to observe student interactions and alternate formats for the simulations themselves.

  10. Introducing computational thinking through hands-on projects using R with applications to calculus, probability and data analysis

    NASA Astrophysics Data System (ADS)

    Benakli, Nadia; Kostadinov, Boyan; Satyanarayana, Ashwin; Singh, Satyanand

    2017-04-01

    The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students through hands-on computer experiments. These activities have the potential to empower students to learn, create and invent with technology, and they engage computational thinking through simulations, visualizations and data analysis. We present nine computer experiments, and suggest a few more, with applications to calculus, probability and data analysis. We are using the free (open-source) statistical programming language R. Our goal is to give a taste of what R offers rather than to present a comprehensive tutorial on the R language. In our experience, these kinds of interactive computer activities can be easily integrated into a smart classroom. Furthermore, these activities do tend to keep students motivated and actively engaged in the process of learning, problem solving and developing a better intuition for understanding complex mathematical concepts.
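
    The paper's activities are written in R; purely as an illustration of the genre (our own toy example, in Python rather than R), the classic Monte Carlo estimate of pi combines probability, simulation and a little calculus in a handful of lines:

        import random

        def estimate_pi(n=100_000):
            """The fraction of random points in the unit square that fall
            inside the quarter circle approaches pi/4 as n grows."""
            hits = sum(random.random()**2 + random.random()**2 <= 1.0
                       for _ in range(n))
            return 4 * hits / n

        print(estimate_pi())  # ~3.14, improving as n increases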

  11. Subject-specific computer simulation model for determining elbow loading in one-handed tennis backhand groundstrokes.

    PubMed

    King, Mark A; Glynn, Jonathan A; Mitchell, Sean R

    2011-11-01

    A subject-specific angle-driven computer model of a tennis player, combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts, was developed to determine the effect of ball-racket impacts on loading at the elbow for one-handed backhand groundstrokes. Subject-specific computer simulations were matched to a typical topspin/slice one-handed backhand groundstroke performed by an elite tennis player, with root mean square differences between performance and matching simulations of < 0.5 degrees over a 50 ms period starting from ball impact. Simulation results suggest that for similar ball-racket impact conditions, the difference in elbow loading for a topspin and a slice one-handed backhand groundstroke is relatively small. In this study, the relatively small differences in elbow loading may be due to comparable angle-time histories at the wrist and elbow joints, with the major kinematic differences occurring at the shoulder. Using a subject-specific angle-driven computer model combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts allows peak internal loading, net impulse, and shock due to ball-racket impact to be calculated, which would not otherwise be possible without impractical invasive techniques. This study provides a basis for further investigation of the factors that may increase elbow loading during tennis strokes.

  12. Space lab system analysis

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.; Rives, T. B.

    1987-01-01

    An analytical analysis of the HOSC Generic Peripheral processing system was conducted. The results are summarized, and they indicate that the maximum delay in performing screen change requests should be less than 2.5 sec, occurring for a slow VAX host to video screen I/O rate of 50 KBps. This delay is due to the average I/O rate from the video terminals to their host computer. The software structure of the main computers and the host computers will have a greater impact on screen change or refresh response times. The HOSC data system model was updated by a newly coded PASCAL-based simulation program which was installed on the HOSC VAX system. This model is described and documented. Suggestions are offered to fine-tune the performance of the Ethernet interconnection network. Suggestions for using the Nutcracker by Excelan to trace itinerant packets which appear on the network from time to time were offered in discussions with the HOSC personnel. Several visits to the HOSC facility were made to install and demonstrate the simulation model.

  13. Learning Nonadjacent Dependencies: No Need for Algebraic-Like Computations

    ERIC Educational Resources Information Center

    Perruchet, Pierre; Tyler, Michael D.; Galland, Nadine; Peereman, Ronald

    2004-01-01

    Is it possible to learn the relation between 2 nonadjacent events? M. Peña, L. L. Bonatti, M. Nespor, and J. Mehler (2002) claimed this to be possible, but only in conditions suggesting the involvement of algebraic-like computations. The present article reports simulation studies and experimental data showing that the observations on which Peña et…

  14. A computational study of liposome logic: towards cellular computing from the bottom up

    PubMed Central

    Smaldon, James; Romero-Campero, Francisco J.; Fernández Trillo, Francisco; Gheorghe, Marian; Alexander, Cameron

    2010-01-01

    In this paper we propose a new bottom-up approach to cellular computing, in which computational chemical processes are encapsulated within liposomes. This "liposome logic" approach (also called vesicle computing) makes use of supra-molecular chemistry constructs, e.g. protocells, chells, etc., as minimal cellular platforms to which logical functionality can be added. Modeling and simulations feature prominently in "top-down" synthetic biology, particularly in the specification, design and implementation of logic circuits through bacterial genome reengineering. The second contribution of this paper is the demonstration of a novel set of tools for the specification, modelling and analysis of "bottom-up" liposome logic. In particular, simulation and modelling techniques are used to analyse some example liposome logic designs, ranging from relatively simple NOT gates and NAND gates to SR latches and D flip-flops, all the way to 3-bit ripple counters. The approach we propose consists of specifying, by means of P systems, gene regulatory network-like systems operating inside proto-membranes. This P systems specification can be automatically translated and executed through a multiscale pipeline composed of a dissipative particle dynamics (DPD) simulator and Gillespie's stochastic simulation algorithm (SSA). Finally, model selection and analysis can be performed through a model checking phase. This is the first paper we are aware of that brings to bear formal specifications, DPD, SSA and model checking on the problem of modeling target computational functionality in protocells. Potential chemical routes for the laboratory implementation of these simulations are also discussed, thus suggesting for the first time a potentially realistic physicochemical implementation for membrane computing from the bottom up. PMID:21886681
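
    Gillespie's SSA, the innermost engine of such pipelines, is compact enough to sketch generically; the version below is a textbook formulation, not the authors' multiscale implementation:

        import math
        import random

        def gillespie(propensities, updates, x0, t_max):
            """Simulate one exact trajectory of a stochastic reaction
            network: propensities(x) returns the reaction rates in state x,
            and updates[i] is the state change vector of reaction i."""
            t, x = 0.0, list(x0)
            traj = [(t, tuple(x))]
            while t < t_max:
                a = propensities(x)
                a0 = sum(a)
                if a0 == 0.0:
                    break
                t += -math.log(random.random()) / a0  # exponential waiting time
                r, s = random.random() * a0, 0.0
                for i, ai in enumerate(a):  # choose reaction i with prob a_i/a0
                    s += ai
                    if r < s:
                        x = [xj + dx for xj, dx in zip(x, updates[i])]
                        break
                traj.append((t, tuple(x)))
            return traj

        # Toy gene-expression-like network: production (rate 10/s) and
        # first-order decay (rate 0.5/s per molecule) of one species X.
        traj = gillespie(lambda x: [10.0, 0.5 * x[0]], [(1,), (-1,)], (0,), 50.0)
        print(traj[-1])  # X fluctuates around 10/0.5 = 20 molecules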

  15. Parameter Estimation for a Turbulent Buoyant Jet Using Approximate Bayesian Computation

    NASA Astrophysics Data System (ADS)

    Christopher, Jason D.; Wimer, Nicholas T.; Hayden, Torrey R. S.; Lapointe, Caelan; Grooms, Ian; Rieker, Gregory B.; Hamlington, Peter E.

    2016-11-01

    Approximate Bayesian Computation (ABC) is a powerful tool that allows sparse experimental or other "truth" data to be used for the prediction of unknown model parameters in numerical simulations of real-world engineering systems. In this presentation, we introduce the ABC approach and then use ABC to predict unknown inflow conditions in simulations of a two-dimensional (2D) turbulent, high-temperature buoyant jet. For this test case, truth data are obtained from a simulation with known boundary conditions and problem parameters. Using spatially-sparse temperature statistics from the 2D buoyant jet truth simulation, we show that the ABC method provides accurate predictions of the true jet inflow temperature. The success of the ABC approach in the present test suggests that ABC is a useful and versatile tool for engineering fluid dynamics research.
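
    In its simplest rejection form, ABC needs only a prior sampler, a forward simulation and a tolerance on a summary statistic. A toy sketch (the buoyant-jet stand-in below is invented for illustration, not the authors' solver):

        import random
        import statistics

        def abc_rejection(simulate, observed, prior_sample, eps, n_accept=200):
            """Rejection ABC: draw a parameter from the prior, run the
            forward model, and keep the draw if the simulated summary
            statistic lands within eps of the observed one."""
            accepted = []
            while len(accepted) < n_accept:
                theta = prior_sample()
                if abs(simulate(theta) - observed) < eps:
                    accepted.append(theta)
            return accepted

        # Stand-in for the jet problem: infer an unknown inflow temperature
        # from a noisy mean-temperature summary statistic.
        truth = 350.0
        forward = lambda th: th + random.gauss(0.0, 2.0)  # model + noise
        posterior = abc_rejection(forward, observed=forward(truth),
                                  prior_sample=lambda: random.uniform(300, 400),
                                  eps=3.0)
        print(statistics.mean(posterior))  # clusters near 350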

  16. Quantitative, steady-state properties of Catania's computational model of the operant reserve.

    PubMed

    Berg, John P; McDowell, J J

    2011-05-01

    Catania (2005) found that a computational model of the operant reserve (Skinner, 1938) produced realistic behavior in initial, exploratory analyses. Although Catania's operant reserve computational model demonstrated potential to simulate varied behavioral phenomena, the model was not systematically tested. The current project replicated and extended the Catania model, clarified its capabilities through systematic testing, and determined the extent to which it produces behavior corresponding to matching theory. Significant departures from both classic and modern matching theory were found in behavior generated by the model across all conditions. The results suggest that a simple, dynamic operant model of the reflex reserve does not simulate realistic steady state behavior. Copyright © 2011 Elsevier B.V. All rights reserved.
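
    For reference, the generalized matching law against which such steady-state behavior is typically assessed (stated here for context in Baum's standard form, not quoted from the paper) is

        % Generalized matching law: the response ratio tracks the
        % reinforcement ratio with sensitivity a and bias b.
        \log\frac{B_1}{B_2} = a\,\log\frac{R_1}{R_2} + \log b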

  17. Efficient electron open boundaries for simulating electrochemical cells

    NASA Astrophysics Data System (ADS)

    Zauchner, Mario G.; Horsfield, Andrew P.; Todorov, Tchavdar N.

    2018-01-01

    Nonequilibrium electrochemistry raises new challenges for atomistic simulation: we need to perform molecular dynamics for the nuclear degrees of freedom with an explicit description of the electrons, which in turn must be free to enter and leave the computational cell. Here we present a limiting form for electron open boundaries that we expect to apply when the magnitude of the electric current is determined by the drift and diffusion of ions in a solution and which is sufficiently computationally efficient to be used with molecular dynamics. We present tight-binding simulations of a parallel-plate capacitor with nothing, a dimer, or an atomic wire situated in the space between the plates. These simulations demonstrate that this scheme can be used to perform molecular dynamics simulations when there is an applied bias between two metal plates with, at most, weak electronic coupling between them. This simple system captures some of the essential features of an electrochemical cell, suggesting this approach might be suitable for simulations of electrochemical cells out of equilibrium.

  18. Large-Scale Simulations of Plastic Neural Networks on Neuromorphic Hardware

    PubMed Central

    Knight, James C.; Tully, Philip J.; Kaplan, Bernhard A.; Lansner, Anders; Furber, Steve B.

    2016-01-01

    SpiNNaker is a digital, neuromorphic architecture designed for simulating large-scale spiking neural networks at speeds close to biological real-time. Rather than using bespoke analog or digital hardware, the basic computational unit of a SpiNNaker system is a general-purpose ARM processor, allowing it to be programmed to simulate a wide variety of neuron and synapse models. This flexibility is particularly valuable in the study of biological plasticity phenomena. A recently proposed learning rule based on the Bayesian Confidence Propagation Neural Network (BCPNN) paradigm offers a generic framework for modeling the interaction of different plasticity mechanisms using spiking neurons. However, it can be computationally expensive to simulate large networks with BCPNN learning since it requires multiple state variables for each synapse, each of which needs to be updated every simulation time-step. We discuss the trade-offs in efficiency and accuracy involved in developing an event-based BCPNN implementation for SpiNNaker based on an analytical solution to the BCPNN equations, and detail the steps taken to fit this within the limited computational and memory resources of the SpiNNaker architecture. We demonstrate this learning rule by learning temporal sequences of neural activity within a recurrent attractor network which we simulate at scales of up to 2.0 × 10⁴ neurons and 5.1 × 10⁷ plastic synapses: the largest plastic neural network ever to be simulated on neuromorphic hardware. We also run a comparable simulation on a Cray XC-30 supercomputer system and find that, if it is to match the run-time of our SpiNNaker simulation, the supercomputer system uses approximately 45× more power. This suggests that cheaper, more power-efficient neuromorphic systems are becoming useful discovery tools in the study of plasticity in large-scale brain models. PMID:27092061

  19. Numerical simulations of clinical focused ultrasound functional neurosurgery

    NASA Astrophysics Data System (ADS)

    Pulkkinen, Aki; Werner, Beat; Martin, Ernst; Hynynen, Kullervo

    2014-04-01

    A computational model utilizing grid and finite difference methods was developed to simulate focused ultrasound functional neurosurgery interventions. The model couples the propagation of ultrasound in fluids (soft tissues) and solids (skull) with acoustic and visco-elastic wave equations. The computational model was applied to simulate clinical focused ultrasound functional neurosurgery treatments performed in patients suffering from therapy-resistant chronic neuropathic pain. Datasets of five patients were used to derive the treatment geometry. Eight sonications performed in the treatments were then simulated with the developed model. Computations were performed by driving the simulated phased array ultrasound transducer with the acoustic parameters used in the treatments. Resulting focal temperatures and size of the thermal foci were compared quantitatively, in addition to qualitative inspection of the simulated pressure and temperature fields. This study found that the computational model and the simulation parameters predicted an average of 24 ± 13% lower focal temperature elevations than observed in the treatments. The size of the simulated thermal focus was found to be 40 ± 13% smaller in the anterior-posterior direction and 22 ± 14% smaller in the inferior-superior direction than in the treatments. The location of the simulated thermal focus was off from the prescribed target by 0.3 ± 0.1 mm, while the peak focal temperature elevation observed in the measurements was off by 1.6 ± 0.6 mm. Although the results of the simulations suggest that there could be some inaccuracies in either the tissue parameters used, or in the simulation methods, the simulations were able to predict the focal spot locations and temperature elevations adequately for initial treatment planning performed to assess, for example, the feasibility of sonication. The accuracy of the simulations could be improved if more precise ultrasound tissue properties (especially of the skull bone) could be obtained.

  20. Review of selected features of the natural system model, and suggestions for applications in South Florida

    USGS Publications Warehouse

    Bales, Jerad; Fulford, Janice M.; Swain, Eric D.

    1997-01-01

    A study was conducted to review selected features of the Natural System Model, version 4.3. The Natural System Model is a regional-scale model that uses recent climatic data and estimates of historic vegetation and topography to simulate pre-canal-drainage hydrologic response in south Florida. Equations used to represent the hydrologic system and the numerical solution of these equations in the model were documented and reviewed. Convergence testing was performed using 1965 input data, and selected other aspects of the model were evaluated. Some conclusions from the evaluation of the Natural System Model include the following observations. Simulations were generally insensitive to the temporal resolution used in the model. However, reduction of the computational cell size from 2-mile by 2-mile to 2/3-mile by 2/3-mile resulted in a decrease in spatial mean ponding depths for October of 0.35 foot for a 3-hour time step. Review of the computer code indicated that there is no limit on the amount of water that can be transferred from the river system to the overland-flow system, on the amount of seepage from the river to the ground-water system, on evaporation from the river system, or on evapotranspiration from the overland-flow system. Oscillations of 0.2 foot or less in simulated river stage were identified and attributed to a volume-limiting function which is applied in solution of the overland-flow equations. The computation of the resistance coefficient is not consistent with the computation of overland-flow velocity. Ground-water boundary conditions do not always ensure a no-flow condition at the boundary. These inconsistencies had varying degrees of effects on model simulations, and it is likely that simulations longer than 1 year are needed to fully identify the effects. However, inconsistencies in model formulations should not be ignored, even if the effects of such errors on model results appear to be small or have not been clearly defined. The Natural System Model can be a very useful tool for estimating pre-drainage hydrologic response in south Florida. The model includes all of the important physical processes needed to simulate a water balance. With a few exceptions, these hydrologic processes are represented in a reasonable manner using empirical, semiempirical, and mechanistic relations. The data sets that have been assembled to represent physical features and hydrologic and meteorological conditions are quite extensive in their scope. Some suggestions for model application were made. Simulation results from the Natural System Model need to be interpreted on a regional basis, rather than cell by cell. The available evidence suggests that simulated water levels should be interpreted with about a plus or minus 1 foot uncertainty. It is probably not appropriate to use the Natural System Model to estimate pre-drainage discharges (as opposed to hydroperiods and water levels) at a particular location or across a set of adjacent computational cells. All simulated results for computational cells within about 10 miles of the model boundaries have a higher degree of uncertainty than results for the interior of the model domain. It is most appropriate to interpret the Natural System Model simulation results in connection with other available information. Stronger linkages between hydrologic inputs to the Everglades and the ecological response of the system would enhance restoration efforts.

  1. Experimental and computational investigation of lateral gauge response in polycarbonate

    NASA Astrophysics Data System (ADS)

    Eliot, Jim; Harris, Ernst; Hazell, Paul; Appleby-Thomas, Gareth; Winter, Ronald; Wood, David; Owen, Gareth

    2011-06-01

    Polycarbonate's use in personal armour systems means that its high strain-rate response has been extensively studied. Interestingly, embedded lateral manganin stress gauges in polycarbonate have shown gradients behind incident shocks, suggestive of increasing shear strength. However, such gauges need to be embedded in a central (typically epoxy) interlayer - an inherently invasive approach. Recently, research has suggested that in such metal systems, interlayer/target impedance may contribute to observed gradients in lateral stress. Here, experimental T-gauge (Vishay Micro-Measurements® type J2M-SS-580SF-025) traces from polycarbonate targets are compared to computational simulations. This work extends previous efforts such that similar impedance exists between the interlayer and matrix (target) interface. Further, experiments and simulations are presented investigating the effects of a "dry joint" in polycarbonate, in which no encapsulating medium is employed.

  2. The Influence of Visual and Spatial Reasoning in Interpreting Simulated 3D Worlds.

    ERIC Educational Resources Information Center

    Lowrie, Tom

    2002-01-01

    Explores ways in which 6-year-old children make sense of screen-based images on the computer. Uses both static and relatively dynamic software programs in the investigation. Suggests that young children should be exposed to activities that establish explicit links between 2D and 3D objects away from the computer before attempting difficult links…

  3. Magnetoacoustic Tomography with Magnetic Induction (MAT-MI) for Breast Tumor Imaging: Numerical Modeling and Simulation

    PubMed Central

    Zhou, Lian; Li, Xu; Zhu, Shanan; He, Bin

    2011-01-01

    Magnetoacoustic tomography with magnetic induction (MAT-MI) was recently introduced as a noninvasive electrical conductivity imaging approach with high spatial resolution, close to that of ultrasound imaging. In the present study, we test the feasibility of the MAT-MI method for breast tumor imaging using numerical modeling and computer simulation. Using the finite element method, we have built three-dimensional numerical breast models with a variety of embedded tumors for this simulation study. In order to obtain an accurate and stable forward solution that does not have numerical errors caused by singular MAT-MI acoustic sources at conductivity boundaries, we first derive an integral forward method for calculating MAT-MI acoustic sources over the entire imaging volume. An inverse algorithm for reconstructing the MAT-MI acoustic source is also derived for a spherical measurement aperture, which simulates a practical setup for breast imaging. With the numerical breast models, we have conducted computer simulations under different imaging parameter setups, and all the results suggest that breast tumors that have a large conductivity contrast to their surrounding tissues, as reported in the literature, may be readily detected in the reconstructed MAT-MI images. In addition, our simulations also suggest that the sensitivity of imaging breast tumors using the presented MAT-MI setup depends more on the tumor location and the conductivity contrast between the tumor and its surrounding tissues than on the tumor size. PMID:21364262

  4. Utilizing fast multipole expansions for efficient and accurate quantum-classical molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Schwörer, Magnus; Lorenzen, Konstantin; Mathias, Gerald; Tavan, Paul

    2015-03-01

    Recently, a novel approach to hybrid quantum mechanics/molecular mechanics (QM/MM) molecular dynamics (MD) simulations has been suggested [Schwörer et al., J. Chem. Phys. 138, 244103 (2013)]. Here, the forces acting on the atoms are calculated by grid-based density functional theory (DFT) for a solute molecule and by a polarizable molecular mechanics (PMM) force field for a large solvent environment composed of several 10³-10⁵ molecules as negative gradients of a DFT/PMM hybrid Hamiltonian. The electrostatic interactions are efficiently described by a hierarchical fast multipole method (FMM). Adopting recent progress of this FMM technique [Lorenzen et al., J. Chem. Theory Comput. 10, 3244 (2014)], which particularly entails a strictly linear scaling of the computational effort with the system size, and adapting this revised FMM approach to the computation of the interactions between the DFT and PMM fragments of a simulation system, here, we show how one can further enhance the efficiency and accuracy of such DFT/PMM-MD simulations. The resulting gain of total performance, as measured for alanine dipeptide (DFT) embedded in water (PMM) by the product of the gains in efficiency and accuracy, amounts to about one order of magnitude. We also demonstrate that the jointly parallelized implementation of the DFT and PMM-MD parts of the computation enables the efficient use of high-performance computing systems. The associated software is available online.

  5. Explicit finite-difference simulation of optical integrated devices on massive parallel computers.

    PubMed

    Sterkenburgh, T; Michels, R M; Dress, P; Franke, H

    1997-02-20

    An explicit method for the numerical simulation of optical integrated circuits by means of the finite-difference time-domain (FDTD) method is presented. This method, based on an explicit solution of Maxwell's equations, is well established in microwave technology. Although the simulation areas are small, we verified the behavior of three interesting problems, especially nonparaxial problems, with typical aspects of integrated optical devices. Because numerical losses are within acceptable limits, we suggest the use of the FDTD method to achieve promising quantitative simulation results.
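
    The explicit update at the heart of FDTD takes only a few lines. A bare-bones 1D sketch in normalized units (our illustration of the method, not the authors' integrated-optics code):

        import numpy as np

        def fdtd_1d(steps=140, n=200, src=50):
            """Minimal 1D Yee/FDTD update for Ez and Hy in free space with a
            Gaussian soft source; the magic time step c*dt = dx makes the
            update coefficients unity."""
            ez = np.zeros(n)
            hy = np.zeros(n)
            for t in range(steps):
                hy[:-1] += ez[1:] - ez[:-1]  # update H from the curl of E
                ez[1:] += hy[1:] - hy[:-1]   # update E from the curl of H
                ez[src] += np.exp(-0.5 * ((t - 30) / 8.0) ** 2)  # soft source
            return ez

        # After 140 steps the right-going half of the pulse sits near cell 160.
        print(np.round(fdtd_1d()[150:170], 3))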

  6. Computer simulation of solutions of polyharmonic equations in plane domain

    NASA Astrophysics Data System (ADS)

    Kazakova, A. O.

    2018-05-01

    A systematic study of plane problems of the theory of polyharmonic functions is presented. A method of reducing boundary value problems for polyharmonic functions to a system of integral equations on the boundary of the domain is given, and a numerical algorithm for simulating solutions of this system is suggested. Particular attention is paid to the numerical solution of the main boundary value problems, in which the values of the function and its derivatives are given on the boundary. Test examples are considered that confirm the effectiveness and accuracy of the suggested algorithm.

  7. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David

    The objective of this effort is to establish a strategy and process for generation of suitable computational mesh for computational fluid dynamics simulations of departure from nucleate boiling (DNB) in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface-averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
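
    Roache's Grid Convergence Index used in that evaluation is easy to compute from two grid solutions; a short sketch (the sample values and assumed order of accuracy are invented):

        def grid_convergence_index(f_fine, f_coarse, r=2.0, p=2.0, fs=1.25):
            """Roache's GCI: an error-band estimate on the fine-grid value
            from solutions on two grids with refinement ratio r and
            (observed or assumed) order of accuracy p; fs is the standard
            safety factor."""
            rel_err = abs((f_coarse - f_fine) / f_fine)
            return fs * rel_err / (r**p - 1.0)

        # e.g. outlet velocity magnitude: 2.015 m/s (fine) vs 2.060 m/s (coarse)
        print(f"GCI = {grid_convergence_index(2.015, 2.060):.2%}")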

  8. An Efficient Finite Element Framework to Assess Flexibility Performances of SMA Self-Expandable Carotid Artery Stents

    PubMed Central

    Ferraro, Mauro; Auricchio, Ferdinando; Boatti, Elisa; Scalet, Giulia; Conti, Michele; Morganti, Simone; Reali, Alessandro

    2015-01-01

    Computer-based simulations are nowadays widely exploited for the prediction of the mechanical behavior of different biomedical devices. In this context, structural finite element analyses (FEA) are currently the preferred computational tool to evaluate the stent response under bending. This work aims at developing a computational framework based on linear and higher-order FEA to evaluate the flexibility of self-expandable carotid artery stents. In particular, numerical simulations involving large deformations and inelastic shape memory alloy constitutive modeling are performed, and the results suggest that the employment of higher-order FEA allows accurately representing the computational domain and obtaining a better approximation of the solution with a widely reduced number of degrees of freedom with respect to linear FEA. Moreover, when buckling phenomena occur, higher-order FEA presents a superior capability of reproducing the nonlinear local effects related to buckling phenomena. PMID:26184329

  9. An Object Oriented Extensible Architecture for Affordable Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Lytle, John K. (Technical Monitor)

    2002-01-01

    Driven by a need to explore and develop propulsion systems that exceeded current computing capabilities, NASA Glenn embarked on a novel strategy leading to the development of an architecture that enables propulsion simulations never thought possible before. Full-engine, three-dimensional computational fluid dynamics propulsion system simulations were deemed impossible due to the impracticality of the hardware and software computing systems required. However, with a software paradigm shift and an embracing of parallel and distributed processing, an architecture was designed to meet the needs of future propulsion system modeling. The author suggests that the architecture designed at the NASA Glenn Research Center for propulsion system modeling has potential for impacting the direction of development of affordable weapons systems currently under consideration by the Applied Vehicle Technology Panel (AVT). This paper discusses the salient features of the NPSS Architecture including its interface layer, object layer, implementation for accessing legacy codes, numerical zooming infrastructure and its computing layer. The computing layer focuses on the use and deployment of these propulsion simulations on parallel and distributed computing platforms, which has been the focus of NASA Ames. Additional features of the object-oriented architecture that support MultiDisciplinary (MD) coupling, computer-aided design (CAD) access and MD coupling objects will be discussed. Included will be a discussion of the successes, challenges and benefits of implementing this architecture.

  10. Using a million cell simulation of the cerebellum: network scaling and task generality.

    PubMed

    Li, Wen-Ke; Hausknecht, Matthew J; Stone, Peter; Mauk, Michael D

    2013-11-01

    Several factors combine to make it feasible to build computer simulations of the cerebellum and to test them in biologically realistic ways. These simulations can be used to help understand the computational contributions of various cerebellar components, including the relevance of the enormous number of neurons in the granule cell layer. In previous work we have used a simulation containing 12000 granule cells to develop new predictions and to account for various aspects of eyelid conditioning, a form of motor learning mediated by the cerebellum. Here we demonstrate the feasibility of scaling up this simulation to over one million granule cells using parallel graphics processing unit (GPU) technology. We observe that this increase in the number of granule cells requires only twice the execution time of the smaller simulation on the GPU. We demonstrate that this simulation, like its smaller predecessor, can emulate certain basic features of conditioned eyelid responses, with a slight improvement in performance in one measure. We also use this simulation to examine the generality of the computational properties that we have derived from studying eyelid conditioning. We demonstrate that this scaled-up simulation can learn a high level of performance in a classic machine learning task, the cart-pole balancing task. These results suggest that this parallel GPU technology can be used to build very large-scale simulations whose connectivity ratios match those of the real cerebellum and that these simulations can be used to guide future studies on cerebellar-mediated tasks and on machine learning problems. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Performance evaluation of GPU parallelization, space-time adaptive algorithms, and their combination for simulating cardiac electrophysiology.

    PubMed

    Sachetto Oliveira, Rafael; Martins Rocha, Bernardo; Burgarelli, Denise; Meira, Wagner; Constantinides, Christakis; Weber Dos Santos, Rodrigo

    2018-02-01

    The use of computer models as a tool for the study and understanding of the complex phenomena of cardiac electrophysiology has attained increased importance nowadays. At the same time, the increased complexity of the biophysical processes translates into complex computational and mathematical models. To speed up cardiac simulations and to allow more precise and realistic uses, two different techniques have been traditionally exploited: parallel computing and sophisticated numerical methods. In this work, we combine a modern parallel computing technique based on multicore and graphics processing units (GPUs) and a sophisticated numerical method based on a new space-time adaptive algorithm. We evaluate each technique alone and in different combinations: multicore and GPU; multicore, GPU, and space adaptivity; multicore, GPU, space adaptivity, and time adaptivity. All the techniques and combinations were evaluated under different scenarios: 3D simulations on slabs, 3D simulations on a ventricular mouse mesh (i.e., complex geometry), and sinus-rhythm and arrhythmic conditions. Our results suggest that multicore and GPU accelerate the simulations by an approximate factor of 33×, whereas the speedups attained by the space-time adaptive algorithms were approximately 48×. Nevertheless, by combining all the techniques, we obtained speedups that ranged between 165× and 498×. The tested methods were able to reduce the execution time of a simulation by more than 498× for a complex cellular model in a slab geometry and by 165× in a realistic heart geometry simulating spiral waves. The proposed methods will allow faster and more realistic simulations in a feasible time with no significant loss of accuracy.
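
    As a plausibility check on the arithmetic (not a claim from the paper): the reported combined speedups of 165×-498× fall well below the naive product of the individual factors (33 × 48 ≈ 1584), i.e., the techniques combine sub-multiplicatively. A minimal sketch of that comparison:

```python
# Rough check using only the figures quoted in the abstract; not the authors' code.
gpu_multicore = 33.0        # reported speedup from multicore + GPU alone
space_time_adaptive = 48.0  # reported speedup from space-time adaptivity alone

ideal = gpu_multicore * space_time_adaptive   # ~1584x if the gains were independent
for measured in (165.0, 498.0):               # reported combined range
    print(f"measured {measured:.0f}x is {measured / ideal:.0%} of the ideal {ideal:.0f}x")
```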

  12. Chemistry Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1983

    1983-01-01

    Presents chemistry experiments, laboratory procedures, demonstrations, teaching suggestions, and classroom materials/activities. These include: game for teaching ionic formulas; method for balancing equations; description of useful redox series; computer programs (with listings) for water electrolysis simulation and for determining chemical…

  13. In silico Interrogation of Insect Central Complex Suggests Computational Roles for the Ellipsoid Body in Spatial Navigation.

    PubMed

    Fiore, Vincenzo G; Kottler, Benjamin; Gu, Xiaosi; Hirth, Frank

    2017-01-01

    The central complex in the insect brain is a composite of midline neuropils involved in processing sensory cues and mediating behavioral outputs to orchestrate spatial navigation. Despite recent advances, however, the neural mechanisms underlying sensory integration and motor action selections have remained largely elusive. In particular, it is not yet understood how the central complex exploits sensory inputs to realize motor functions associated with spatial navigation. Here we report an in silico interrogation of central complex-mediated spatial navigation with a special emphasis on the ellipsoid body. Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs. Our simulations show that integration of multiple sensory sources can be effectively performed in the ellipsoid body. This processed information is used to trigger continuous sequences of action selections resulting in self-motion, obstacle avoidance and the navigation of simulated environments of varying complexity. The motor responses to perceived sensory stimuli can be stored in the neural structure of the central complex to simulate navigation relying on a collective of guidance cues, akin to sensory-driven innate or habitual behaviors. By comparing behaviors under different conditions of accessible sources of input information, we show that the simulated insect computes visual inputs and body posture to estimate its position in space. Finally, we tested whether the local connectome of the central complex might also allow the flexibility required to recall an intentional behavioral sequence, among different courses of action. Our simulations suggest that the central complex can encode combined representations of motor and spatial information to pursue a goal and thus successfully guide orientation behavior. Together, the observed computational features identify central complex circuitry, and especially the ellipsoid body, as a key neural correlate involved in spatial navigation.

  14. In silico Interrogation of Insect Central Complex Suggests Computational Roles for the Ellipsoid Body in Spatial Navigation

    PubMed Central

    Fiore, Vincenzo G.; Kottler, Benjamin; Gu, Xiaosi; Hirth, Frank

    2017-01-01

    The central complex in the insect brain is a composite of midline neuropils involved in processing sensory cues and mediating behavioral outputs to orchestrate spatial navigation. Despite recent advances, however, the neural mechanisms underlying sensory integration and motor action selections have remained largely elusive. In particular, it is not yet understood how the central complex exploits sensory inputs to realize motor functions associated with spatial navigation. Here we report an in silico interrogation of central complex-mediated spatial navigation with a special emphasis on the ellipsoid body. Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs. Our simulations show that integration of multiple sensory sources can be effectively performed in the ellipsoid body. This processed information is used to trigger continuous sequences of action selections resulting in self-motion, obstacle avoidance and the navigation of simulated environments of varying complexity. The motor responses to perceived sensory stimuli can be stored in the neural structure of the central complex to simulate navigation relying on a collective of guidance cues, akin to sensory-driven innate or habitual behaviors. By comparing behaviors under different conditions of accessible sources of input information, we show that the simulated insect computes visual inputs and body posture to estimate its position in space. Finally, we tested whether the local connectome of the central complex might also allow the flexibility required to recall an intentional behavioral sequence, among different courses of action. Our simulations suggest that the central complex can encode combined representations of motor and spatial information to pursue a goal and thus successfully guide orientation behavior. Together, the observed computational features identify central complex circuitry, and especially the ellipsoid body, as a key neural correlate involved in spatial navigation. PMID:28824390

  15. An Object Oriented Extensible Architecture for Affordable Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.

    2003-01-01

    Driven by a need to explore and develop propulsion systems that exceeded current computing capabilities, NASA Glenn embarked on a novel strategy leading to the development of an architecture that enables propulsion simulations never thought possible before. Full-engine, three-dimensional computational fluid dynamics (CFD) propulsion system simulations were deemed impossible due to the impracticality of the hardware and software computing systems required. However, with a software paradigm shift and an embracing of parallel and distributed processing, an architecture was designed to meet the needs of future propulsion system modeling. The author suggests that the architecture designed at the NASA Glenn Research Center for propulsion system modeling has potential for impacting the direction of development of affordable weapons systems currently under consideration by the Applied Vehicle Technology Panel (AVT).

  16. Multiscale Computer Simulation of Failure in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2008-01-01

    Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion-limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.

  17. Inexact hardware for modelling weather & climate

    NASA Astrophysics Data System (ADS)

    Düben, Peter D.; McNamara, Hugh; Palmer, Tim

    2014-05-01

    The use of stochastic processing hardware and low-precision arithmetic in atmospheric models is investigated. Stochastic processors allow hardware-induced faults in calculations, sacrificing exact calculations in exchange for improvements in performance, potentially accuracy, and a reduction in power consumption. A similar trade-off is achieved using low-precision arithmetic, with improvements in computation and communication speed and savings in storage and memory requirements. As high-performance computing becomes more massively parallel and power intensive, these two approaches may be important stepping stones in the pursuit of global cloud-resolving atmospheric modelling. The impact of both hardware-induced faults and low-precision arithmetic is tested in the dynamical core of a global atmosphere model. Our simulations show that both approaches to inexact calculations do not substantially affect the quality of the model simulations, provided they are restricted to act only on smaller scales. This suggests that inexact calculations at the small scale could reduce computation and power costs without adversely affecting the quality of the simulations.

  18. The potential value of Clostridium difficile vaccine: an economic computer simulation model.

    PubMed

    Lee, Bruce Y; Popovich, Michael J; Tian, Ye; Bailey, Rachel R; Ufberg, Paul J; Wiringa, Ann E; Muder, Robert R

    2010-07-19

    Efforts are currently underway to develop a vaccine against Clostridium difficile infection (CDI). We developed two decision analytic Monte Carlo computer simulation models: (1) an Initial Prevention Model depicting the decision whether to administer C. difficile vaccine to patients at risk for CDI and (2) a Recurrence Prevention Model depicting the decision whether to administer C. difficile vaccine to prevent CDI recurrence. Our results suggest that a C. difficile vaccine could be cost-effective over a wide range of C. difficile risk, vaccine costs, and vaccine efficacies, especially when used after CDI treatment to prevent recurrent disease.
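
    To make the structure of such a decision-analytic Monte Carlo model concrete, here is a minimal sketch. All parameter values (CDI risk, efficacy, costs) are hypothetical placeholders, not the authors' inputs, and the model is far simpler than the published one:

```python
import random

def simulate_patient(vaccinated, p_cdi=0.05, efficacy=0.7,
                     vaccine_cost=100.0, cdi_cost=10_000.0):
    """Total cost for one simulated at-risk patient (hypothetical parameters)."""
    cost = vaccine_cost if vaccinated else 0.0
    p = p_cdi * (1.0 - efficacy) if vaccinated else p_cdi
    if random.random() < p:   # patient develops CDI
        cost += cdi_cost
    return cost

def mean_cost(vaccinated, n=100_000):
    return sum(simulate_patient(vaccinated) for _ in range(n)) / n

print(f"expected cost, no vaccine: ${mean_cost(False):,.0f}")
print(f"expected cost, vaccine:    ${mean_cost(True):,.0f}")
```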

  19. Ligand Binding: Molecular Mechanics Calculation of the Streptavidin-Biotin Rupture Force

    NASA Astrophysics Data System (ADS)

    Grubmüller, Helmut; Heymann, Berthold; Tavan, Paul

    1996-02-01

    The force required to rupture the streptavidin-biotin complex was calculated here by computer simulations. The computed force agrees well with that obtained by recent single molecule atomic force microscope experiments. These simulations suggest a detailed multiple-pathway rupture mechanism involving five major unbinding steps. Binding forces and specificity are attributed to a hydrogen bond network between the biotin ligand and residues within the binding pocket of streptavidin. During rupture, additional water bridges substantially enhance the stability of the complex and even dominate the binding interactions. In contrast, steric restraints do not appear to contribute to the binding forces, although conformational motions were observed.

  20. The Potential Value of Clostridium difficile Vaccine: An Economic Computer Simulation Model

    PubMed Central

    Lee, Bruce Y.; Popovich, Michael J.; Tian, Ye; Bailey, Rachel R.; Ufberg, Paul J.; Wiringa, Ann E.; Muder, Robert R.

    2010-01-01

    Efforts are currently underway to develop a vaccine against Clostridium difficile infection (CDI). We developed two decision analytic Monte Carlo computer simulation models: (1) an Initial Prevention Model depicting the decision whether to administer C. difficile vaccine to patients at risk for CDI and (2) a Recurrence Prevention Model depicting the decision whether to administer C. difficile vaccine to prevent CDI recurrence. Our results suggest that a C. difficile vaccine could be cost-effective over a wide range of C. difficile risk, vaccine costs, and vaccine efficacies, especially when used after CDI treatment to prevent recurrent disease. PMID:20541582

  1. Integrating the Apache Big Data Stack with HPC for Big Data

    NASA Astrophysics Data System (ADS)

    Fox, G. C.; Qiu, J.; Jha, S.

    2014-12-01

    There is perhaps a broad consensus as to important issues in practical parallel computing as applied to large-scale simulations; this is reflected in supercomputer architectures, algorithms, libraries, languages, compilers and best practice for application development. However, the same is not so true for data-intensive computing, even though commercial clouds devote much more resources to data analytics than supercomputers devote to simulations. We look at a sample of over 50 big data applications to identify characteristics of data-intensive applications and to deduce needed runtime and architectures. We suggest a big data version of the famous Berkeley dwarfs and NAS parallel benchmarks and use these to identify a few key classes of hardware/software architectures. Our analysis builds on combining HPC with ABDS, the Apache big data software stack that is widely used in modern cloud computing. Initial results on clouds and HPC systems are encouraging. We propose the development of SPIDAL (Scalable Parallel Interoperable Data Analytics Library), built on system and data abstractions suggested by the HPC-ABDS architecture. We discuss how it can be used in several application areas including Polar Science.

  2. Molecular Dynamics Simulations of Ion Transport and Mechanisms in Polymer Nanocomposites

    NASA Astrophysics Data System (ADS)

    Mogurampelly, Santosh; Ganesan, Venkat

    2015-03-01

    Using all-atom molecular dynamics and trajectory-extending kinetic Monte Carlo simulations, we study the influence of Al2O3 nanoparticles on the transport properties of Li+ ions in polymer electrolytes consisting of polyethylene oxide (PEO) melt solvated with LiBF4 salt. We observe that the nanoparticles have a strong influence on polymer segmental dynamics, which in turn correlates with the mobility of Li+ ions. Explicitly, polymer segmental relaxation times and Li+ ion residence times around the polymer were found to increase with the addition of nanoparticles. We also observe that increasing short-range repulsive interactions between the nanoparticles and the polymer membrane leads to increased polymer dynamics and ion mobility. Overall, our simulation results suggest that nanoparticle-induced changes in the conformational and dynamic properties of the polymer influence the ion mobilities in polymer electrolytes, and they suggest possible directions for using such findings to improve the conductivity of the polymer matrix. The authors acknowledge the Texas Advanced Computing Center (TACC) at The University of Texas at Austin for providing computing resources that have contributed to the research.

  3. Collecting data from a sensor network in a single-board computer

    NASA Astrophysics Data System (ADS)

    Casciati, F.; Casciati, S.; Chen, Z.-C.; Faravelli, L.; Vece, M.

    2015-07-01

    The EU-FP7 project SPARTACUS, currently in progress, brings together several international partners to design and implement a satellite-based asset tracking system for supporting emergency management in crisis operations. Due to the emergency environment, one has to rely on low-power wireless communication. Therefore, the communication hardware and software must be designed to match requirements which can only be foreseen at the level of more or less likely scenarios. The latter aspect suggests extensive use of a simulator (instead of a real network of sensors) to cover extreme situations. The former power-consumption remark suggests the use of a minimal computer (Raspberry Pi) as data collector. In this paper, the results of a broad simulation campaign are reported in order to investigate the accuracy of the received data and the global power consumption for each of the considered scenarios.

  4. Computer simulation of the effects of shoe cushioning on internal and external loading during running impacts.

    PubMed

    Miller, Ross H; Hamill, Joseph

    2009-08-01

    Biomechanical aspects of running injuries are often inferred from external loading measurements. However, previous research has suggested that relationships between external loading and potential injury-inducing internal loads can be complex and nonintuitive. Further, the loading response to training interventions can vary widely between subjects. In this study, we use a subject-specific computer simulation approach to estimate internal and external loading of the distal tibia during the impact phase for two runners when running in shoes with different midsole cushioning parameters. The results suggest that: (1) changes in tibial loading induced by footwear are not reflected by changes in ground reaction force (GRF) magnitudes; (2) the GRF loading rate is a better surrogate measure of tibial loading and stress fracture risk than the GRF magnitude; and (3) averaging results across groups may potentially mask differential responses to training interventions between individuals.

  5. Simulation of Combustion Systems with Realistic g-jitter

    NASA Technical Reports Server (NTRS)

    Mell, William E.; McGrattan, Kevin B.; Baum, Howard R.

    2003-01-01

    In this project a transient, fully three-dimensional computer simulation code was developed to simulate the effects of realistic g-jitter on a number of combustion systems. The simulation code is capable of simulating flame spread on a solid and nonpremixed or premixed gaseous combustion in nonturbulent flow with simple combustion models. Simple combustion models were used to preserve computational efficiency, since this is meant to be an engineering code. Also, the use of sophisticated turbulence models was not pursued (a simple Smagorinsky-type model can be implemented if deemed appropriate), because if flow velocities are large enough for turbulence to develop in a reduced-gravity combustion scenario, it is unlikely that g-jitter disturbances (in NASA's reduced-gravity facilities) will play an important role in the flame dynamics. Acceleration disturbances of realistic orientation, magnitude, and time dependence can be easily included in the simulation. The simulation algorithm was based on techniques used in an existing large eddy simulation code which has successfully simulated fire dynamics in complex domains. A series of simulations with measured and predicted acceleration disturbances on the International Space Station (ISS) are presented. The results of this series of simulations suggested that a passive isolation system and appropriate scheduling of crew activity would provide a sufficiently "quiet" acceleration environment for spherical diffusion flames.

  6. Hydrogen bonds and twist in cellulose microfibrils.

    PubMed

    Kannam, Sridhar Kumar; Oehme, Daniel P; Doblin, Monika S; Gidley, Michael J; Bacic, Antony; Downton, Matthew T

    2017-11-01

    There is increasing experimental and computational evidence that cellulose microfibrils can exist in a stable twisted form. In this study, atomistic molecular dynamics (MD) simulations are performed to investigate the importance of intrachain hydrogen bonds on the twist in cellulose microfibrils. We systematically enforce or block the formation of these intrachain hydrogen bonds by either constraining dihedral angles or manipulating charges. For the majority of simulations a consistent right-handed twist is observed. The exceptions are two sets of simulations that block the O2-O6' intrachain hydrogen bond, where no consistent twist is observed in multiple independent simulations, suggesting that the O2-O6' hydrogen bond can drive twist. However, in a further simulation where exocyclic group rotation is also blocked, right-handed twist still develops, suggesting that intrachain hydrogen bonds are not necessary to drive twist in cellulose microfibrils.

  7. Integrating interactive computational modeling in biology curricula.

    PubMed

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  8. Further developments in cloud statistics for computer simulations

    NASA Technical Reports Server (NTRS)

    Chang, D. T.; Willand, J. H.

    1972-01-01

    This study is a part of NASA's continued program to provide global statistics of cloud parameters for computer simulation. The primary emphasis was on the development of the data bank of the global statistical distributions of cloud types and cloud layers and their applications in the simulation of the vertical distributions of in-cloud parameters such as liquid water content. These statistics were compiled from actual surface observations as recorded in Standard WBAN forms. Data for a total of 19 stations were obtained and reduced. These stations were selected to be representative of the 19 primary cloud climatological regions defined in previous studies of cloud statistics. Using the data compiled in this study, a limited study was conducted of the homogeneity of cloud regions, the latitudinal dependence of cloud-type distributions, the dependence of these statistics on sample size, and other factors in the statistics which are of significance to the problem of simulation. The application of the statistics in cloud simulation was investigated. In particular, the inclusion of the new statistics in an expanded multi-step Monte Carlo simulation scheme is suggested and briefly outlined.

  9. An innovative computationally efficient hydromechanical coupling approach for fault reactivation in geological subsurface utilization

    NASA Astrophysics Data System (ADS)

    Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.

    2018-02-01

    Estimating the efficiency and sustainability of geological subsurface utilization, e.g., carbon capture and storage (CCS), requires an integrated risk assessment approach that considers the occurring coupled processes, among others the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Due to the high computational costs of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation is introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is then directly implemented into a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. Hereby, the iterative parameter exchange between the multiphase and mechanical simulators is omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure is capable of reducing the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.
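
    A schematic of the coupling idea in code may help: one base simulation yields void ratio as a function of a single reference pressure at the fault base, and that fitted function then drives porosity and permeability updates inside the flow simulator. The fit form, the Kozeny-Carman-like permeability scaling, and all numbers below are illustrative assumptions, not the authors' parametrization:

```python
import numpy as np

# Pretend output of the preliminary two-way coupled "base" run:
# reference pore pressure at the fault base (MPa) vs. void ratio.
p_ref_base = np.array([10.0, 12.0, 14.0, 16.0, 18.0])
void_ratio_base = np.array([0.150, 0.154, 0.159, 0.165, 0.172])

coeffs = np.polyfit(p_ref_base, void_ratio_base, deg=2)  # parametrized function

def update_fault_properties(p_ref, k0=1e-15, e0=0.150):
    """Map reference pressure -> void ratio -> porosity and permeability."""
    e = np.polyval(coeffs, p_ref)
    phi = e / (1.0 + e)              # porosity from void ratio
    k = k0 * (e / e0) ** 3           # illustrative Kozeny-Carman-like scaling
    return phi, k

phi, k = update_fault_properties(15.0)
print(f"porosity = {phi:.4f}, permeability = {k:.3e} m^2")
```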

  10. Using Molecular Dynamics Simulations as an Aid in the Prediction of Domain Swapping of Computationally Designed Protein Variants.

    PubMed

    Mou, Yun; Huang, Po-Ssu; Thomas, Leonard M; Mayo, Stephen L

    2015-08-14

    In standard implementations of computational protein design, a positive-design approach is used to predict sequences that will be stable on a given backbone structure. Possible competing states are typically not considered, primarily because appropriate structural models are not available. One potential competing state, the domain-swapped dimer, is especially compelling because it is often nearly identical with its monomeric counterpart, differing by just a few mutations in a hinge region. Molecular dynamics (MD) simulations provide a computational method to sample different conformational states of a structure. Here, we tested whether MD simulations could be used as a post-design screening tool to identify sequence mutations leading to domain-swapped dimers. We hypothesized that a successful computationally designed sequence would have backbone structure and dynamics characteristics similar to those of the input structure and that, in contrast, domain-swapped dimers would exhibit increased backbone flexibility and/or altered structure in the hinge-loop region to accommodate the large conformational change required for domain swapping. While attempting to engineer a homodimer from a 51-amino-acid fragment of the monomeric protein engrailed homeodomain (ENH), we had instead generated a domain-swapped dimer (ENH_DsD). MD simulations on these proteins showed increased MD-derived B-factors in the hinge loop of the ENH_DsD domain-swapped dimer relative to monomeric ENH. Two point mutants of ENH_DsD designed to recover the monomeric fold were then tested with an MD simulation protocol. The MD simulations suggested that one of these mutants would adopt the target monomeric structure, which was subsequently confirmed by X-ray crystallography.

  11. Multi-phase models for water and thermal management of proton exchange membrane fuel cell: A review

    NASA Astrophysics Data System (ADS)

    Zhang, Guobin; Jiao, Kui

    2018-07-01

    The 3D (three-dimensional) multi-phase CFD (computational fluid dynamics) model is widely utilized in optimizing water and thermal management of PEM (proton exchange membrane) fuel cells. However, a satisfactory 3D multi-phase CFD model that can simulate the detailed gas-liquid two-phase flow in channels and precisely reflect its effect on performance has still not been developed, due to the coupling difficulties and computational cost involved. Meanwhile, an agglomerate model of the CL (catalyst layer) should also be added to 3D CFD models so as to better reflect the concentration loss and optimize CL structure at the macroscopic scale. Besides, the effect of thermal management is perhaps underestimated in current 3D multi-phase CFD simulations due to the lack of coolant channels in the computational domain and the use of constant-temperature boundary conditions. Therefore, 3D CFD simulations at cell and stack levels with convection boundary conditions are suggested to simulate the water and thermal management more accurately. Nevertheless, with the rapid development of PEM fuel cells, current 3D CFD simulations fall short of practical demands, especially at high current density and low to zero humidity, and for the novel designs developed recently, such as metal foam flow fields, 3D fine-mesh flow fields, and anode circulation.

  12. Molecular dynamics simulations of collision-induced absorption: Implementation in LAMMPS

    NASA Astrophysics Data System (ADS)

    Fakhardji, W.; Gustafsson, M.

    2017-02-01

    We pursue simulations of collision-induced absorption in a mixture of argon and xenon gas at room temperature by means of classical molecular dynamics. The established theoretical approach (Hartmann et al. 2011 J. Chem. Phys. 134 094316) is implemented with the molecular dynamics package LAMMPS. The bound state features in the absorption spectrum are well reproduced with the molecular dynamics simulation in comparison with a laboratory measurement. The magnitude of the computed absorption, however, is underestimated in a large part of the spectrum. We suggest some aspects of the simulation that could be improved.
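
    For readers unfamiliar with the approach, MD-based collision-induced absorption spectra are typically obtained from the Fourier transform of the autocorrelation of the total induced dipole moment recorded along the trajectory. The sketch below illustrates that pipeline on a synthetic dipole signal; the signal, prefactors, and quantum corrections are all placeholders, not the LAMMPS implementation:

```python
import numpy as np

dt, n = 1e-15, 2**16                      # timestep (s), number of samples
t = np.arange(n) * dt

# Synthetic stand-in for the induced dipole moment along an MD trajectory.
rng = np.random.default_rng(0)
mu = np.exp(-t / 2e-13) * np.cos(2 * np.pi * 3e12 * t) + 0.01 * rng.standard_normal(n)

# Autocorrelation via FFT (Wiener-Khinchin theorem), then its spectrum.
f = np.fft.rfft(mu, 2 * n)
acf = np.fft.irfft(f * np.conj(f))[:n] / n
spectrum = np.abs(np.fft.rfft(acf))       # absorption ~ this, up to prefactors

freqs = np.fft.rfftfreq(n, dt)
print(f"peak near {freqs[spectrum.argmax()]:.2e} Hz")
```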

  13. Multiple point statistical simulation using uncertain (soft) conditional data

    NASA Astrophysics Data System (ADS)

    Hansen, Thomas Mejer; Vu, Le Thanh; Mosegaard, Klaus; Cordua, Knud Skou

    2018-05-01

    Geostatistical simulation methods have been used to quantify spatial variability of reservoir models since the 1980s. In the last two decades, state-of-the-art simulation methods have changed from being based on covariance-based 2-point statistics to multiple-point statistics (MPS), which allow simulation of more realistic Earth structures. In addition, increasing amounts of geo-information (geophysical, geological, etc.) from multiple sources are being collected. This poses the problem of integrating these different sources of information, such that decisions related to reservoir models can be taken on as informed a basis as possible. In principle, though difficult in practice, this can be achieved using computationally expensive Monte Carlo methods. Here we investigate the use of sequential-simulation-based MPS methods conditional to uncertain (soft) data as a computationally efficient alternative. First, it is demonstrated that current implementations of sequential simulation based on MPS (e.g. SNESIM, ENESIM and Direct Sampling) do not account properly for uncertain conditional information, due to a combination of using only co-located information and a random simulation path. Then, we suggest two approaches that better account for the available uncertain information. The first makes use of a preferential simulation path, where more informed model parameters are visited preferentially to less informed ones. The second approach involves using non-co-located uncertain information. For different types of available data, these approaches are demonstrated to produce simulation results similar to those obtained by the general Monte Carlo based approach. These methods allow MPS simulation to condition properly to uncertain (soft) data, and hence provide a computationally attractive approach for integration of information about a reservoir model.
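
    The preferential-path idea is easy to illustrate: instead of visiting grid nodes in random order, visit the nodes whose soft data are most informative first. The entropy-based ranking below is an illustrative stand-in for the authors' criterion:

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes = 20
p_soft = rng.uniform(0.0, 1.0, n_nodes)   # soft probability of facies "1" per node

def entropy(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

random_path = rng.permutation(n_nodes)           # standard MPS visiting order
preferential_path = np.argsort(entropy(p_soft))  # most informed (low entropy) first

print("first five nodes, random path:      ", random_path[:5])
print("first five nodes, preferential path:", preferential_path[:5])
```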

  14. Li14P2O3N6 and Li7PN4: Computational study of two nitrogen rich crystalline LiPON electrolyte materials

    NASA Astrophysics Data System (ADS)

    Al-Qawasmeh, Ahmad; Holzwarth, N. A. W.

    2017-10-01

    Two lithium oxonitridophosphate materials, Li14P2O3N6 and Li7PN4, are computationally examined and found to be promising solid electrolytes for possible use in all-solid-state batteries having metallic Li anodes. The first-principles simulations are in good agreement with the structural analyses reported in the literature for these materials, and the computed total energies indicate that both materials are stable with respect to decomposition into binary and ternary products. The computational results suggest that both materials are likely to form metastable interfaces with Li metal. The simulations also find both materials to have Li ion migration activation energies comparable to or smaller than those of related Li ion electrolyte materials. Specifically, for Li7PN4, the experimentally measured activation energy can be explained by the migration of a Li ion vacancy stabilized by a small number of O2- ions substituting for N3- ions. For Li14P2O3N6, the activation energy for Li ion migration has not yet been experimentally measured, but simulations predict it to be smaller than that measured for Li7PN4.
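
    As background on why activation energies matter so much: assuming Arrhenius behavior, ionic conductivity scales as exp(-Ea/kBT), so even a 0.1 eV difference changes room-temperature conductivity by over an order of magnitude. A back-of-the-envelope check with hypothetical energies (not the paper's values):

```python
import math

kB, T = 8.617e-5, 300.0     # Boltzmann constant (eV/K), temperature (K)
Ea_a, Ea_b = 0.45, 0.55     # hypothetical activation energies (eV)

# sigma ~ exp(-Ea / kB T): the ratio depends only on the energy difference.
ratio = math.exp(-(Ea_a - Ea_b) / (kB * T))
print(f"conductivity ratio A/B at {T:.0f} K: {ratio:.0f}x")   # ~48x
```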

  15. Supervised learning from human performance at the computationally hard problem of optimal traffic signal control on a network of junctions

    PubMed Central

    Box, Simon

    2014-01-01

    Optimal switching of traffic lights on a network of junctions is a computationally intractable problem. In this research, road traffic networks containing signallized junctions are simulated. A computer game interface is used to enable a human 'player' to control the traffic light settings on the junctions within the simulation. A supervised learning approach, based on simple neural network classifiers, can be used to capture the human player's strategies in the game and thus develop a human-trained machine control (HuTMaC) system that approaches human levels of performance. Experiments conducted within the simulation compare the performance of HuTMaC to two well-established traffic-responsive control systems that are widely deployed in the developed world and also to a temporal difference learning-based control method. In all experiments, HuTMaC outperforms the other control methods in terms of average delay and variance over delay. The conclusion is that these results add weight to the suggestion that HuTMaC may be a viable alternative, or supplemental method, to approximate optimization for some practical engineering control problems where the optimal strategy is computationally intractable. PMID:26064570

  16. Supervised learning from human performance at the computationally hard problem of optimal traffic signal control on a network of junctions.

    PubMed

    Box, Simon

    2014-12-01

    Optimal switching of traffic lights on a network of junctions is a computationally intractable problem. In this research, road traffic networks containing signallized junctions are simulated. A computer game interface is used to enable a human 'player' to control the traffic light settings on the junctions within the simulation. A supervised learning approach, based on simple neural network classifiers, can be used to capture the human player's strategies in the game and thus develop a human-trained machine control (HuTMaC) system that approaches human levels of performance. Experiments conducted within the simulation compare the performance of HuTMaC to two well-established traffic-responsive control systems that are widely deployed in the developed world and also to a temporal difference learning-based control method. In all experiments, HuTMaC outperforms the other control methods in terms of average delay and variance over delay. The conclusion is that these results add weight to the suggestion that HuTMaC may be a viable alternative, or supplemental method, to approximate optimization for some practical engineering control problems where the optimal strategy is computationally intractable.

  17. A computational model for telomere-dependent cell-replicative aging.

    PubMed

    Portugal, R D; Land, M G P; Svaiter, B F

    2008-01-01

    Telomere shortening provides a molecular basis for the Hayflick limit. Recent data suggest that telomere shortening also influences mitotic rate. We propose a stochastic growth model of this phenomenon, assuming that cell division in each time interval is a random process whose probability decreases linearly with telomere shortening. Computer simulations of the proposed stochastic telomere-regulated model provide a good approximation of the qualitative growth of cultured human mesenchymal stem cells.
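
    The model as described is compact enough to sketch directly: each cell divides in a time step with a probability that declines linearly with the number of past divisions, reaching zero at the Hayflick limit. Parameter values below are illustrative, not fitted to the mesenchymal stem cell data:

```python
import random

HAYFLICK = 50    # divisions allowed before senescence (illustrative)
P0 = 0.1         # division probability for an undivided cell (illustrative)

def step(population):
    """Advance the culture one time interval; each cell is its division count."""
    nxt = []
    for d in population:
        p = P0 * max(0.0, 1.0 - d / HAYFLICK)   # linear decline with shortening
        if random.random() < p:
            nxt.extend([d + 1, d + 1])          # cell divides
        else:
            nxt.append(d)
    return nxt

population = [0] * 100
for _ in range(60):
    population = step(population)
print(f"population after 60 steps: {len(population)} cells")
```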

  18. Computational prediction of ionic liquid 1-octanol/water partition coefficients.

    PubMed

    Kamath, Ganesh; Bhatnagar, Navendu; Baker, Gary A; Baker, Sheila N; Potoff, Jeffrey J

    2012-04-07

    Wet 1-octanol/water partition coefficients (log Kow) predicted for imidazolium-based ionic liquids using adaptive bias force molecular dynamics (ABF-MD) simulations lie in excellent agreement with experimental values. These encouraging results suggest prospects for this computational tool in the a priori prediction of log Kow values of ionic liquids broadly, with possible screening implications as well (e.g., prediction of CO2-philic ionic liquids).

  19. Streaming parallel GPU acceleration of large-scale filter-based spiking neural networks.

    PubMed

    Slażyński, Leszek; Bohte, Sander

    2012-01-01

    The arrival of graphics processing unit (GPU) cards suitable for massively parallel computing promises affordable large-scale neural network simulation previously only available at supercomputing facilities. While the raw numbers suggest that GPUs may outperform CPUs by at least an order of magnitude, the challenge is to develop fine-grained parallel algorithms to fully exploit the particulars of GPUs. Computation in a neural network is inherently parallel and thus a natural match for GPU architectures: given inputs, the internal state for each neuron can be updated in parallel. We show that for filter-based spiking neurons, like the Spike Response Model, the additive nature of membrane potential dynamics enables additional update parallelism. This also reduces the accumulation of numerical errors when using single-precision computation, the native precision of GPUs. We further show that optimizing simulation algorithms and data structures to the GPU's architecture has a large pay-off: for example, matching iterative neural updating to the memory architecture of the GPU speeds up this simulation step by a factor of three to five. With such optimizations, we can simulate plausible spiking neural networks of up to 50,000 neurons in better than real time, processing over 35 million spiking events per second.
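
    The additive update the authors exploit can be shown in a CPU-side, vectorized form: because each neuron's filtered membrane state evolves independently, the whole population updates in one data-parallel operation, which is exactly what maps well onto a GPU. The exponential kernel and all parameters here are illustrative, not the paper's Spike Response Model configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 50_000
dt, tau, threshold = 1e-3, 20e-3, 0.3

v = np.zeros(n_neurons, dtype=np.float32)   # filtered membrane state
decay = np.float32(np.exp(-dt / tau))

for _ in range(100):
    inputs = rng.random(n_neurons) < 0.1    # random input spikes this step
    v = v * decay + 0.1 * inputs.astype(np.float32)  # additive update, all neurons at once
    spiking = v >= threshold
    v[spiking] = 0.0                        # reset after a spike

print(f"neurons spiking at final step: {int(spiking.sum())}")
```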

  20. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Appendix A: ROBSIM user's guide

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelley, J. H.; Depkovich, T. M.; Wolfe, W. J.; Nguyen, T.

    1986-01-01

    The purpose of the Robotics Simulation Program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotics systems. ROBSIM is a program written in FORTRAN 77 for use on a VAX 11/750 computer under the VMS operating system. This user's guide describes the capabilities of the ROBSIM programs, including the system definition function, the analysis tools function and the postprocessor function. The options a user may encounter with each of these executables are explained in detail, and the different program prompts presented to the user are included. Some useful suggestions concerning the appropriate answers to be given by the user are provided. An example interactive run is enclosed for each of the main program services, and some of the capabilities are illustrated.

  1. Computer-based simulation training to improve learning outcomes in mannequin-based simulation exercises.

    PubMed

    Curtin, Lindsay B; Finn, Laura A; Czosnowski, Quinn A; Whitman, Craig B; Cawley, Michael J

    2011-08-10

    To assess the impact of computer-based simulation on the achievement of student learning outcomes during mannequin-based simulation. Participants were randomly assigned to rapid response teams of 5-6 students, and teams were then randomly assigned to complete either the computer-based or the mannequin-based simulation cases first. In both simulations, students used their critical thinking skills and selected interventions independent of facilitator input. A predetermined rubric was used to record and assess students' performance in the mannequin-based simulations. Feedback and student performance scores were generated by the software in the computer-based simulations. More of the teams that completed the computer-based simulation before the mannequin-based simulation achieved the primary outcome for the exercise, which was survival of the simulated patient (41.2% vs. 5.6%). The majority of students (>90%) recommended the continuation of simulation exercises in the course. Students in both groups felt the computer-based simulation should be completed prior to the mannequin-based simulation. The use of computer-based simulation prior to mannequin-based simulation improved the achievement of learning goals and outcomes. In addition to improving participants' skills, completing the computer-based simulation first may improve participants' confidence during the more realistic setting achieved in the mannequin-based simulation.

  2. Brief Report: Simulations Suggest Heterogeneous Category Learning and Generalization in Children with Autism Is a Result of Idiosyncratic Perceptual Transformations

    ERIC Educational Resources Information Center

    Mercado, Eduardo, III; Church, Barbara A.

    2016-01-01

    Children with autism spectrum disorder (ASD) sometimes have difficulties learning categories. Past computational work suggests that such deficits may result from atypical representations in cortical maps. Here we use neural networks to show that idiosyncratic transformations of inputs can result in the formation of feature maps that impair…

  3. Quantum simulator review

    NASA Astrophysics Data System (ADS)

    Bednar, Earl; Drager, Steven L.

    2007-04-01

    The objective of quantum information processing is to harness the paradigm shift offered by quantum computing to solve classically hard, computationally challenging problems. Some of our computationally challenging problems of interest include: the capability for rapid image processing, rapid optimization of logistics, protecting information, secure distributed simulation, and massively parallel computation. Currently, one important problem with quantum information processing is that the implementation of quantum computers is difficult to realize due to poor scalability and a high incidence of errors. Therefore, we have supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators running on classical computers for the development and testing of new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator. It concludes with summary comments on both simulators.

  4. Skin fluorescence model based on the Monte Carlo technique

    NASA Astrophysics Data System (ADS)

    Churmakov, Dmitry Y.; Meglinski, Igor V.; Piletsky, Sergey A.; Greenhalgh, Douglas A.

    2003-10-01

    A novel Monte Carlo technique for simulating the spatial fluorescence distribution within human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores, which follows the packing of collagen fibers, whereas in the epidermis and stratum corneum the distribution of fluorophores is assumed to be homogeneous. The results of the simulation suggest that the distribution of autofluorescence is significantly suppressed in the NIR spectral region, while the fluorescence of a sensor layer embedded in the epidermis is localized at the adjusted depth. The model is also able to simulate the skin fluorescence spectra.
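
    The core of any such Monte Carlo model is a random walk in which photon step lengths are sampled from the attenuation coefficient and each interaction is either absorption or scattering. A toy one-dimensional version for a homogeneous slab (coefficients illustrative, not the paper's skin optics, and with crude isotropic scattering):

```python
import math, random

mu_a, mu_s = 0.1, 10.0     # absorption / scattering coefficients (1/mm), illustrative
mu_t = mu_a + mu_s
depth = 1.0                # slab thickness (mm)

def photon_absorption_depth():
    z, uz = 0.0, 1.0       # start at the surface, heading straight down
    while True:
        s = -math.log(1.0 - random.random()) / mu_t   # sampled free path length
        z += uz * s
        if z < 0.0 or z > depth:
            return None                                # photon escaped the slab
        if random.random() < mu_a / mu_t:
            return z                                   # absorbed at this depth
        uz = 2.0 * random.random() - 1.0               # crude isotropic re-scatter

hits = [d for d in (photon_absorption_depth() for _ in range(20_000)) if d is not None]
print(f"absorbed: {len(hits) / 20_000:.1%}, mean absorption depth: "
      f"{sum(hits) / len(hits):.3f} mm")
```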

  5. Recruitment of Foreigners in the Market for Computer Scientists in the United States

    PubMed Central

    Bound, John; Braga, Breno; Golden, Joseph M.

    2016-01-01

    We present and calibrate a dynamic model that characterizes the labor market for computer scientists. In our model, firms can recruit computer scientists from recently graduated college students, from STEM workers working in other occupations, or from a pool of foreign talent. Counterfactual simulations suggest that wages for computer scientists would have been 2.8–3.8% higher, and the number of Americans employed as computer scientists would have been 7.0–13.6% higher, in 2004 if firms could not hire more foreigners than they could in 1994. In contrast, total CS employment would have been 3.8–9.0% lower, and consequently output smaller. PMID:27170827

  6. Parameter Estimation for a Pulsating Turbulent Buoyant Jet Using Approximate Bayesian Computation

    NASA Astrophysics Data System (ADS)

    Christopher, Jason; Wimer, Nicholas; Lapointe, Caelan; Hayden, Torrey; Grooms, Ian; Rieker, Greg; Hamlington, Peter

    2017-11-01

    Approximate Bayesian Computation (ABC) is a powerful tool that allows sparse experimental or other "truth" data to be used for the prediction of unknown parameters, such as flow properties and boundary conditions, in numerical simulations of real-world engineering systems. Here we introduce the ABC approach and then use ABC to predict unknown inflow conditions in simulations of a two-dimensional (2D) turbulent, high-temperature buoyant jet. For this test case, truth data are obtained from a direct numerical simulation (DNS) with known boundary conditions and problem parameters, while the ABC procedure utilizes lower fidelity large eddy simulations. Using spatially sparse statistics from the 2D buoyant jet DNS, we show that the ABC method provides accurate predictions of the true jet inflow parameters. The success of the ABC approach in the present test suggests that ABC is a useful and versatile tool for predicting flow information, such as boundary conditions, that can be difficult to determine experimentally.
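
    For orientation, the simplest member of the ABC family is rejection ABC: draw parameters from a prior, run the forward model, and keep the draws whose summary statistics land within a tolerance of the truth data. The toy Gaussian forward model below stands in for the LES solver; everything here is illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

def forward_model(theta, n=200):
    """Toy stand-in for a simulation: noisy samples with mean theta."""
    return rng.normal(theta, 1.0, n)

truth = forward_model(3.0)            # plays the role of the DNS "truth" data
observed_stat = truth.mean()          # sparse summary statistic

prior_draws = rng.uniform(0.0, 10.0, 5_000)
tol = 0.2
accepted = [th for th in prior_draws
            if abs(forward_model(th).mean() - observed_stat) < tol]

print(f"posterior mean ~ {np.mean(accepted):.2f} from {len(accepted)} accepted draws")
```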

  7. Eulerian-Lagrangian Simulations of Transonic Flutter Instabilities

    NASA Technical Reports Server (NTRS)

    Bendiksen, Oddvar O.

    1994-01-01

    This paper presents an overview of recent applications of Eulerian-Lagrangian computational schemes in simulating transonic flutter instabilities. In this approach, the fluid-structure system is treated as a single continuum dynamics problem, by switching from an Eulerian to a Lagrangian formulation at the fluid-structure boundary. This computational approach effectively eliminates the phase integration errors associated with previous methods, where the fluid and structure are integrated sequentially using different schemes. The formulation is based on Hamilton's Principle in mixed coordinates, and both finite volume and finite element discretization schemes are considered. Results from numerical simulations of transonic flutter instabilities are presented for isolated wings, thin panels, and turbomachinery blades. The results suggest that the method is capable of reproducing the energy exchange between the fluid and the structure with significantly less error than existing methods. Localized flutter modes and panel flutter modes involving traveling waves can also be simulated effectively with no a priori knowledge of the type of instability involved.

  8. A scalable PC-based parallel computer for lattice QCD

    NASA Astrophysics Data System (ADS)

    Fodor, Z.; Katz, S. D.; Pappa, G.

    2003-05-01

    A PC-based parallel computer for medium/large scale lattice QCD simulations is suggested. The Eötvös Univ., Inst. Theor. Phys. cluster consists of 137 Intel P4-1.7GHz nodes. Gigabit Ethernet cards are used for nearest-neighbor communication in a two-dimensional mesh. The sustained performance for dynamical staggered (Wilson) quarks on large lattices is around 70 (110) GFlops. The exceptional price/performance ratio is below $1/Mflop.

  9. Requirements for Large Eddy Simulation Computations of Variable-Speed Power Turbine Flows

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2016-01-01

    Variable-speed power turbines (VSPTs) operate at low Reynolds numbers and with a wide range of incidence angles. Transition, separation, and the relevant physics leading to them are important to VSPT flow. Higher-fidelity tools such as large eddy simulation (LES) may be needed to resolve the flow features necessary for accurate predictive capability and design of such turbines. A survey conducted for this report explores the requirements for such computations. The survey is limited to the simulation of two-dimensional flow cases; endwalls are not included. It suggests that the grid resolution necessary for this type of simulation to accurately represent the physics may be of the order of Δx+ = 45, Δy+ = 2, and Δz+ = 17 in wall units. Various subgrid-scale (SGS) models have been used, and except for the Smagorinsky model, all seem to perform well; in some instances the simulations worked well without SGS modeling. A method of specifying the inlet conditions, such as synthetic eddy modeling (SEM), is necessary to correctly represent the inlet conditions.
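
    For context, spacings quoted in wall units convert to physical lengths via Δ = Δ+ · ν / u_τ, with friction velocity u_τ = sqrt(τ_w / ρ). A quick conversion with illustrative air-flow numbers (not taken from the report):

```python
import math

nu = 1.5e-5    # kinematic viscosity of air, m^2/s (illustrative)
rho = 1.2      # density, kg/m^3 (illustrative)
tau_w = 0.5    # wall shear stress, Pa (hypothetical)

u_tau = math.sqrt(tau_w / rho)          # friction velocity, m/s
for name, dplus in (("dx+", 45.0), ("dy+", 2.0), ("dz+", 17.0)):
    print(f"{name} = {dplus:5.1f} -> {dplus * nu / u_tau * 1e6:8.1f} micrometers")
```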

  10. Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    NASA Astrophysics Data System (ADS)

    Junghans, Christoph; Mniszewski, Susan; Voter, Arthur; Perez, Danny; Eidenbenz, Stephan

    2014-03-01

    We present an example of a new class of tools that we call application simulators, parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation (PDES). We demonstrate our approach with a TADSim application simulator that models the Temperature Accelerated Dynamics (TAD) method, which is an algorithmically complex member of the Accelerated Molecular Dynamics (AMD) family. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We further extend TADSim to model algorithm extensions to standard TAD, such as speculative spawning of the compute-bound stages of the algorithm, and predict performance improvements without having to implement such a method. Focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights into the TAD algorithm behavior and suggested extensions to the TAD method.
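
    At its core, a PDES-style application simulator is a timestamped event queue whose handlers model the stages of the real code and may schedule further events. A minimal sequential sketch (the stage names and timings are hypothetical stand-ins, not TADSim's actual model):

```python
import heapq

events = []  # min-heap of (time, description)

def schedule(t, what):
    heapq.heappush(events, (t, what))

schedule(0.0, "start MD block")
schedule(1.2, "detect transition")
schedule(2.0, "saddle-point search")     # could be spawned speculatively

while events:
    clock, what = heapq.heappop(events)
    print(f"t = {clock:4.1f}: {what}")
    if what == "saddle-point search":
        schedule(clock + 0.5, "accept event, advance coarse time")
```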

  11. A breakthrough for experiencing and understanding simulated physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1988-01-01

    The use of computer simulation in physics research is discussed, focusing on improvements to graphic workstations. Simulation capabilities and applications of enhanced visualization tools are outlined. The elements of an ideal computer simulation are presented and the potential for improving various simulation elements is examined. The interface between the human and the computer and simulation models are considered. Recommendations are made for changes in computer simulation practices and applications of simulation technology in education.

  12. A Guide for Developing Human-Robot Interaction Experiments in the Robotic Interactive Visualization and Experimentation Technology (RIVET) Simulation

    DTIC Science & Technology

    2016-05-01

    [Record garbled in extraction; recoverable fragments follow.] "…research, Kunkler (2006) suggested that the similarities between computer simulation tools and robotic surgery systems (e.g., mechanized feedback…)". The remaining fragments are reference and header debris: a cited review of robotics in surgery (Davies, Proceedings of the Institution of Mechanical Engineers, Part H) and the report header ARL-TR-7683, May 2016, US Army Research Laboratory.

  13. Computational Modeling of 3D Tumor Growth and Angiogenesis for Chemotherapy Evaluation

    PubMed Central

    Tang, Lei; van de Ven, Anne L.; Guo, Dongmin; Andasari, Vivi; Cristini, Vittorio; Li, King C.; Zhou, Xiaobo

    2014-01-01

    Solid tumors develop abnormally at spatial and temporal scales, giving rise to biophysical barriers that impact anti-tumor chemotherapy. This may increase the expenditure and time for conventional drug pharmacokinetic and pharmacodynamic studies. In order to facilitate drug discovery, we propose a mathematical model that couples three-dimensional tumor growth and angiogenesis to simulate tumor progression for chemotherapy evaluation. This application-oriented model incorporates complex dynamical processes including cell- and vascular-mediated interstitial pressure, mass transport, angiogenesis, cell proliferation, and vessel maturation to model tumor progression through multiple stages including tumor initiation, avascular growth, and transition from avascular to vascular growth. Compared to pure mechanistic models, the proposed empirical methods are not only easy to conduct but can provide realistic predictions and calculations. A series of computational simulations were conducted to demonstrate the advantages of the proposed comprehensive model. The computational simulation results suggest that solid tumor geometry is related to the interstitial pressure, such that tumors with high interstitial pressure are more likely to develop dendritic structures than those with low interstitial pressure. PMID:24404145

  14. Application of an interactive water simulation model in urban water management: a case study in Amsterdam.

    PubMed

    Leskens, J G; Brugnach, M; Hoekstra, A Y

    2014-01-01

    Water simulation models are available to support decision-makers in urban water management. Using current water simulation models requires special expertise. Therefore, model information is prepared prior to work sessions, in which decision-makers weigh different solutions. However, this model information quickly becomes outdated when new suggestions for solutions arise, and it is therefore of limited use. We suggest that new model techniques, i.e. fast and flexible computation algorithms and realistic visualizations, allow this problem to be solved by using simulation models during work sessions. A new Interactive Water Simulation Model was applied for two case study areas in Amsterdam and was used in two workshops. In these workshops, the Interactive Water Simulation Model was positively received. It included non-specialist participants in the process of suggesting and selecting possible solutions and made them part of the accompanying discussions and negotiations. It also provided the opportunity to evaluate and enhance possible solutions more often within the time horizon of a decision-making process. Several preconditions proved to be important for successfully applying the Interactive Water Simulation Model, such as the willingness of the stakeholders to participate and the preparation of different general main solutions that can be used for further iterations during a work session.

  15. Computational Wear Simulation of Patellofemoral Articular Cartilage during In Vitro Testing

    PubMed Central

    Li, Lingmin; Patil, Shantanu; Steklov, Nick; Bae, Won; Temple-Wong, Michele; D'Lima, Darryl D.; Sah, Robert L.; Fregly, Benjamin J.

    2011-01-01

    Though changes in normal joint motions and loads (e.g., following anterior cruciate ligament injury) contribute to the development of knee osteoarthritis, the precise mechanism by which these changes induce osteoarthritis remains unknown. As a first step toward identifying this mechanism, this study evaluates computational wear simulations of a patellofemoral joint specimen wear tested on a knee simulator machine. A multi-body dynamic model of the specimen mounted in the simulator machine was constructed in commercial computer-aided engineering software. A custom elastic foundation contact model was used to calculate contact pressures and wear on the femoral and patellar articular surfaces using geometry created from laser scan and MR data. Two different wear simulation approaches were investigated – one that wore the surface geometries gradually over a sequence of 10 one-cycle dynamic simulations (termed the “progressive” approach), and one that wore the surface geometries abruptly using results from a single one-cycle dynamic simulation (termed the “non-progressive” approach). The progressive approach with laser scan geometry reproduced the experimentally measured wear depths and areas for both the femur and patella. The less costly non-progressive approach predicted deeper wear depths, especially on the patella, but had little influence on predicted wear areas. Use of MR data for creating the articular and subchondral bone geometry altered wear depth and area predictions by at most 13%. These results suggest that MR-derived geometry may be sufficient for simulating articular cartilage wear in vivo and that a progressive simulation approach may be needed for the patella and tibia since both remain in continuous contact with the femur. PMID:21453922
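
    The distinction between the two update schemes can be illustrated with an Archard-type wear law (depth increment = wear factor x contact pressure x sliding distance). The sketch below is a deliberately minimal stand-in, assuming a toy pressure-relaxation function and hypothetical constants, not the paper's elastic foundation contact model:

      # "Progressive" re-solves the (toy) contact problem after every cycle;
      # "non-progressive" extrapolates a single solve over all cycles.
      K_WEAR = 1e-6           # wear factor, toy units (assumed)
      SLIDE = 0.05            # sliding distance per gait cycle, m (assumed)

      def contact_pressure(depth_mm):
          # Toy stand-in for a contact solve: pressure relaxes as the
          # surface wears and conforms.
          return 5.0 / (1.0 + 10.0 * depth_mm)   # MPa

      def progressive(n_cycles=10):
          depth = 0.0
          for _ in range(n_cycles):
              depth += K_WEAR * contact_pressure(depth) * SLIDE * 1e3
          return depth

      def non_progressive(n_cycles=10):
          return n_cycles * K_WEAR * contact_pressure(0.0) * SLIDE * 1e3

      # Non-progressive over-predicts depth because pressure never relaxes.
      print(progressive(), non_progressive())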

  16. Computational wear simulation of patellofemoral articular cartilage during in vitro testing.

    PubMed

    Li, Lingmin; Patil, Shantanu; Steklov, Nick; Bae, Won; Temple-Wong, Michele; D'Lima, Darryl D; Sah, Robert L; Fregly, Benjamin J

    2011-05-17

    Though changes in normal joint motions and loads (e.g., following anterior cruciate ligament injury) contribute to the development of knee osteoarthritis, the precise mechanism by which these changes induce osteoarthritis remains unknown. As a first step toward identifying this mechanism, this study evaluates computational wear simulations of a patellofemoral joint specimen wear tested on a knee simulator machine. A multibody dynamic model of the specimen mounted in the simulator machine was constructed in commercial computer-aided engineering software. A custom elastic foundation contact model was used to calculate contact pressures and wear on the femoral and patellar articular surfaces using geometry created from laser scan and MR data. Two different wear simulation approaches were investigated--one that wore the surface geometries gradually over a sequence of 10 one-cycle dynamic simulations (termed the "progressive" approach), and one that wore the surface geometries abruptly using results from a single one-cycle dynamic simulation (termed the "non-progressive" approach). The progressive approach with laser scan geometry reproduced the experimentally measured wear depths and areas for both the femur and patella. The less costly non-progressive approach predicted deeper wear depths, especially on the patella, but had little influence on predicted wear areas. Use of MR data for creating the articular and subchondral bone geometry altered wear depth and area predictions by at most 13%. These results suggest that MR-derived geometry may be sufficient for simulating articular cartilage wear in vivo and that a progressive simulation approach may be needed for the patella and tibia since both remain in continuous contact with the femur. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. Computational Models Reveal a Passive Mechanism for Cell Migration in the Crypt

    PubMed Central

    Dunn, Sara-Jane; Näthke, Inke S.; Osborne, James M.

    2013-01-01

    Cell migration in the intestinal crypt is essential for the regular renewal of the epithelium, and the continued upward movement of cells is a key characteristic of healthy crypt dynamics. However, the driving force behind this migration is unknown. Possibilities include mitotic pressure, active movement driven by motility cues, or negative pressure arising from cell loss at the crypt collar. It is possible that a combination of these factors together coordinates migration. Here, three different computational models are used to provide insight into the mechanisms that underpin cell movement in the crypt, by examining the consequence of eliminating cell division on cell movement. Computational simulations agree with existing experimental results, confirming that migration can continue in the absence of mitosis. Importantly, however, simulations allow us to infer mechanisms that are sufficient to generate cell movement, which is not possible through experimental observation alone. The results produced by the three models agree and suggest that cell loss due to apoptosis and extrusion at the crypt collar relieves cell compression below, allowing cells to expand and move upwards. This finding suggests that future experiments should focus on the role of apoptosis and cell extrusion in controlling cell migration in the crypt. PMID:24260407
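
    The passive mechanism suggested above is simple enough to reproduce in a toy model: a one-dimensional column of compressible cells with no division and loss only at the top. The sketch below is purely illustrative and far simpler than the paper's three models:

      # Cells start compressed; removing the top cell (apoptosis/extrusion
      # at the crypt collar) lets those below relax and move upward.
      rest_len = 1.0
      lengths = [0.8] * 20          # 20 compressed cells, base at z = 0

      track = []
      for step in range(5):
          lengths.pop()             # cell loss at the crypt collar
          lengths = [min(rest_len, l + 0.05) for l in lengths]  # relaxation
          base, centres = 0.0, []
          for l in lengths:
              centres.append(base + l / 2)
              base += l
          track.append(round(centres[4], 3))    # follow the 5th cell

      print(track)                  # centre moves upward with no mitosis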

  18. Symplectic molecular dynamics simulations on specially designed parallel computers.

    PubMed

    Borstnik, Urban; Janezic, Dusanka

    2005-01-01

    We have developed a computer program for molecular dynamics (MD) simulation that implements the Split Integration Symplectic Method (SISM) and is designed to run on specialized parallel computers. The MD integration is performed by the SISM, which analytically treats high-frequency vibrational motion and thus enables the use of longer simulation time steps. The low-frequency motion is treated numerically on specially designed parallel computers, which decreases the computational time of each simulation time step. The combination of these approaches means that less time is required and fewer steps are needed and so enables fast MD simulations. We study the computational performance of MD simulation of molecular systems on specialized computers and provide a comparison to standard personal computers. The combination of the SISM with two specialized parallel computers is an effective way to increase the speed of MD simulations up to 16-fold over a single PC processor.
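
    The essence of split integration can be sketched for a single stiff degree of freedom: the harmonic ("high-frequency") part is advanced analytically as a rotation in phase space, while the slow residual force enters as numerical half-kicks. This one-dimensional toy, with assumed constants, conveys only the flavor of the SISM, not the parallel implementation described above:

      import math

      m, k = 1.0, 100.0                  # mass, stiff spring constant (assumed)
      omega = math.sqrt(k / m)

      def slow_force(x):
          return -0.1 * x ** 3           # weak anharmonic term (assumed)

      def sism_step(x, v, dt):
          v += 0.5 * dt * slow_force(x) / m                 # numeric half-kick
          c, s = math.cos(omega * dt), math.sin(omega * dt)
          x, v = x * c + (v / omega) * s, -x * omega * s + v * c  # exact rotation
          v += 0.5 * dt * slow_force(x) / m                 # numeric half-kick
          return x, v

      x, v = 1.0, 0.0
      for _ in range(1000):
          x, v = sism_step(x, v, 0.05)   # stable: the stiff part is exact
      print(x, v)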

  19. A fast, open source implementation of adaptive biasing potentials uncovers a ligand design strategy for the chromatin regulator BRD4

    NASA Astrophysics Data System (ADS)

    Dickson, Bradley M.; de Waal, Parker W.; Ramjan, Zachary H.; Xu, H. Eric; Rothbart, Scott B.

    2016-10-01

    In this communication we introduce an efficient implementation of adaptive biasing that greatly improves the speed of free energy computation in molecular dynamics simulations. We investigated the use of accelerated simulations to inform on compound design using a recently reported and clinically relevant inhibitor of the chromatin regulator BRD4 (bromodomain-containing protein 4). Benchmarking on our local compute cluster, our implementation achieves up to 2.5 times more force calls per day than plumed2. Results of five 1 μs-long simulations are presented, which reveal a conformational switch in the BRD4 inhibitor between a binding competent and incompetent state. Stabilization of the switch led to a -3 kcal/mol improvement of absolute binding free energy. These studies suggest an unexplored ligand design principle and offer new actionable hypotheses for medicinal chemistry efforts against this druggable epigenetic target class.
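
    Adaptive biasing schemes of this general family flatten a free energy surface by accumulating repulsive kernels where the simulation dwells. The sketch below shows that generic idea for an overdamped walker on a one-dimensional double well; it is not the authors' implementation (nor plumed2), and every parameter is illustrative:

      import math, random

      def dfree(x):                      # gradient of F(x) = 5*(x^2 - 1)^2
          return 20.0 * x * (x * x - 1.0)

      hills = []                         # deposited (centre, height, width)
      W, S = 0.1, 0.2

      def bias_force(x):
          return sum(w * (x - c) / (s * s) * math.exp(-(x - c) ** 2 / (2 * s * s))
                     for c, w, s in hills)

      random.seed(0)
      x, dt, kT = -1.0, 1e-3, 1.0
      for step in range(20000):          # overdamped Langevin dynamics
          x += (-dfree(x) + bias_force(x)) * dt \
               + math.sqrt(2 * kT * dt) * random.gauss(0, 1)
          if step % 200 == 0:
              hills.append((x, W, S))    # deposit a Gaussian at the walker

      def vbias(x):
          return sum(w * math.exp(-(x - c) ** 2 / (2 * s * s))
                     for c, w, s in hills)

      # The accumulated bias mirrors the free energy: a rough barrier estimate.
      print("barrier estimate:", vbias(-1.0) - vbias(0.0))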

  20. A fast, open source implementation of adaptive biasing potentials uncovers a ligand design strategy for the chromatin regulator BRD4

    PubMed Central

    Dickson, Bradley M.; Ramjan, Zachary H.; Xu, H. Eric

    2016-01-01

    In this communication we introduce an efficient implementation of adaptive biasing that greatly improves the speed of free energy computation in molecular dynamics simulations. We investigated the use of accelerated simulations to inform on compound design using a recently reported and clinically relevant inhibitor of the chromatin regulator BRD4 (bromodomain-containing protein 4). Benchmarking on our local compute cluster, our implementation achieves up to 2.5 times more force calls per day than plumed2. Results of five 1 μs-long simulations are presented, which reveal a conformational switch in the BRD4 inhibitor between a binding competent and incompetent state. Stabilization of the switch led to a −3 kcal/mol improvement of absolute binding free energy. These studies suggest an unexplored ligand design principle and offer new actionable hypotheses for medicinal chemistry efforts against this druggable epigenetic target class. PMID:27782467

  1. A Model of In vitro Plasticity at the Parallel Fiber—Molecular Layer Interneuron Synapses

    PubMed Central

    Lennon, William; Yamazaki, Tadashi; Hecht-Nielsen, Robert

    2015-01-01

    Theoretical and computational models of the cerebellum typically focus on the role of parallel fiber (PF)—Purkinje cell (PKJ) synapses for learned behavior, but few emphasize the role of the molecular layer interneurons (MLIs)—the stellate and basket cells. A number of recent experimental results suggest the role of MLIs is more important than previous models put forth. We investigate learning at PF—MLI synapses and propose a mathematical model to describe plasticity at this synapse. We perform computer simulations with this form of learning using a spiking neuron model of the MLI and show that it reproduces six in vitro experimental results in addition to simulating four novel protocols. Further, we show how this plasticity model can predict the results of other experimental protocols that are not simulated. Finally, we hypothesize what the biological mechanisms are for changes in synaptic efficacy that embody the phenomenological model proposed here. PMID:26733856

  2. New method of processing heat treatment experiments with numerical simulation support

    NASA Astrophysics Data System (ADS)

    Kik, T.; Moravec, J.; Novakova, I.

    2017-08-01

    In this work, the benefits of combining modern software for numerical simulation of welding processes with laboratory research are described. A new method of processing heat treatment experiments is proposed that yields the input data needed for numerical simulations of heat treatment of large parts. Using experiments on small test samples, it is now possible to simulate cooling conditions comparable with the cooling of larger parts. Results from this testing method make the boundary conditions used for the real cooling process more accurate, and they can also be used to improve software databases and to optimize computational models. The aim is to refine the computation of temperature fields for large hardening parts, based on a new method for determining the temperature dependence of the heat transfer coefficient into the hardening medium for a particular material, a defined maximum thickness of the processed part, and given cooling conditions. The paper also presents a comparison of standard and modified (per the newly suggested methodology) heat transfer coefficient data and their influence on the simulation results; even small changes influence mainly the distributions of temperature, metallurgical phases, hardness, and stress. The experiment also yields not only input data and data enabling optimization of the computational model but, at the same time, verification data. The greatest advantage of the described method is its independence of the type of cooling medium used.
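
    The key quantity here, a temperature-dependent heat transfer coefficient for the quenching medium, can be illustrated with a lumped-capacitance cooling computation. The sketch below assumes a hypothetical h(T) with a boiling-regime peak and steel-like properties; none of the numbers come from the paper:

      import math

      rho, cp, V, A = 7850.0, 600.0, 1e-3, 0.06   # steel-like sample (assumed)
      T, T_inf, dt = 850.0, 30.0, 0.1             # deg C, time step in s

      def h_of_T(T):
          # Hypothetical h(T), W/(m^2 K), peaking in the boiling regime.
          return 200.0 + 1800.0 * math.exp(-((T - 500.0) / 150.0) ** 2)

      curve = []
      for step in range(6001):                    # 600 s of cooling
          if step % 600 == 0:
              curve.append(round(T, 1))
          q = h_of_T(T) * A * (T - T_inf)         # Newton cooling, W
          T -= q * dt / (rho * cp * V)
      print(curve)                                # simulated cooling curve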

  3. Computational Study of Symmetric Methylation on Histone Arginine Catalyzed by Protein Arginine Methyltransferase PRMT5 through QM/MM MD and Free Energy Simulations

    DOE PAGES

    Yue, Yufei; Chu, Yuzhuo; Guo, Hong

    2015-01-01

    Protein arginine methyltransferases (PRMTs) catalyze the transfer of the methyl group from S-adenosyl-l-methionine (AdoMet) to arginine residues. There are three types of PRMTs (I, II and III) that produce different methylation products, including asymmetric dimethylarginine (ADMA), symmetric dimethylarginine (SDMA) and monomethylarginine (MMA). Since these different methylations can lead to different biological consequences, understanding the origin of product specificity of PRMTs is of considerable interest. In this article, quantum mechanical/molecular mechanical (QM/MM) molecular dynamics (MD) and free energy simulations are performed to study SDMA catalyzed by the Type II PRMT5, on the basis of the experimental observation that the dimethylated product is generated in a distributive fashion. The simulations have identified some important interactions and proton transfers during the catalysis. Similar to the cases involving Type I PRMTs, a conserved Glu residue (Glu435) in PRMT5 is suggested to function as the general base catalyst based on the results of the simulations. Moreover, our results show that PRMT5 has an energetic preference for the first methylation on N-1 followed by the second methylation on a different guanidino nitrogen of arginine (N-2). The first and second methyl transfers are estimated to have free energy barriers of 19-20 and 18-19 kcal/mol, respectively. The computer simulations suggest a distinctive catalytic mechanism of symmetric dimethylation that appears to differ from that of asymmetric dimethylation.

  4. Limits of quantitation - Yet another suggestion

    NASA Astrophysics Data System (ADS)

    Carlson, Jill; Wysoczanski, Artur; Voigtman, Edward

    2014-06-01

    The work presented herein suggests that the limit of quantitation concept may be rendered substantially less ambiguous and ultimately more useful as a figure of merit by basing it upon the significant figure and relative measurement error ideas due to Coleman, Auses and Gram, coupled with the correct instantiation of Currie's detection limit methodology. Simple theoretical results are presented for a linear, univariate chemical measurement system with homoscedastic Gaussian noise, and these are tested against both Monte Carlo computer simulations and laser-excited molecular fluorescence experimental results. Good agreement among experiment, theory and simulation is obtained and an easy extension to linearly heteroscedastic Gaussian noise is also outlined.
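
    As a concrete, deliberately simplified illustration: for a linear response y = a + b*x with homoscedastic Gaussian noise of standard deviation sigma, the content at which the back-computed result carries a 5% relative standard deviation is sigma/(0.05*b). The Monte Carlo sketch below checks this; the 5% criterion and all values are assumptions for illustration, not the paper's specific definition:

      import random

      a, b, sigma, RME = 0.0, 2.0, 0.1, 0.05
      x_loq = sigma / (RME * b)        # content with 5% relative error

      random.seed(0)
      xs = [(a + b * x_loq + random.gauss(0, sigma) - a) / b
            for _ in range(100000)]    # back-computed contents
      mean = sum(xs) / len(xs)
      sd = (sum((v - mean) ** 2 for v in xs) / len(xs)) ** 0.5
      print(x_loq, sd / mean)          # simulated RSD is close to 0.05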

  5. Courseware Review.

    ERIC Educational Resources Information Center

    Risley, John S.

    1983-01-01

    Reviews "Laws of Motion" computer program produced by Educational Materials and Equipment Company. The program (language unknown), for Apple II/II+, is a simulation of an inclined plane, free fall, and Atwood machine in Newtonian/Aristotelian worlds. Suggests use as supplement to discussion of motion by teacher who fully understands the…

  6. Feature Statistics Modulate the Activation of Meaning During Spoken Word Processing.

    PubMed

    Devereux, Barry J; Taylor, Kirsten I; Randall, Billi; Geertzen, Jeroen; Tyler, Lorraine K

    2016-03-01

    Understanding spoken words involves a rapid mapping from speech to conceptual representations. One distributed feature-based conceptual account assumes that the statistical characteristics of concepts' features--the number of concepts they occur in (distinctiveness/sharedness) and likelihood of co-occurrence (correlational strength)--determine conceptual activation. To test these claims, we investigated the role of distinctiveness/sharedness and correlational strength in speech-to-meaning mapping, using a lexical decision task and computational simulations. Responses were faster for concepts with higher sharedness, suggesting that shared features are facilitatory in tasks like lexical decision that require access to them. Correlational strength facilitated responses for slower participants, suggesting a time-sensitive co-occurrence-driven settling mechanism. The computational simulation showed similar effects, with early effects of shared features and later effects of correlational strength. These results support a general-to-specific account of conceptual processing, whereby early activation of shared features is followed by the gradual emergence of a specific target representation. Copyright © 2015 The Authors. Cognitive Science published by Cognitive Science Society, Inc.

  7. A simulation-based study on the influence of beam hardening in X-ray computed tomography for dimensional metrology.

    PubMed

    Lifton, Joseph J; Malcolm, Andrew A; McBride, John W

    2015-01-01

    X-ray computed tomography (CT) is a radiographic scanning technique for visualising cross-sectional images of an object non-destructively. From these cross-sectional images it is possible to evaluate internal dimensional features of a workpiece which may otherwise be inaccessible to tactile and optical instruments. Beam hardening is a physical process that degrades the quality of CT images and has previously been suggested to influence dimensional measurements. Using a validated simulation tool, the influence of spectrum pre-filtration and beam hardening correction is evaluated for internal and external dimensional measurements. Beam hardening is shown to influence internal and external dimensions in opposition, and to have a greater influence on outer dimensions than on inner dimensions. The results suggest that the combination of spectrum pre-filtration and a local gradient-based surface determination method can greatly reduce the influence of beam hardening in X-ray CT for dimensional metrology.
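
    The mechanism behind such errors can be shown in a few lines: for a polychromatic beam, the effective attenuation coefficient recovered from -ln(I/I0)/t falls with thickness, because low-energy photons are removed first. The spectrum and mu(E) below are toy functions, not the validated simulation tool used in the study:

      import numpy as np

      E = np.linspace(20.0, 120.0, 200)                        # keV
      spectrum = np.exp(-((E - 60.0) ** 2) / (2 * 20.0 ** 2))  # assumed source
      mu = 2.0 * (40.0 / E) ** 3 + 0.02                        # toy mu(E), 1/cm

      for t in [0.5, 1.0, 2.0, 4.0]:                           # thickness, cm
          I = np.sum(spectrum * np.exp(-mu * t))
          mu_eff = -np.log(I / np.sum(spectrum)) / t
          print(t, round(float(mu_eff), 3))                    # drops with t
      # Pre-filtering the spectrum (spectrum *= np.exp(-mu * t_filter))
      # flattens this dependence, which is why pre-filtration helps.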

  8. Computational Fluid Dynamics of Developing Avian Outflow Tract Heart Valves

    PubMed Central

    Bharadwaj, Koonal N.; Spitz, Cassie; Shekhar, Akshay; Yalcin, Huseyin C.; Butcher, Jonathan T.

    2012-01-01

    Hemodynamic forces play an important role in sculpting the embryonic heart and its valves. Alteration of blood flow patterns through the hearts of embryonic animal models leads to malformations that resemble some clinical congenital heart defects, but the precise mechanisms are poorly understood. Quantitative understanding of the local fluid forces acting in the heart has been elusive because of the extremely small and rapidly changing anatomy. In this study, we combine multiple imaging modalities with computational simulation to rigorously quantify the hemodynamic environment within the developing outflow tract (OFT) and its eventual aortic and pulmonary valves. In vivo Doppler ultrasound generated velocity profiles were applied to micro-computed tomography generated 3D OFT lumen geometries from Hamburger-Hamilton (HH) stage 16 to 30 chick embryos. Computational fluid dynamics simulation initial conditions were iterated until local flow profiles converged with in vivo Doppler flow measurements. Results suggested that flow in the early tubular OFT (HH16 and HH23) was best approximated by Poiseuille flow, while later embryonic OFT septation (HH27, HH30) was mimicked by plug flow conditions. Peak wall shear stress (WSS) values increased from 18.16 dynes/cm2 at HH16 to 671.24 dynes/cm2 at HH30. Spatiotemporally averaged WSS values also showed a monotonic increase from 3.03 dynes/cm2 at HH16 to 136.50 dynes/cm2 at HH30. Simulated velocity streamlines in the early heart suggest a lack of mixing, which differed from classical ink injections. Changes in local flow patterns preceded and correlated with key morphogenetic events such as OFT septation and valve formation. This novel method to quantify local dynamic hemodynamic parameters affords insight into the sculpting role of blood flow in the embryonic heart and provides a quantitative baseline dataset for future research. PMID:22535311
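
    For the early tubular stages, where Poiseuille flow was the better approximation, wall shear stress follows directly from flow rate and lumen radius via tau_w = 4*mu*Q/(pi*r^3). The values below are illustrative, not the measured HH16 data, but give the right order of magnitude:

      import math

      mu = 0.04        # dyn*s/cm^2, embryonic-blood-like viscosity (assumed)
      Q = 5.0e-4       # volumetric flow rate, cm^3/s (assumed)
      r = 0.012        # lumen radius, cm (assumed)
      tau_w = 4.0 * mu * Q / (math.pi * r ** 3)    # Poiseuille wall shear
      print(round(tau_w, 1), "dynes/cm^2")         # same order as HH16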

  9. Learning and evolution in bacterial taxis: an operational amplifier circuit modeling the computational dynamics of the prokaryotic 'two component system' protein network.

    PubMed

    Di Paola, Vieri; Marijuán, Pedro C; Lahoz-Beltra, Rafael

    2004-01-01

    Adaptive behavior in unicellular organisms (i.e., bacteria) depends on highly organized networks of proteins governing purposefully the myriad of molecular processes occurring within the cellular system. For instance, bacteria are able to explore the environment within which they develop by utilizing the motility of their flagellar system as well as a sophisticated biochemical navigation system that samples the environmental conditions surrounding the cell, searching for nutrients or moving away from toxic substances or dangerous physical conditions. In this paper we discuss how proteins of the intervening signal transduction network could be modeled as artificial neurons, simulating the dynamical aspects of the bacterial taxis. The model is based on the assumption that, in some important aspects, proteins can be considered as processing elements or McCulloch-Pitts artificial neurons that transfer and process information from the bacterium's membrane surface to the flagellar motor. This simulation of bacterial taxis has been carried out on a hardware realization of a McCulloch-Pitts artificial neuron using an operational amplifier. Based on the behavior of the operational amplifier we produce a model of the interaction between CheY and FliM, elements of the prokaryotic two component system controlling chemotaxis, as well as a simulation of learning and evolution processes in bacterial taxis. On the one side, our simulation results indicate that, computationally, these protein 'switches' are similar to McCulloch-Pitts artificial neurons, suggesting a bridge between evolution and learning in dynamical systems at cellular and molecular levels and the evolutive hardware approach. On the other side, important protein 'tactilizing' properties are not tapped by the model, and this suggests further complexity steps to explore in the approach to biological molecular computing.
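
    A software counterpart of the hardware unit takes only a few lines. The sketch below treats the CheY-P/FliM interaction as a McCulloch-Pitts threshold unit whose binary output selects tumbling versus running; the weights and threshold are illustrative, not fitted to the op-amp circuit:

      def mcculloch_pitts(inputs, weights, threshold):
          s = sum(i * w for i, w in zip(inputs, weights))
          return 1 if s >= threshold else 0   # 1 ~ tumble (CW), 0 ~ run (CCW)

      weights, threshold = [0.8, -0.5], 0.4   # [CheY-P level, attractant]
      for chey_p, attractant in [(1, 0), (1, 1), (0, 0), (0, 1)]:
          out = mcculloch_pitts([chey_p, attractant], weights, threshold)
          print(chey_p, attractant, "->", out)  # attractant suppresses tumbling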

  10. Quantum analogue computing.

    PubMed

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.

  11. Data mining through simulation.

    PubMed

    Lytton, William W; Stewart, Mark

    2007-01-01

    Data integration is particularly difficult in neuroscience; we must organize vast amounts of data around only a few fragmentary functional hypotheses. It has often been noted that computer simulation, by providing explicit hypotheses for a particular system and bridging across different levels of organization, can provide an organizational focus, which can be leveraged to form substantive hypotheses. Simulations lend meaning to data and can be updated and adapted as further data come in. The use of simulation in this context suggests the need for simulator adjuncts to manage and evaluate data. We have developed a neural query system (NQS) within the NEURON simulator, providing a relational database system, a query function, and basic data-mining tools. NQS is used within the simulation context to manage, verify, and evaluate model parameterizations. More importantly, it is used for data mining of simulation data and comparison with neurophysiology.

  12. A Structured-Inquiry Approach to Teaching Neurophysiology Using Computer Simulation

    PubMed Central

    Crisp, Kevin M.

    2012-01-01

    Computer simulation is a valuable tool for teaching the fundamentals of neurophysiology in undergraduate laboratories where time and equipment limitations restrict the amount of course content that can be delivered through hands-on interaction. However, students often find such exercises to be tedious and unstimulating. In an effort to engage students in the use of computational modeling while developing a deeper understanding of neurophysiology, an attempt was made to use an educational neurosimulation environment as the basis for a novel, inquiry-based research project. During the semester, students in the class wrote a research proposal, used the Neurodynamix II simulator to generate a large data set, analyzed their modeling results statistically, and presented their findings at the Midbrains Neuroscience Consortium undergraduate poster session. Learning was assessed in the form of a series of short term papers and two 10-min in-class writing responses to the open-ended question, “How do ion channels influence neuronal firing?”, which they completed on weeks 6 and 15 of the semester. Students’ answers to this question showed a deeper understanding of neuronal excitability after the project; their term papers revealed evidence of critical thinking about computational modeling and neuronal excitability. Suggestions for the adaptation of this structured-inquiry approach into shorter term lab experiences are discussed. PMID:23494064

  13. Optimization behavior of brainstem respiratory neurons. A cerebral neural network model.

    PubMed

    Poon, C S

    1991-01-01

    A recent model of respiratory control suggested that the steady-state respiratory responses to CO2 and exercise may be governed by an optimal control law in the brainstem respiratory neurons. It was not certain, however, whether such complex optimization behavior could be accomplished by a realistic biological neural network. To test this hypothesis, we developed a hybrid computer-neural model in which the dynamics of the lung, brain and other tissue compartments were simulated on a digital computer. Mimicking the "controller" was a human subject who pedalled on a bicycle with varying speed (analog of ventilatory output) with a view to minimize an analog signal of the total cost of breathing (chemical and mechanical) which was computed interactively and displayed on an oscilloscope. In this manner, the visuomotor cortex served as a proxy (homolog) of the brainstem respiratory neurons in the model. Results in 4 subjects showed a linear steady-state ventilatory CO2 response to arterial PCO2 during simulated CO2 inhalation and a nearly isocapnic steady-state response during simulated exercise. Thus, neural optimization is a plausible mechanism for respiratory control during exercise and can be achieved by a neural network with cognitive computational ability without the need for an exercise stimulus.

  14. Student Ability, Confidence, and Attitudes Toward Incorporating a Computer into a Patient Interview.

    PubMed

    Ray, Sarah; Valdovinos, Katie

    2015-05-25

    To improve pharmacy students' ability to effectively incorporate a computer into a simulated patient encounter and to improve their awareness of barriers and attitudes towards and their confidence in using a computer during simulated patient encounters. Students completed a survey that assessed their awareness of, confidence in, and attitudes towards computer use during simulated patient encounters. Students were evaluated with a rubric on their ability to incorporate a computer into a simulated patient encounter. Students were resurveyed and reevaluated after instruction. Students improved in their ability to effectively incorporate computer usage into a simulated patient encounter. They also became more aware of and improved their attitudes toward barriers regarding such usage and gained more confidence in their ability to use a computer during simulated patient encounters. Instruction can improve pharmacy students' ability to incorporate a computer into simulated patient encounters. This skill is critical to developing efficiency while maintaining rapport with patients.

  15. Comparative simulation study of chemical synthesis of functional DADNE material.

    PubMed

    Liu, Min Hsien; Liu, Chuan Wen

    2017-01-01

    Amorphous molecular simulation to model the reaction species in the synthesis of the chemically inert and energetic explosive material 1,1-diamino-2,2-dinitroethene (DADNE) was performed in this work. Nitromethane was selected as the starting reactant to undergo halogenation, nitration, deprotonation, intermolecular condensation, and dehydration to produce the target DADNE product. The Materials Studio (MS) Forcite program allowed fast energy calculations and reliable geometric optimization of all aqueous molecular reaction systems (0.1-0.5 M) at 283 K and 298 K. The MS Forcite-computed and Gaussian polarizable continuum model (PCM)-computed results were analyzed and compared in order to explore feasible reaction pathways under suitable conditions for the synthesis of DADNE. The theoretical simulation revealed that the synthesis is possible: according to the MS calculations of the energy barriers at each stage at 283 K, a total energy barrier of 449.6 kJ mol-1 must be overcome to carry out the reaction, as shown by the reaction profiles. Local analysis of intermolecular interactions, together with calculation of the stabilization energy of each reaction system, provided information that can be used as a reference regarding integrated molecular stability. Graphical abstract: Materials Studio software has been suggested for the computation and simulation of DADNE synthesis.

  16. Implementation of tetrahedral-mesh geometry in Monte Carlo radiation transport code PHITS

    NASA Astrophysics Data System (ADS)

    Furuta, Takuya; Sato, Tatsuhiko; Han, Min Cheol; Yeom, Yeon Soo; Kim, Chan Hyeong; Brown, Justin L.; Bolch, Wesley E.

    2017-06-01

    A new function to treat tetrahedral-mesh geometry was implemented in the Particle and Heavy Ion Transport code System (PHITS). To accelerate the computational speed of the transport process, an original algorithm was introduced that initially prepares decomposition maps for the container box of the tetrahedral-mesh geometry. The computational performance was tested by conducting radiation transport simulations of 100 MeV protons and 1 MeV photons in a water phantom represented by a tetrahedral mesh. The simulation was repeated with a varying number of meshes, and the required computational times were then compared with those of the conventional voxel representation. Our results show that the computational costs for each boundary crossing of a region mesh are essentially equivalent for the two representations. This study suggests that the tetrahedral-mesh representation offers not only a flexible description of the transport geometry but also improved computational efficiency for radiation transport. Because tetrahedra are adaptable in both size and shape, dosimetrically equivalent objects can be represented by far fewer tetrahedral elements than voxels. Our study additionally included dosimetric calculations using a computational human phantom. A significant acceleration of the computational speed, about 4 times, was confirmed by the adoption of a tetrahedral mesh over the traditional voxel mesh geometry.
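
    The abstract does not spell out the decomposition-map algorithm, but a generic version of the idea is a uniform grid over the container box whose cells list the overlapping tetrahedra, so that locating a particle tests only a handful of candidates rather than every element. A minimal sketch under that assumption (not the PHITS code itself):

      import numpy as np

      def bary(tet, p):                      # barycentric coordinates of p
          a, b, c, d = tet
          lam = np.linalg.solve(np.column_stack((b - a, c - a, d - a)), p - a)
          return np.array([1.0 - lam.sum(), *lam])

      def build_map(tets, n, lo, hi):        # the "decomposition map"
          grid = {}
          for idx, tet in enumerate(tets):   # bin each tet by bounding box
              cmin = ((tet.min(0) - lo) / (hi - lo) * n).astype(int).clip(0, n - 1)
              cmax = ((tet.max(0) - lo) / (hi - lo) * n).astype(int).clip(0, n - 1)
              for i in range(cmin[0], cmax[0] + 1):
                  for j in range(cmin[1], cmax[1] + 1):
                      for k in range(cmin[2], cmax[2] + 1):
                          grid.setdefault((i, j, k), []).append(idx)
          return grid

      def locate(p, tets, grid, n, lo, hi):
          cell = tuple(((p - lo) / (hi - lo) * n).astype(int).clip(0, n - 1))
          for idx in grid.get(cell, []):     # few candidates, not all tets
              if (bary(tets[idx], p) >= -1e-12).all():
                  return idx
          return None                        # point lies outside the mesh

      tets = [np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float),
              np.array([[1, 1, 1], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)]
      lo, hi, n = np.zeros(3), np.ones(3), 4
      grid = build_map(tets, n, lo, hi)
      print(locate(np.array([0.1, 0.1, 0.1]), tets, grid, n, lo, hi))  # -> 0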

  17. Implementation of tetrahedral-mesh geometry in Monte Carlo radiation transport code PHITS.

    PubMed

    Furuta, Takuya; Sato, Tatsuhiko; Han, Min Cheol; Yeom, Yeon Soo; Kim, Chan Hyeong; Brown, Justin L; Bolch, Wesley E

    2017-06-21

    A new function to treat tetrahedral-mesh geometry was implemented in the Particle and Heavy Ion Transport code System (PHITS). To accelerate the computational speed of the transport process, an original algorithm was introduced that initially prepares decomposition maps for the container box of the tetrahedral-mesh geometry. The computational performance was tested by conducting radiation transport simulations of 100 MeV protons and 1 MeV photons in a water phantom represented by a tetrahedral mesh. The simulation was repeated with a varying number of meshes, and the required computational times were then compared with those of the conventional voxel representation. Our results show that the computational costs for each boundary crossing of a region mesh are essentially equivalent for the two representations. This study suggests that the tetrahedral-mesh representation offers not only a flexible description of the transport geometry but also improved computational efficiency for radiation transport. Because tetrahedra are adaptable in both size and shape, dosimetrically equivalent objects can be represented by far fewer tetrahedral elements than voxels. Our study additionally included dosimetric calculations using a computational human phantom. A significant acceleration of the computational speed, about 4 times, was confirmed by the adoption of a tetrahedral mesh over the traditional voxel mesh geometry.

  18. Quantifying the influence of twin boundaries on the deformation of nanocrystalline copper using atomistic simulations

    DOE PAGES

    Tucker, Garritt J.; Foiles, Stephen Martin

    2014-09-22

    Over the past decade, numerous efforts have sought to understand the influence of twin boundaries on the behavior of polycrystalline materials. Early results suggested that twin boundaries within nanocrystalline face-centered cubic metals have a considerable effect on material behavior by altering the activated deformation mechanisms. In this work, we employ molecular dynamics simulations to elucidate the role of twin boundaries in the deformation of <100> columnar nanocrystalline copper at room temperature under uniaxial strain. We leverage non-local kinematic metrics, formulated from continuum mechanics theory, to compute atomically-resolved rotational and strain fields during plastic deformation. These results are then utilized to compute the distribution of various nanoscale mechanisms during straining and to quantitatively resolve their contribution to the total strain accommodation within the microstructure, highlighting the fundamental role of twin boundaries. Our results show that nanoscale twins influence nanocrystalline copper by altering the cooperation of fundamental deformation mechanisms and their contributions to strain accommodation, and we present new methods for extracting useful information from atomistic simulations. The simulation results suggest a tension-compression asymmetry in the distribution of deformation mechanisms and in strain accommodation by either dislocations or twin boundary mechanisms. In highly twinned microstructures, twin boundary migration can become a significant deformation mode, in comparison to lattice dislocation plasticity in non-twinned columnar microstructures, especially during compression.

  19. Can one trust quantum simulators?

    PubMed

    Hauke, Philipp; Cucchietti, Fernando M; Tagliacozzo, Luca; Deutsch, Ivan; Lewenstein, Maciej

    2012-08-01

    Various fundamental phenomena of strongly correlated quantum systems such as high-Tc superconductivity, the fractional quantum-Hall effect and quark confinement are still awaiting a universally accepted explanation. The main obstacle is the computational complexity of solving even the most simplified theoretical models which are designed to capture the relevant quantum correlations of the many-body system of interest. In his seminal 1982 paper (Feynman 1982 Int. J. Theor. Phys. 21 467), Richard Feynman suggested that such models might be solved by 'simulation' with a new type of computer whose constituent parts are effectively governed by a desired quantum many-body dynamics. Measurements on this engineered machine, now known as a 'quantum simulator,' would reveal some unknown or difficult to compute properties of a model of interest. We argue that a useful quantum simulator must satisfy four conditions: relevance, controllability, reliability and efficiency. We review the current state of the art of digital and analog quantum simulators. Whereas so far the majority of the focus, both theoretically and experimentally, has been on controllability of relevant models, we emphasize here the need for a careful analysis of reliability and efficiency in the presence of imperfections. We discuss how disorder and noise can impact these conditions, and illustrate our concerns with novel numerical simulations of a paradigmatic example: a disordered quantum spin chain governed by the Ising model in a transverse magnetic field. We find that disorder can decrease the reliability of an analog quantum simulator of this model, although large errors in local observables are introduced only for strong levels of disorder. We conclude that the answer to the question 'Can we trust quantum simulators?' is … to some extent.
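
    The paradigmatic example is small enough to solve exactly on a classical computer, which is how the reliability of an analog device would be benchmarked. The sketch below exactly diagonalizes an 8-spin transverse-field Ising chain with disordered couplings; the disorder strength and seed are arbitrary choices:

      import numpy as np

      N, g, W = 8, 1.0, 0.5                    # spins, field, disorder strength
      rng = np.random.default_rng(0)
      J = 1.0 + W * rng.uniform(-1, 1, N - 1)  # disordered couplings

      sx = np.array([[0, 1], [1, 0]], float)
      sz = np.array([[1, 0], [0, -1]], float)

      def op(single, site):                    # embed a 1-site operator
          mats = [np.eye(2)] * N
          mats[site] = single
          out = mats[0]
          for m in mats[1:]:
              out = np.kron(out, m)
          return out

      H = -sum(J[i] * op(sz, i) @ op(sz, i + 1) for i in range(N - 1))
      H = H - g * sum(op(sx, i) for i in range(N))
      print("ground energy per spin:", np.linalg.eigvalsh(H)[0] / N)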

  20. CARES/LIFE Software Commercialization

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The NASA Lewis Research Center has entered into a letter agreement with BIOSYM Technologies Inc. (now merged with Molecular Simulations Inc. (MSI)). Under this agreement, NASA will provide a developmental copy of the CARES/LIFE computer program to BIOSYM for evaluation. This computer code predicts the time-dependent reliability of a thermomechanically loaded component. BIOSYM will become familiar with CARES/LIFE, provide results of computations useful in validating the code, evaluate it for potential commercialization, and submit suggestions for improvements or extensions to the code or its documentation. If BIOSYM/Molecular Simulations reaches a favorable evaluation of CARES/LIFE, NASA will enter into negotiations for a cooperative agreement with BIOSYM/Molecular Simulations to further develop the code--adding features such as a user-friendly interface and other improvements. This agreement would give BIOSYM intellectual property rights in the modified codes, which they could protect and then commercialize. NASA would provide BIOSYM with the NASA-developed source codes and would agree to cooperate with BIOSYM in further developing the code. In return, NASA would receive certain use rights in the modified CARES/LIFE program. Presently BIOSYM Technologies Inc. has been involved with integration issues concerning its merger with Molecular Simulations Inc., since both companies used to compete in the computational chemistry market, and to some degree, in the materials market. Consequently, evaluation of the CARES/LIFE software is on hold for a month or two while the merger is finalized. Their interest in CARES continues, however, and they expect to get back to the evaluation by early November 1995.

  1. How far in-silico computing meets real experiments. A study on the structure and dynamics of spin labeled vinculin tail protein by molecular dynamics simulations and EPR spectroscopy

    PubMed Central

    2013-01-01

    Background: Investigation of conformational changes in a protein is a prerequisite to understanding its biological function. To explore these conformational changes in proteins, we developed a strategy combining molecular dynamics (MD) simulations and electron paramagnetic resonance (EPR) spectroscopy. The major goal of this work is to investigate how far computer simulations can meet the experiments. Methods: The vinculin tail protein is chosen as a model system, as conformational changes within the vinculin protein are believed to be important for its biological function at sites of cell adhesion. MD simulations were performed on the vinculin tail protein in both water and in vacuo environments. EPR experimental data are compared with the simulated data for corresponding spin label positions. Results: The EPR spectra calculated from the MD simulation trajectories of selected spin labelled positions are comparable to the experimental EPR spectra. The results show that the information contained in the spin label mobility provides a powerful means of mapping protein folds and their conformational changes. Conclusions: The results suggest the localization of dynamic and flexible regions of the vinculin tail protein. This study shows that MD simulations can be used as a complementary tool to interpret experimental EPR data. PMID:23445506

  2. Properties of Organic Liquids when Simulated with Long-Range Lennard-Jones Interactions.

    PubMed

    Fischer, Nina M; van Maaren, Paul J; Ditz, Jonas C; Yildirim, Ahmet; van der Spoel, David

    2015-07-14

    In order to increase the accuracy of classical computer simulations, existing methodologies may need to be adapted. Hitherto, most force fields employ a truncated potential function to model van der Waals interactions, sometimes augmented with an analytical correction. Although such corrections are accurate for homogeneous systems with a long cutoff, they should not be used in inherently inhomogeneous systems such as biomolecular and interface systems. For such cases, a variant of the particle mesh Ewald algorithm (Lennard-Jones PME) was already proposed 20 years ago (Essmann et al. J. Chem. Phys. 1995, 103, 8577-8593), but it was implemented only recently (Wennberg et al. J. Chem. Theory Comput. 2013, 9, 3527-3537) in a major simulation code (GROMACS). The availability of this method allows surface tensions of liquids as well as bulk properties to be established, such as density and enthalpy of vaporization, without approximations due to truncation. Here, we report on simulations of ≈150 liquids (taken from a force field benchmark: Caleman et al. J. Chem. Theory Comput. 2012, 8, 61-74) using three different force fields and compare simulations with and without explicit long-range van der Waals interactions. We find that the density and enthalpy of vaporization increase for most liquids using the generalized Amber force field (GAFF, Wang et al. J. Comput. Chem. 2004, 25, 1157-1174) and the Charmm generalized force field (CGenFF, Vanommeslaeghe et al. J. Comput. Chem. 2010, 31, 671-690) but less so for OPLS/AA (Jorgensen and Tirado-Rives, Proc. Natl. Acad. Sci. U.S.A. 2005, 102, 6665-6670), which was parametrized with an analytical correction to the van der Waals potential. The surface tension increases by ≈10(-2) N/m for all force fields. These results suggest that van der Waals attractions in force fields are too strong, in particular for the GAFF and CGenFF. In addition to the simulation results, we introduce a new version of a web server, http://virtualchemistry.org, aimed at facilitating sharing and reuse of input files for molecular simulations.
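
    For reference, the analytical tail correction that cutoff-based simulations add for the dispersion term in a homogeneous fluid, and that LJ-PME renders unnecessary, is U_tail/N = (8/3)*pi*rho*eps*sigma^3*((1/3)(sigma/rc)^9 - (sigma/rc)^3). The short computation below evaluates it for argon-like parameters (assumed values, not taken from the benchmark set):

      import math

      eps, sigma = 0.996, 0.3405    # kJ/mol, nm (argon-like, assumed)
      rho = 21.0                    # number density, nm^-3 (assumed)

      def u_tail_per_particle(rc):
          sr3 = (sigma / rc) ** 3
          return (8.0 / 3.0) * math.pi * rho * eps * sigma ** 3 \
                 * (sr3 ** 3 / 3.0 - sr3)

      for rc in [0.9, 1.2, 1.5]:    # cutoff, nm
          # negative (net attraction beyond rc); shrinks as rc grows
          print(rc, round(u_tail_per_particle(rc), 4), "kJ/mol")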

  3. Using flight simulators aboard ships: human side effects of an optimal scenario with smooth seas.

    PubMed

    Muth, Eric R; Lawson, Ben

    2003-05-01

    The U.S. Navy is considering placing flight simulators aboard ships. It is known that certain types of flight simulators can elicit motion adaptation syndrome (MAS), and also that certain types of ship motion can cause MAS. The goal of this study was to determine if using a flight simulator during ship motion would cause MAS, even when the simulator stimulus and the ship motion were both very mild. All participants in this study completed three conditions. Condition 1 (Sim) entailed "flying" a personal computer-based flight simulator situated on land. Condition 2 (Ship) involved riding aboard a U.S. Navy Yard Patrol boat. Condition 3 (ShipSim) entailed "flying" a personal computer-based flight simulator while riding aboard a Yard Patrol boat. Before and after each condition, participants' balance and dynamic visual acuity were assessed. After each condition, participants filled out the Nausea Profile and the Simulator Sickness Questionnaire. Following exposure to a flight simulator aboard a ship, participants reported negligible symptoms of nausea and simulator sickness. However, participants exhibited a decrease in dynamic visual acuity after exposure to the flight simulator aboard ship (T[25] = 3.61, p < 0.05). Balance results were confounded by significant learning and, therefore, not interpretable. This study suggests that flight simulators can be used aboard ship. As a minimal safety precaution, these simulators should be used according to current safety practices for land-based simulators. Optimally, these simulators should be designed to minimize MAS, located near the ship's center of rotation and used when ship motion is not provocative.

  4. Analysis of skin tissues spatial fluorescence distribution by the Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Churmakov, D. Y.; Meglinski, I. V.; Piletsky, S. A.; Greenhalgh, D. A.

    2003-07-01

    A novel Monte Carlo technique for simulating the spatial fluorescence distribution within human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores that would arise due to the structure of collagen fibres, in contrast to the epidermis and stratum corneum, where the distribution of fluorophores is assumed to be homogeneous. The results of the simulation suggest that the distribution of auto-fluorescence is significantly suppressed in the near-infrared spectral region, whereas the spatial distribution of fluorescence sources within a sensor layer embedded in the epidermis is localized at an 'effective' depth.
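
    The sampling at the heart of such a simulation is compact. The sketch below follows photons into a homogeneous half-space using the reduced (similarity) scattering coefficient and records where they are absorbed, i.e. candidate fluorescence sites; the optical coefficients are illustrative, not the skin-model values:

      import math, random

      mu_a = 0.1          # absorption, 1/mm (assumed)
      mu_s_red = 1.0      # reduced scattering mu_s*(1 - g), 1/mm (assumed)
      mu_t = mu_a + mu_s_red

      def absorbed_depth():
          z, cos_t = 0.0, 1.0                    # photon enters along +z
          while True:
              z += -math.log(1.0 - random.random()) / mu_t * cos_t
              if z < 0.0:
                  return None                    # escaped the surface
              if random.random() < mu_a / mu_t:
                  return z                       # absorbed: fluorescence site
              cos_t = 2.0 * random.random() - 1.0  # isotropic re-direction

      random.seed(0)
      depths = [d for d in (absorbed_depth() for _ in range(20000))
                if d is not None]
      print(round(sum(depths) / len(depths), 2), "mm mean absorption depth")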

  5. Development of simulation computer complex specification

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks which included, (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.

  6. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulations methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  7. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  8. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Astrophysics Data System (ADS)

    Chamis, C. C.; Singhal, S. N.

    1993-02-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  9. Brief Report: Simulations Suggest Heterogeneous Category Learning and Generalization in Children with Autism is a Result of Idiosyncratic Perceptual Transformations.

    PubMed

    Mercado, Eduardo; Church, Barbara A

    2016-08-01

    Children with autism spectrum disorder (ASD) sometimes have difficulties learning categories. Past computational work suggests that such deficits may result from atypical representations in cortical maps. Here we use neural networks to show that idiosyncratic transformations of inputs can result in the formation of feature maps that impair category learning for some inputs, but not for other closely related inputs. These simulations suggest that large inter- and intra-individual variations in learning capacities shown by children with ASD across similar categorization tasks may similarly result from idiosyncratic perceptual encoding that is resistant to experience-dependent changes. If so, then both feedback- and exposure-based category learning should lead to heterogeneous, stimulus-dependent deficits in children with ASD.

  10. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    PubMed

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
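
    Of the four approaches, discrete-event simulation is the one most commonly applied to ED patient flow, and its core fits in a page. The sketch below models a single-stream ED with a fixed number of beds and exponential interarrival and treatment times; all rates are illustrative placeholders, not calibrated values:

      import heapq, random

      random.seed(1)
      ARRIVE, SERVE, BEDS, HORIZON = 10.0, 25.0, 3, 10000.0  # minutes (assumed)

      events = [(random.expovariate(1 / ARRIVE), "arrival")]
      busy, queue, waits = 0, [], []
      while events:
          t, kind = heapq.heappop(events)        # next event in time order
          if t > HORIZON:
              break
          if kind == "arrival":
              heapq.heappush(events, (t + random.expovariate(1 / ARRIVE), "arrival"))
              if busy < BEDS:                    # a bed is free: treat now
                  busy += 1
                  waits.append(0.0)
                  heapq.heappush(events, (t + random.expovariate(1 / SERVE), "departure"))
              else:
                  queue.append(t)                # otherwise join the queue
          else:                                  # departure frees a bed
              if queue:
                  waits.append(t - queue.pop(0))
                  heapq.heappush(events, (t + random.expovariate(1 / SERVE), "departure"))
              else:
                  busy -= 1

      print(len(waits), "patients; mean wait", round(sum(waits) / len(waits), 1))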

  11. Flood Scenario Simulation and Disaster Estimation of Ba-Ma Creek Watershed in Nantou County, Taiwan

    NASA Astrophysics Data System (ADS)

    Peng, S. H.; Hsu, Y. K.

    2018-04-01

    The present study performed several scenario simulations of flood disaster based on a historical flood event and planning requirements in the Ba-Ma Creek Watershed, located in Nantou County, Taiwan. The simulations were made using FLO-2D, a numerical model that computes flood velocity and depth over two-dimensional terrain. The calculated data were then used to estimate the possible damage incurred by the flood disaster, and the results can serve as references for disaster prevention. Moreover, the simulated results could be employed for flood disaster estimation using the method suggested by the Water Resources Agency of Taiwan. Finally, conclusions and perspectives are presented.

  12. Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity

    PubMed Central

    Nessler, Bernhard; Pfeiffer, Michael; Buesing, Lars; Maass, Wolfgang

    2013-01-01

    The principles by which networks of neurons compute, and how spike-timing dependent plasticity (STDP) of synaptic weights generates and maintains their computational function, are unknown. Preceding work has shown that soft winner-take-all (WTA) circuits, where pyramidal neurons inhibit each other via interneurons, are a common motif of cortical microcircuits. We show through theoretical analysis and computer simulations that Bayesian computation is induced in these network motifs through STDP in combination with activity-dependent changes in the excitability of neurons. The fundamental components of this emergent Bayesian computation are priors that result from adaptation of neuronal excitability and implicit generative models for hidden causes that are created in the synaptic weights through STDP. In fact, a surprising result is that STDP is able to approximate a powerful principle for fitting such implicit generative models to high-dimensional spike inputs: Expectation Maximization. Our results suggest that the experimentally observed spontaneous activity and trial-to-trial variability of cortical neurons are essential features of their information processing capability, since their functional role is to represent probability distributions rather than static neural codes. Furthermore it suggests networks of Bayesian computation modules as a new model for distributed information processing in the cortex. PMID:23633941

  13. The Simultaneous Production Model; A Model for the Construction, Testing, Implementation and Revision of Educational Computer Simulation Environments.

    ERIC Educational Resources Information Center

    Zillesen, Pieter G. van Schaick

    This paper introduces a hardware and software independent model for producing educational computer simulation environments. The model, which is based on the results of 32 studies of educational computer simulations program production, implies that educational computer simulation environments are specified, constructed, tested, implemented, and…

  14. The Learning Effects of Computer Simulations in Science Education

    ERIC Educational Resources Information Center

    Rutten, Nico; van Joolingen, Wouter R.; van der Veen, Jan T.

    2012-01-01

    This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to improve learning processes and outcomes. We report on…

  15. Soft-error tolerance and energy consumption evaluation of embedded computer with magnetic random access memory in practical systems using computer simulations

    NASA Astrophysics Data System (ADS)

    Nebashi, Ryusuke; Sakimura, Noboru; Sugibayashi, Tadahiko

    2017-08-01

    We evaluated the soft-error tolerance and energy consumption of an embedded computer with magnetic random access memory (MRAM) using two computer simulators. One is a central processing unit (CPU) simulator of a typical embedded computer system. We simulated the radiation-induced single-event-upset (SEU) probability in a spin-transfer-torque MRAM cell and also the failure rate of a typical embedded computer due to its main memory SEU error. The other is a delay tolerant network (DTN) system simulator. It simulates the power dissipation of wireless sensor network nodes of the system using a revised CPU simulator and a network simulator. We demonstrated that the SEU effect on the embedded computer with 1 Gbit MRAM-based working memory is less than 1 failure in time (FIT). We also demonstrated that the energy consumption of the DTN sensor node with MRAM-based working memory can be reduced to 1/11. These results indicate that MRAM-based working memory enhances the disaster tolerance of embedded computers.
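
    As a back-of-the-envelope illustration of the failure-rate metric used here, the sketch below converts an assumed per-bit single-event-upset rate into a FIT value (failures per 10^9 device-hours). The per-bit rate is a placeholder, not a number from the record.

        # FIT = expected failures per 10**9 device-hours.
        seu_per_bit_hour = 1.0e-18    # assumed SEU rate of one MRAM cell (1/hour)
        bits = 2 ** 30                # 1 Gbit working memory
        fit = seu_per_bit_hour * bits * 1.0e9
        print(fit)                    # ~1.07 FIT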

  16. Towards Full Aircraft Airframe Noise Prediction: Detached Eddy Simulations

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Mineck, Raymond E.

    2014-01-01

    Results from a computational study on the aeroacoustic characteristics of an 18%-scale, semi-span Gulfstream aircraft model are presented in this paper. NASA's FUN3D unstructured compressible Navier-Stokes solver was used to perform steady and unsteady simulations of the flow field associated with this high-fidelity aircraft model. Solutions were obtained for free-air at a Mach number of 0.2 with the flap deflected at 39 deg, with the main gear off and on (the two baseline configurations). Initially, the study focused on accurately predicting the prominent noise sources at both flap tips for the baseline configuration with deployed flap only. Building upon the experience gained from this initial effort, subsequent work involved the full landing configuration with both flap and main landing gear deployed. For the unsteady computations, we capitalized on the Detached Eddy Simulation capability of FUN3D to capture the complex time-dependent flow features associated with the flap and main gear. To resolve the noise sources over a broad frequency range, the tailored grid was very dense near the flap inboard and outboard tips and the region surrounding the gear. Extensive comparison of the computed steady and unsteady surface pressures with wind tunnel measurements showed good agreement for the global aerodynamic characteristics and the local flow field at the flap inboard tip. However, the computed pressure coefficients indicated that a zone of separated flow that forms in the vicinity of the outboard tip is larger in extent along the flap span and chord than measurements suggest. Computed farfield acoustic characteristics from a FW-H integral approach that used the simulated pressures on the model solid surface were in excellent agreement with corresponding measurements.

  17. A Framework for Image-Based Modeling of Acute Myocardial Ischemia Using Intramurally Recorded Extracellular Potentials.

    PubMed

    Burton, Brett M; Aras, Kedar K; Good, Wilson W; Tate, Jess D; Zenger, Brian; MacLeod, Rob S

    2018-05-21

    The biophysical basis for electrocardiographic evaluation of myocardial ischemia stems from the notion that ischemic tissues develop, with relative uniformity, along the endocardial aspects of the heart. These injured regions of subendocardial tissue give rise to intramural currents that lead to ST segment deflections within electrocardiogram (ECG) recordings. The concept of subendocardial ischemic regions is often used in clinical practice, providing a simple and intuitive description of ischemic injury; however, such a model grossly oversimplifies the presentation of ischemic disease, inadvertently leading to errors in ECG-based diagnoses. Furthermore, recent experimental studies have brought into question the subendocardial ischemia paradigm suggesting instead a more distributed pattern of tissue injury. These findings come from experiments and so have both the impact and the limitations of measurements from living organisms. Computer models have often been employed to overcome the constraints of experimental approaches and have a robust history in cardiac simulation. To this end, we have developed a computational simulation framework aimed at elucidating the effects of ischemia on measurable cardiac potentials. To validate our framework, we simulated, visualized, and analyzed 226 experimentally derived acute myocardial ischemic events. Simulation outcomes agreed both qualitatively (feature comparison) and quantitatively (correlation, average error, and significance) with experimentally obtained epicardial measurements, particularly under conditions of elevated ischemic stress. Our simulation framework introduces a novel approach to incorporating subject-specific, geometric models and experimental results that are highly resolved in space and time into computational models. We propose this framework as a means to advance the understanding of the underlying mechanisms of ischemic disease while simultaneously putting in place the computational infrastructure necessary to study and improve ischemia models aimed at reducing diagnostic errors in the clinic.

  18. Flame-Vortex Studies to Quantify Markstein Numbers Needed to Model Flame Extinction Limits

    NASA Technical Reports Server (NTRS)

    Driscoll, James F.; Feikema, Douglas A.

    2003-01-01

    This work has quantified a database of Markstein numbers for unsteady flames; future work will quantify a database of flame extinction limits for unsteady conditions. Unsteady extinction limits have not been documented previously; both a stretch rate and a residence time must be measured, since extinction requires that the stretch rate be sufficiently large for a sufficiently long residence time. The Markstein number (Ma) was measured for an inwardly-propagating flame (IPF) that is negatively stretched under microgravity conditions. Computations also were performed using RUN-1DL to explain the measurements. The Markstein number of an inwardly-propagating flame, for both the microgravity experiment and the computations, is significantly larger than that of an outwardly-propagating flame (OPF). The computed profiles of the various species within the flame suggest reasons for this difference: computed hydrogen concentrations build up ahead of the IPF but not the OPF. Understanding was gained by running the computations for both simplified and full-chemistry conditions. To explain the experimental findings, numerical simulations of both inwardly and outwardly propagating spherical flames (with complex chemistry) were generated using the RUN-1DL code, which includes 16 species and 46 reactions.

  19. Simulations Using Random-Generated DNA and RNA Sequences

    ERIC Educational Resources Information Center

    Bryce, C. F. A.

    1977-01-01

    Using a very simple computer program written in BASIC, a very large number of randomly generated DNA or RNA sequences can be obtained. Students use these sequences to predict complementary sequences and translational products, evaluate base compositions, determine frequencies of particular triplet codons, and suggest possible secondary structures.…

  20. Control and Stabilization: Making Millikan's Oil Drop Experiment Work

    ERIC Educational Resources Information Center

    Muller-Hill, Christoph; Heering, Peter

    2011-01-01

    Educational versions of Millikan's oil-drop experiment have frequently been criticized; suggestions for improvement either focus on technical innovations of the setup or on replacing the experiment by other approaches of familiarization, such as computer simulations. In our approach, we have analysed experimental procedures. In doing so, we were…

  1. Double layers without current

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkins, F.W.; Sun, Y.C.

    1980-11-01

    The steady-state solution of the nonlinear Vlasov-Poisson equations is reduced to a nonlinear eigenvalue problem for the case of double-layer (potential drop) boundary conditions. Solutions with no relative electron-ion drifts are found. The kinetic stability is discussed. Suggestions for creating these states in experiments and computer simulations are offered.

  2. Comparison of Dam Breach Parameter Estimators

    DTIC Science & Technology

    2008-01-01

    ...of the methods, when used in the HEC-RAS simulation model, produced comparable results. The methods tested suggest use of ...characteristics of a dam breach, use of those parameters within the unsteady flow routing model HEC-RAS, and the computation and display of the resulting... implementation of these breach parameters in ...

  3. Density functional theory calculation of refractive indices of liquid-forming silicon oil compounds

    NASA Astrophysics Data System (ADS)

    Lee, Sanghun; Park, Sung Soo; Hagelberg, Frank

    2012-02-01

    A combination of quantum chemical calculation and molecular dynamics simulation is applied to compute refractive indices of liquid-forming silicon oils. The densities of these species are obtained from molecular dynamics simulations based on the NPT ensemble while the molecular polarizabilities are evaluated by density functional theory. This procedure is shown to yield results well compatible with available experimental data, suggesting that it represents a robust and economic route for determining the refractive indices of liquid-forming organic complexes containing silicon.
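
    The step that combines the MD-derived density with the DFT-derived polarizability is commonly done with the Lorentz-Lorenz relation; a minimal Python sketch follows. The density, molar mass, and polarizability values are placeholders, not data from the record.

        import numpy as np

        N_A = 6.02214076e23    # Avogadro constant, 1/mol

        def refractive_index(rho_g_cm3, molar_mass_g_mol, alpha_cm3):
            """Lorentz-Lorenz: (n^2 - 1)/(n^2 + 2) = (4*pi/3) * N * alpha (CGS units)."""
            N = rho_g_cm3 * N_A / molar_mass_g_mol    # molecules per cm^3
            L = 4.0 * np.pi * N * alpha_cm3 / 3.0
            return np.sqrt((1.0 + 2.0 * L) / (1.0 - L))

        # Placeholder inputs: density from an NPT run, polarizability from DFT.
        print(refractive_index(0.96, 236.5, 3.0e-23))    # ~1.53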

  4. Computing the binding affinity of Zn2+ in human carbonic anhydrase II on the basis of all-atom molecular dynamics simulations.

    NASA Astrophysics Data System (ADS)

    Wambo, Thierry; Rodriguez, Roberto

    Human carbonic anhydrase II (hCAII) is a metalloenzyme with a zinc cation at its binding site. The presence of the zinc turns the protein into an efficient enzyme which catalyzes the reversible hydration of carbon dioxide into the bicarbonate anion. Available X-ray structures of apo-hCAII and holo-hCAII show no significant differences in the overall structure of these proteins. What difference, if any, is there between the structures of the hydrated apo-hCAII and the holo form? How can we use computer simulation to efficiently compute the binding affinity of zinc to hCAII? We will present a scheme developed to compute the binding affinity of the zinc cation to hCAII on the basis of all-atom molecular dynamics simulation, where zinc is represented as a point charge and the CHARMM36 force field is used for running the dynamics of the system. Our computed binding affinity of the cation to hCAII is in good agreement with experiment, within the margin of error, while a look at the dynamics of the binding site suggests that in the absence of the zinc there is a re-organization of the nearby histidine residues, which adopt a new distinct configuration. The authors are thankful for the NIH support through Grants GM084834 and GM060655. They also acknowledge the Texas Advanced Computing Center at the University of Texas at Austin for the supercomputing time. They thank Dr. Liao Chen for his comments.

  5. Comparative Evaluation of a Four-Implant-Supported Polyetherketoneketone Framework Prosthesis: A Three-Dimensional Finite Element Analysis Based on Cone Beam Computed Tomography and Computer-Aided Design.

    PubMed

    Lee, Ki-Sun; Shin, Sang-Wan; Lee, Sang-Pyo; Kim, Jong-Eun; Kim, Jee-Hwan; Lee, Jeong-Yol

    The purpose of this pilot study was to evaluate and compare polyetherketoneketone (PEKK) with different framework materials for implant-supported prostheses by means of a three-dimensional finite element analysis (3D-FEA) based on cone beam computed tomography (CBCT) and computer-aided design (CAD) data. A geometric model that consisted of four maxillary implants supporting a prosthesis framework was constructed from CBCT and CAD data of a treated patient. Three different materials (zirconia, titanium, and PEKK) were selected, and their material properties were simulated using FEA software in the generated geometric model. In the PEKK framework (i.e., low elastic modulus) group, the stress transferred to the implant and simulated adjacent tissue was reduced when compressive stress was dominant, but increased when tensile stress was dominant. This study suggests that the shock-absorbing effects of a resilient implant-supported framework are limited in some areas and that rigid framework material shows a favorable stress distribution and safety of overall components of the prosthesis.

  6. Computer simulation of two-dimensional unsteady flows in estuaries and embayments by the method of characteristics : basic theory and the formulation of the numerical method

    USGS Publications Warehouse

    Lai, Chintu

    1977-01-01

    Two-dimensional unsteady flows of homogeneous density in estuaries and embayments can be described by hyperbolic, quasi-linear partial differential equations involving three dependent and three independent variables. A linear combination of these equations leads to a parametric equation of characteristic form, which consists of two parts: total differentiation along the bicharacteristics and partial differentiation in space. For its numerical solution, the specified-time-interval scheme has been used. The unknown, partial space-derivative terms can be eliminated first by suitable combinations of difference equations, converted from the corresponding differential forms and written along four selected bicharacteristics and a streamline. Other unknowns are thus made solvable from the known variables on the current time plane. The computation is carried to the second-order accuracy by using the trapezoidal rule of integration. Means to handle complex boundary conditions are developed for practical application. Computer programs have been written and a mathematical model has been constructed for flow simulation. The favorable computer outputs suggest that further exploration and development of the model are worthwhile. (Woodard-USGS)

  7. Micro-scale finite element modeling of ultrasound propagation in aluminum trabecular bone-mimicking phantoms: A comparison between numerical simulation and experimental results.

    PubMed

    Vafaeian, B; Le, L H; Tran, T N H T; El-Rich, M; El-Bialy, T; Adeeb, S

    2016-05-01

    The present study investigated the accuracy of micro-scale finite element modeling for simulating broadband ultrasound propagation in water-saturated trabecular bone-mimicking phantoms. To this end, five commercially manufactured aluminum foam samples as trabecular bone-mimicking phantoms were utilized for ultrasonic immersion through-transmission experiments. Based on micro-computed tomography images of the same physical samples, three-dimensional high-resolution computational samples were generated to be implemented in the micro-scale finite element models. The finite element models employed the standard Galerkin finite element method (FEM) in the time domain to simulate the ultrasonic experiments. The numerical simulations did not include energy dissipative mechanisms of ultrasonic attenuation; however, they expectedly simulated reflection, refraction, scattering, and wave mode conversion. The accuracy of the finite element simulations was evaluated by comparing the simulated ultrasonic attenuation and velocity with the experimental data. The maximum and the average relative errors between the experimental and simulated attenuation coefficients in the frequency range of 0.6-1.4 MHz were 17% and 6%, respectively. Moreover, the simulations closely predicted the time-of-flight based velocities and the phase velocities of ultrasound, with maximum errors of 20 m/s and 11 m/s, respectively. The results of this study strongly suggest that micro-scale finite element modeling can effectively simulate broadband ultrasound propagation in water-saturated trabecular bone-mimicking structures. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Estimation of electrical conductivity distribution within the human head from magnetic flux density measurement.

    PubMed

    Gao, Nuo; Zhu, S A; He, Bin

    2005-06-07

    We have developed a new algorithm for magnetic resonance electrical impedance tomography (MREIT), which uses only one component of the magnetic flux density to reconstruct the electrical conductivity distribution within the body. The radial basis function (RBF) network and simplex method are used in the present approach to estimate the conductivity distribution by minimizing the errors between the 'measured' and model-predicted magnetic flux densities. Computer simulations were conducted in a realistic-geometry head model to test the feasibility of the proposed approach. Single-variable and three-variable simulations were performed to estimate the brain-skull conductivity ratio and the conductivity values of the brain, skull and scalp layers. When SNR = 15 for magnetic flux density measurements with the target skull-to-brain conductivity ratio being 1/15, the relative error (RE) between the target and estimated conductivity was 0.0737 +/- 0.0746 in the single-variable simulations. In the three-variable simulations, the RE was 0.1676 +/- 0.0317. Effects of electrode position uncertainty were also assessed by computer simulations. The present promising results suggest the feasibility of estimating important conductivity values within the head from noninvasive magnetic flux density measurements.
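
    The estimation step, minimizing the misfit between 'measured' and model-predicted flux densities, can be sketched as below with SciPy's Nelder-Mead simplex. The linear toy forward model stands in for the realistic-geometry head model, and all conductivity values are illustrative; the record's approach additionally employs an RBF network.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        true_sigma = np.array([0.33, 0.022, 0.33])   # brain, skull, scalp (S/m), assumed

        def forward_Bz(sigma):
            # Stand-in for the FEM forward model mapping conductivities to one
            # component of the magnetic flux density at three measurement points.
            A = np.array([[1.0, 0.4, 0.1], [0.2, 1.5, 0.3], [0.5, 0.2, 0.9]])
            return A @ np.log(sigma)

        measured = forward_Bz(true_sigma) + 0.01 * rng.standard_normal(3)  # noisy "data"

        def misfit(sigma):
            return np.sum((forward_Bz(np.abs(sigma)) - measured) ** 2)

        fit = minimize(misfit, x0=np.array([0.2, 0.05, 0.2]), method="Nelder-Mead")
        print(fit.x)    # recovered conductivities, close to true_sigma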

  9. Comparison of optimization algorithms in intensity-modulated radiation therapy planning

    NASA Astrophysics Data System (ADS)

    Kendrick, Rachel

    Intensity-modulated radiation therapy is used to better conform the radiation dose to the target, which includes avoiding healthy tissue. Planning programs employ optimization methods to search for the best fluence of each photon beam, and therefore to create the best treatment plan. The Computational Environment for Radiotherapy Research (CERR), a program written in MATLAB, was used to examine some commonly used algorithms for one 5-beam plan. The algorithms include the genetic algorithm, quadratic programming, pattern search, constrained nonlinear optimization, simulated annealing, the optimization method used in Varian Eclipse™, and some hybrids of these. Quadratic programming, simulated annealing, and a quadratic/simulated annealing hybrid were also separately compared using different prescription doses. The results of each dose-volume histogram as well as the visual dose color wash were used to compare the plans. CERR's built-in quadratic programming provided the best overall plan, but avoidance of the organ-at-risk was rivaled by other programs. Hybrids of quadratic programming with some of these algorithms suggest the possibility of better planning programs, as shown by the improved quadratic/simulated annealing plan when compared to the simulated annealing algorithm alone. Further experimentation will be done to improve cost functions and computational time.

  10. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    PubMed

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). Computer screen-based simulation appears to be effective in preparing learners to use high-fidelity patient simulators, which present simulations that are closer to real-life situations.

  11. Structure and dynamics of aqueous solutions from PBE-based first-principles molecular dynamics simulations.

    PubMed

    Pham, Tuan Anh; Ogitsu, Tadashi; Lau, Edmond Y; Schwegler, Eric

    2016-10-21

    Establishing an accurate and predictive computational framework for the description of complex aqueous solutions is an ongoing challenge for density functional theory based first-principles molecular dynamics (FPMD) simulations. In this context, important advances have been made in recent years, including the development of sophisticated exchange-correlation functionals. On the other hand, simulations based on simple generalized gradient approximation (GGA) functionals remain an active field, particularly in the study of complex aqueous solutions due to a good balance between the accuracy, computational expense, and the applicability to a wide range of systems. Such simulations are often performed at elevated temperatures to artificially "correct" for GGA inaccuracies in the description of liquid water; however, a detailed understanding of how the choice of temperature affects the structure and dynamics of other components, such as solvated ions, is largely unknown. To address this question, we carried out a series of FPMD simulations at temperatures ranging from 300 to 460 K for liquid water and three representative aqueous solutions containing solvated Na⁺, K⁺, and Cl⁻ ions. We show that simulations at 390-400 K with the Perdew-Burke-Ernzerhof (PBE) exchange-correlation functional yield water structure and dynamics in good agreement with experiments at ambient conditions. Simultaneously, this computational setup provides ion solvation structures and ion effects on water dynamics consistent with experiments. Our results suggest that an elevated temperature around 390-400 K with the PBE functional can be used for the description of structural and dynamical properties of liquid water and complex solutions with solvated ions at ambient conditions.

  12. Using Simulation to Examine the Effect of Physician Heterogeneity on the Operational Efficiency of an Overcrowded Hospital Emergency Department

    NASA Astrophysics Data System (ADS)

    Kuo, Y.-H.; Leung, J. M. Y.; Graham, C. A.

    2015-05-01

    In this paper, we present a case study of modelling and analyzing the patient flow of a hospital emergency department in Hong Kong. The emergency department is facing the challenge of overcrowding, and the patients there usually experience a long waiting time. Our project team was requested by a senior consultant of the emergency department to analyze the patient flow and provide a decision support tool to help improve their operations. We adopt a simulation approach to mimic their daily operations. With the simulation model, we conduct a computational study to examine the effect of physician heterogeneity on the emergency department performance. We found that physician heterogeneity has a great impact on operational efficiency and thus should be considered when developing simulation models. Our computational results show that, with the same average service rate among the physicians, variation in the rates can improve the overcrowding situation. This suggests that emergency departments may consider having some efficient physicians to speed up the overall service rate in return for more time for patients who need extra medical care.
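
    A minimal discrete-event sketch of the effect studied here: two physician rosters with the same mean service rate but different spread, compared by mean patient wait in a first-come-first-served multi-server queue. Arrival and service rates are assumptions, not the department's data.

        import heapq
        import numpy as np

        def simulate_ed(service_rates, lam=1.7, n_patients=20000, seed=0):
            """Mean wait when each arriving patient takes the earliest-free physician."""
            rng = np.random.default_rng(seed)
            arrivals = np.cumsum(rng.exponential(1.0 / lam, n_patients))
            free_at = [(0.0, i) for i in range(len(service_rates))]  # (time free, doc)
            heapq.heapify(free_at)
            waits = []
            for t in arrivals:
                free_time, doc = heapq.heappop(free_at)
                start = max(t, free_time)
                waits.append(start - t)
                heapq.heappush(free_at,
                               (start + rng.exponential(1.0 / service_rates[doc]), doc))
            return float(np.mean(waits))

        # Same average rate (1.0 patients per unit time per physician), different spread:
        print(simulate_ed([1.0, 1.0]))    # homogeneous roster
        print(simulate_ed([1.4, 0.6]))    # heterogeneous roster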

  13. The nature of undergraduates' conceptual understanding of oxygen transport and utilization in humans: Can cardiopulmonary simulation software enhance learning of propositional knowledge and/or diagnose alternative conceptions in novices and intermediates?

    NASA Astrophysics Data System (ADS)

    Wissing, Dennis Robert

    The purpose of this research was to explore undergraduates' conceptual development for oxygen transport and utilization, as a component of a cardiopulmonary physiology and advanced respiratory care course in the allied health program. This exploration focused on the students' development of knowledge and the presence of alternative conceptions prior to, during, and after completing cardiopulmonary physiology and advanced respiratory care courses. Using the simulation program SimBioSys™ (Samsel, 1994), student-participants completed a series of laboratory exercises focusing on cardiopulmonary disease states. This study examined data gathered from: (1) a novice group receiving the simulation program prior to instruction, (2) a novice group that experienced the simulation program following course completion in cardiopulmonary physiology, and (3) an intermediate group who experienced the simulation program following completion of formal education in Respiratory Care. This research was based on the theory of Human Constructivism as described by Mintzes, Wandersee, and Novak (1997). Data-gathering techniques were based on theories supported by Novak (1984), Wandersee (1997), and Chi (1997). Data were generated by exams, interviews, verbal analysis (Chi, 1997), and concept mapping. Results suggest that simulation may be an effective instructional method for assessing conceptual development and diagnosing alternative conceptions in undergraduates enrolled in a cardiopulmonary science program. Use of simulation in conjunction with clinical interviews and concept mapping may assist in verifying gaps in learning and conceptual knowledge. This study found only limited evidence to support the use of computer simulation prior to lecture to augment learning. However, it was demonstrated that students' prelecture experience with the computer simulation helped the instructor assess what the learner knew so he or she could be taught accordingly. In addition, use of computer simulation after formal instruction was shown to be useful in aiding students identified by the instructor as needing remediation.

  14. Exact and efficient simulation of concordant computation

    NASA Astrophysics Data System (ADS)

    Cable, Hugo; Browne, Daniel E.

    2015-11-01

    Concordant computation is a circuit-based model of quantum computation for mixed states that assumes that all correlations within the register are discord-free (i.e. the correlations are essentially classical) at every step of the computation. The question of whether concordant computation always admits efficient simulation by a classical computer was first considered by Eastin in arXiv:quant-ph/1006.4402v1, where an answer in the affirmative was given for circuits consisting only of one- and two-qubit gates. Building on this work, we develop the theory of classical simulation of concordant computation. We present a new framework for understanding such computations, argue that a larger class of concordant computations admit efficient simulation, and provide alternative proofs for the main results of arXiv:quant-ph/1006.4402v1 with an emphasis on the exactness of simulation which is crucial for this model. We include detailed analysis of the arithmetic complexity for solving equations in the simulation, as well as extensions to larger gates and qudits. We explore the limitations of our approach, and discuss the challenges faced in developing efficient classical simulation algorithms for all concordant computations.

  15. Maintenance of ventricular fibrillation in heterogeneous ventricle.

    PubMed

    Arevalo, Hamenegild J; Trayanova, Natalia A

    2006-01-01

    Although ventricular fibrillation (VF) is the prevalent cause of sudden cardiac death, the mechanisms that underlie VF remain elusive. One possible explanation is that VF is driven by a single robust rotor that is the source of wavefronts that break up due to functional heterogeneities. Previous 2D computer simulations have proposed that a heterogeneity in background potassium current (IK1) can serve as the substrate for the formation of mother rotor activity. This study incorporates IK1 heterogeneity between the left and right ventricle in a realistic 3D rabbit ventricle model to examine its effects on the organization of VF. Computer simulations show that the IK1 heterogeneity contributes to the initiation and maintenance of VF by providing regions of different refractoriness which serve as sites of wave break and rotor formation. A single rotor that drives the fibrillatory activity in the ventricle is not found in this study. Instead, multiple sites of reentry are recorded throughout the ventricle. Calculation of dominant frequencies for each myocardial node yields no significant difference between the dominant frequency of the LV and the RV. The 3D computer simulations suggest that IK1 spatial heterogeneity alone cannot lead to the formation of a stable rotor.
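
    The per-node dominant frequency mentioned here is typically taken as the peak of the power spectrum of the node's transmembrane potential; a sketch on a synthetic trace follows (the 12 Hz rotor-like signal and sampling rate are illustrative).

        import numpy as np

        def dominant_frequency(vm, dt):
            """Peak of the power spectrum of one node's transmembrane potential."""
            vm = vm - vm.mean()                        # remove the DC component
            power = np.abs(np.fft.rfft(vm)) ** 2
            freqs = np.fft.rfftfreq(len(vm), d=dt)
            return freqs[np.argmax(power[1:]) + 1]     # skip the zero-frequency bin

        t = np.arange(0.0, 4.0, 0.001)                 # 4 s sampled at 1 kHz
        vm = (np.sin(2 * np.pi * 12.0 * t)
              + 0.3 * np.random.default_rng(0).standard_normal(t.size))
        print(dominant_frequency(vm, 0.001))           # ~12.0 Hz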

  16. Identification of a Novel Class of BRD4 Inhibitors by Computational Screening and Binding Simulations

    PubMed Central

    2017-01-01

    Computational screening is a method to prioritize small-molecule compounds based on the structural and biochemical attributes built from ligand and target information. Previously, we have developed a scalable virtual screening workflow to identify novel multitarget kinase/bromodomain inhibitors. In the current study, we identified several novel N-[3-(2-oxo-pyrrolidinyl)phenyl]-benzenesulfonamide derivatives that scored highly in our ensemble docking protocol. We quantified the binding affinity of these compounds for BRD4(BD1) biochemically and generated cocrystal structures, which were deposited in the Protein Data Bank. As the docking poses obtained in the virtual screening pipeline did not align with the experimental cocrystal structures, we evaluated the predictions of their precise binding modes by performing molecular dynamics (MD) simulations. The MD simulations closely reproduced the experimentally observed protein–ligand cocrystal binding conformations and interactions for all compounds. These results suggest a computational workflow to generate experimental-quality protein–ligand binding models, overcoming limitations of docking results due to receptor flexibility and incomplete sampling, as a useful starting point for the structure-based lead optimization of novel BRD4(BD1) inhibitors. PMID:28884163

  17. Expert knowledge elicitation using computer simulation: the organization of frail elderly case management as an illustration.

    PubMed

    Chiêm, Jean-Christophe; Van Durme, Thérèse; Vandendorpe, Florence; Schmitz, Olivier; Speybroeck, Niko; Cès, Sophie; Macq, Jean

    2014-08-01

    Various elderly case management projects have been implemented in Belgium. This type of long-term health care intervention involves contextual factors and human interactions. These underlying complex mechanisms can be usefully informed with field experts' knowledge, which are hard to make explicit. However, computer simulation has been suggested as one possible method of overcoming the difficulty of articulating such elicited qualitative views. A simulation model of case management was designed using an agent-based methodology, based on the initial qualitative research material. Variables and rules of interaction were formulated into a simple conceptual framework. This model has been implemented and was used as a support for a structured discussion with experts in case management. The rigorous formulation provided by the agent-based methodology clarified the descriptions of the interventions and the problems encountered regarding: the diverse network topologies of health care actors in the project; the adaptation time required by the intervention; the communication between the health care actors; the institutional context; the organization of the care; and the role of the case manager and his or her personal ability to interpret the informal demands of the frail older person. The simulation model should be seen primarily as a tool for thinking and learning. A number of insights were gained as part of a valuable cognitive process. Computer simulation supporting field experts' elicitation can lead to better-informed decisions in the organization of complex health care interventions. © 2013 John Wiley & Sons, Ltd.

  18. Computational Design of a Thermostable Mutant of Cocaine Esterase via Molecular Dynamics Simulations

    PubMed Central

    Huang, Xiaoqin; Gao, Daquan; Zhan, Chang-Guo

    2015-01-01

    Cocaine esterase (CocE) has been known as the most efficient native enzyme for metabolizing the naturally occurring cocaine. A major obstacle to the clinical application of CocE is the thermoinstability of native CocE with a half-life of only ~11 min at physiological temperature (37°C). It is highly desirable to develop a thermostable mutant of CocE for therapeutic treatment of cocaine overdose and addiction. To establish a structure-thermostability relationship, we carried out molecular dynamics (MD) simulations at 400 K on wild-type CocE and previously known thermostable mutants, demonstrating that the thermostability of the active form of the enzyme correlates with the fluctuation (characterized as the RMSD and RMSF of atomic positions) of the catalytic residues (Y44, S117, Y118, H287, and D259) in the simulated enzyme. In light of the structure-thermostability correlation, further computational modeling including MD simulations at 400 K predicted that the active site structure of the L169K mutant should be more thermostable. The prediction has been confirmed by wet experimental tests showing that the active form of the L169K mutant had a half-life of 570 min at 37°C, which is significantly longer than those of the wild-type and previously known thermostable mutants. The encouraging outcome suggests that the high-temperature MD simulations and the structure-thermostability relationship may be considered as a valuable tool for computational design of thermostable mutants of an enzyme. PMID:21373712
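
    The fluctuation metric behind the reported structure-thermostability correlation, the RMSF of atomic positions, is simple to compute once a trajectory has been aligned to a reference; the toy trajectory below is illustrative, and frame alignment is assumed to have been done already.

        import numpy as np

        def rmsf(traj):
            """Per-atom RMSF from an aligned trajectory of shape (n_frames, n_atoms, 3)."""
            mean_pos = traj.mean(axis=0)
            return np.sqrt(((traj - mean_pos) ** 2).sum(axis=2).mean(axis=0))

        # Toy data: 100 frames of 5 "catalytic residue" atoms jittering around fixed sites
        rng = np.random.default_rng(0)
        ref = rng.random((5, 3)) * 10.0
        traj = ref + 0.5 * rng.standard_normal((100, 5, 3))
        print(rmsf(traj))    # larger values would indicate a less stable active site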

  19. Computer simulations of disordering kinetics in irradiated intermetallic compounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spaczer, M.; Caro, A.; Victoria, M.

    1994-11-01

    Molecular-dynamics computer simulations of collision cascades in intermetallic Cu₃Au, Ni₃Al, and NiAl have been performed to study the nature of the disordering processes in the collision cascade. The choice of these systems was suggested by the quite accurate description of the thermodynamic properties obtained using embedded-atom-type potentials. Since melting occurs in the core of the cascades, interesting effects appear as a result of the superposition of the loss (and subsequent recovery) of the crystalline order and the evolution of the chemical order, both processes being developed on different time scales. In our previous simulations on Ni₃Al and Cu₃Au [T. Diaz de la Rubia, A. Caro, and M. Spaczer, Phys. Rev. B 47, 11483 (1993)] we found a significant difference between the time evolution of the chemical short-range order (SRO) and the crystalline order in the cascade core for both alloys, namely the complete loss of the crystalline structure but only partial chemical disordering. Recent computer simulations in NiAl show the same phenomena. To understand these features we study the liquid phase of these three alloys and present simulation results concerning the dynamical melting of small samples, examining the atomic mobility, the relaxation time, and the saturation value of the chemical short-range order. An analytic model for the time evolution of the SRO is given.

  20. Analysis of the role of the Spitzenkörper in fungal morphogenesis by computer simulation of apical branching in Aspergillus niger

    PubMed Central

    Reynaga-Peña, Cristina G.; Gierz, Gerhard; Bartnicki-Garcia, Salomon

    1997-01-01

    High-resolution video microscopy, image analysis, and computer simulation were used to study the role of the Spitzenkörper (Spk) in apical branching of ramosa-1, a temperature-sensitive mutant of Aspergillus niger. A shift to the restrictive temperature led to a cytoplasmic contraction that destabilized the Spk, causing its disappearance. After a short transition period, new Spk appeared where the two incipient apical branches emerged. Changes in cell shape, growth rate, and Spk position were recorded and transferred to the fungus simulator program to test the hypothesis that the Spk functions as a vesicle supply center (VSC). The simulation faithfully duplicated the elongation of the main hypha and the two apical branches. Elongating hyphae exhibited the growth pattern described by the hyphoid equation. During the transition phase, when no Spk was visible, the growth pattern was nonhyphoid, with consecutive periods of isometric and asymmetric expansion; the apex became enlarged and blunt before the apical branches emerged. Video microscopy images suggested that the branch Spk were formed anew by gradual condensation of vesicle clouds. Simulation exercises where the VSC was split into two new VSCs failed to produce realistic shapes, thus supporting the notion that the branch Spk did not originate by division of the original Spk. The best computer simulation of apical branching morphogenesis included simulations of the ontogeny of branch Spk via condensation of vesicle clouds. This study supports the hypothesis that the Spk plays a major role in hyphal morphogenesis by operating as a VSC—i.e., by regulating the traffic of wall-building vesicles in the manner predicted by the hyphoid model. PMID:9256441

  1. Adaptive screening for depression--recalibration of an item bank for the assessment of depression in persons with mental and somatic diseases and evaluation in a simulated computer-adaptive test environment.

    PubMed

    Forkmann, Thomas; Kroehne, Ulf; Wirtz, Markus; Norra, Christine; Baumeister, Harald; Gauggel, Siegfried; Elhan, Atilla Halil; Tennant, Alan; Boecker, Maren

    2013-11-01

    This study conducted a simulation of computer-adaptive testing based on the Aachen Depression Item Bank (ADIB), which was developed for the assessment of depression in persons with somatic diseases. Prior to computer-adaptive test simulation, the ADIB was newly calibrated. Recalibration was performed in a sample of 161 patients treated for a depressive syndrome, 103 patients from cardiology, and 103 patients from otorhinolaryngology (mean age 44.1, SD=14.0; 44.7% female) and was cross-validated in a sample of 117 patients undergoing rehabilitation for cardiac diseases (mean age 58.4, SD=10.5; 24.8% women). Unidimensionality of the item bank was checked and a Rasch analysis was performed that evaluated local dependency (LD), differential item functioning (DIF), item fit, and reliability. CAT simulation was conducted with the total sample and additional simulated data. Recalibration resulted in a strictly unidimensional item bank with 36 items, showing good Rasch model fit (item fit residuals<|2.5|) and no DIF or LD. CAT simulation revealed that 13 items on average were necessary to estimate depression in the range of -2 and +2 logits when terminating at SE≤0.32, and 4 items if using SE≤0.50. Receiver Operating Characteristic analysis showed that θ estimates based on the CAT algorithm have good criterion validity with regard to depression diagnoses (Area Under the Curve≥.78 for all cut-off criteria). The recalibration of the ADIB succeeded, and the simulation studies conducted suggest that it has good screening performance in the samples investigated and that it may reasonably add to the improvement of depression assessment. © 2013.
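
    The CAT loop described here (maximum-information item selection with an SE-based stopping rule) can be sketched as follows. This toy version uses dichotomous Rasch items and a grid-based EAP estimate, whereas the ADIB items are polytomous, which is why the paper reaches its criterion with fewer items; the difficulties and the simulated true score are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        bank = rng.uniform(-2.0, 2.0, 36)      # Rasch item difficulties (illustrative)
        theta_true = 0.8                       # simulated examinee
        grid = np.linspace(-4.0, 4.0, 161)
        post = np.exp(-0.5 * grid ** 2)        # standard-normal prior on theta

        def p_correct(theta, b):               # Rasch model
            return 1.0 / (1.0 + np.exp(-(theta - b)))

        used = []
        while True:
            theta_hat = np.sum(grid * post) / post.sum()              # EAP estimate
            se = np.sqrt(np.sum((grid - theta_hat) ** 2 * post) / post.sum())
            if se <= 0.50 or len(used) == len(bank):                  # stopping rule
                break
            info = p_correct(theta_hat, bank) * (1.0 - p_correct(theta_hat, bank))
            info[used] = -1.0                                         # no item reuse
            item = int(np.argmax(info))                               # max information
            used.append(item)
            answered = rng.random() < p_correct(theta_true, bank[item])
            post = post * (p_correct(grid, bank[item]) if answered
                           else 1.0 - p_correct(grid, bank[item]))

        print(len(used), float(theta_hat), float(se))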

  2. Can one trust quantum simulators?

    NASA Astrophysics Data System (ADS)

    Hauke, Philipp; Cucchietti, Fernando M.; Tagliacozzo, Luca; Deutsch, Ivan; Lewenstein, Maciej

    2012-08-01

    Various fundamental phenomena of strongly correlated quantum systems such as high-Tc superconductivity, the fractional quantum-Hall effect and quark confinement are still awaiting a universally accepted explanation. The main obstacle is the computational complexity of solving even the most simplified theoretical models which are designed to capture the relevant quantum correlations of the many-body system of interest. In his seminal 1982 paper (Feynman 1982 Int. J. Theor. Phys. 21 467), Richard Feynman suggested that such models might be solved by ‘simulation’ with a new type of computer whose constituent parts are effectively governed by a desired quantum many-body dynamics. Measurements on this engineered machine, now known as a ‘quantum simulator,’ would reveal some unknown or difficult to compute properties of a model of interest. We argue that a useful quantum simulator must satisfy four conditions: relevance, controllability, reliability and efficiency. We review the current state of the art of digital and analog quantum simulators. Whereas so far the majority of the focus, both theoretically and experimentally, has been on controllability of relevant models, we emphasize here the need for a careful analysis of reliability and efficiency in the presence of imperfections. We discuss how disorder and noise can impact these conditions, and illustrate our concerns with novel numerical simulations of a paradigmatic example: a disordered quantum spin chain governed by the Ising model in a transverse magnetic field. We find that disorder can decrease the reliability of an analog quantum simulator of this model, although large errors in local observables are introduced only for strong levels of disorder. We conclude that the answer to the question ‘Can we trust quantum simulators?’ is … to some extent.

  3. Using block pulse functions for seismic vibration semi-active control of structures with MR dampers

    NASA Astrophysics Data System (ADS)

    Rahimi Gendeshmin, Saeed; Davarnia, Daniel

    2018-03-01

    This article applies the idea of block pulse (BP) functions to the semi-active control of structures. BP functions provide effective tools for approximating complex problems. The control algorithm applied has a major effect on the performance of the controlled system and on the requirements placed on the control devices, so in control problems it is important to devise an accurate analytical technique with low computational cost. BP functions have proved to be fundamental tools in approximation problems and have been applied in disparate areas over recent decades. This study focuses on employing BP functions in the control algorithm to reduce its computational cost. Magneto-rheological (MR) dampers are well-known semi-active devices that can be used to control the response of civil structures during earthquakes. For validation purposes, numerical simulations of a 5-story shear building frame with MR dampers are presented. The results of the suggested method were compared with those obtained by controlling the frame with the optimal control method based on linear quadratic regulator theory. The simulation results show that the suggested method can be helpful in reducing seismic structural responses and that it achieves acceptable accuracy, in agreement with the optimal control method, at lower computational cost.
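
    Although the record does not spell out the control formulation, the core block-pulse machinery is compact: expansion coefficients are interval averages of a signal, and reconstruction is piecewise constant. A minimal Python sketch (the signal and interval count are illustrative):

        import numpy as np

        def bp_coefficients(f, T, m, n_samples=4000):
            """BP expansion: c_i is the average of f over the i-th of m subintervals."""
            t = np.linspace(0.0, T, n_samples, endpoint=False)
            return f(t).reshape(m, -1).mean(axis=1)   # n_samples divisible by m

        def bp_reconstruct(c, T, t):
            idx = np.minimum((t / T * len(c)).astype(int), len(c) - 1)
            return c[idx]

        # Approximate a decaying, ground-motion-like signal with 32 block pulses
        f = lambda t: np.exp(-0.5 * t) * np.sin(8.0 * t)
        c = bp_coefficients(f, 10.0, 32)
        t = np.linspace(0.0, 10.0, 500)
        print(np.max(np.abs(bp_reconstruct(c, 10.0, t) - f(t))))   # sup-norm error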

  4. Real-time simulation of biological soft tissues: a PGD approach.

    PubMed

    Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F

    2013-05-01

    We introduce here a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at the kilohertz feedback rates necessary for haptic rendering. This approach is based upon the use of proper generalized decomposition (PGD) techniques, a generalization of proper orthogonal decomposition (POD). PGD techniques can be considered as a means of a priori model order reduction and provide a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase in which the results are obtained in real time. Results are provided that show the potential of the proposed technique, together with benchmark tests that show the accuracy of the method. Copyright © 2013 John Wiley & Sons, Ltd.

  5. Real-time simulation of the TF30-P-3 turbofan engine using a hybrid computer

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Bruton, W. M.

    1974-01-01

    A real-time, hybrid-computer simulation of the TF30-P-3 turbofan engine was developed. The simulation was primarily analog in nature but used the digital portion of the hybrid computer to perform bivariate function generation associated with the performance of the engine's rotating components. FORTRAN listings and analog patching diagrams are provided. The hybrid simulation was controlled by a digital computer programmed to simulate the engine's standard hydromechanical control. Both steady-state and dynamic data obtained from the digitally controlled engine simulation are presented. Hybrid simulation data are compared with data obtained from a digital simulation provided by the engine manufacturer. The comparisons indicate that the real-time hybrid simulation adequately matches the baseline digital simulation.

  6. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

    Paper discusses computer simulation as a means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in the near future. Visual, aural, tactile, and kinesthetic effects are used to teach such physical sciences as the dynamics of fluids. Recommends that classrooms in universities, government, and industry be linked to advanced computing centers so that computer simulations can be integrated into the education process.

  7. Launch Site Computer Simulation and its Application to Processes

    NASA Technical Reports Server (NTRS)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  8. Protocols for Handling Messages Between Simulation Computers

    NASA Technical Reports Server (NTRS)

    Balcerowski, John P.; Dunnam, Milton

    2006-01-01

    Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.

  9. Reversible simulation of irreversible computation

    NASA Astrophysics Data System (ADS)

    Li, Ming; Tromp, John; Vitányi, Paul

    1998-09-01

    Computer computations are generally irreversible while the laws of physics are reversible. This mismatch is penalized by, among other things, the generation of excess thermal entropy in the computation. Computing performance has improved to the extent that efficiency degrades unless all algorithms are executed reversibly, for example by a universal reversible simulation of irreversible computations. All known reversible simulations are either space hungry or time hungry. The leanest method was proposed by Bennett and can be analyzed using a simple ‘reversible’ pebble game. The reachable reversible simulation instantaneous descriptions (pebble configurations) of such pebble games are characterized completely. As a corollary we obtain the reversible simulation by Bennett and, moreover, show that it is a space-optimal pebble game. We also introduce irreversible steps and give a theorem on the tradeoff between the number of allowed irreversible steps and the memory gain in the pebble game. In this resource-bounded setting the limited erasing needs to be performed at precise instants during the simulation. The reversible simulation can be modified so that it is applicable also when the simulated computation time is unknown.
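
    Bennett's strategy has a compact recursive statement; the sketch below plays the reversible pebble game for 2**5 simulated steps and reports the move count and the peak number of pebbles. Representing computation steps as integer nodes is an illustrative simplification.

        def pebble(a, b, state, trace):
            """Place a pebble on node b, given a pebble on node a (Bennett's strategy)."""
            if b - a == 1:
                state.add(b); trace.append(len(state)); return
            m = (a + b) // 2
            pebble(a, m, state, trace)      # compute a midpoint checkpoint
            pebble(m, b, state, trace)      # compute the target from the checkpoint
            unpebble(a, m, state, trace)    # reversibly erase the checkpoint

        def unpebble(a, b, state, trace):
            """Remove the pebble on b by rerunning the computation in reverse."""
            if b - a == 1:
                state.remove(b); trace.append(len(state)); return
            m = (a + b) // 2
            pebble(a, m, state, trace)
            unpebble(m, b, state, trace)
            unpebble(a, m, state, trace)

        state, trace = {0}, []
        pebble(0, 2 ** 5, state, trace)     # simulate 32 irreversible steps
        print(len(trace), max(trace))       # 3**5 = 243 moves, peak of 7 pebbles

    The exponential move count against only k+2 simultaneous pebbles (including the input pebble) is exactly the space-time tradeoff the record analyzes.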

  10. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.

    2016-01-01

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the computational parameters of the MC simulation codes GATE, PHITS, and FLUKA, previously established for uniform scanning proton beams, need to be evaluated for spot scanning; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes by using data from a FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm³, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined based on the accuracy of the proton range, the suppression of dose deviation, and the minimization of computational time. Our results indicate that the optimized parameters differ from those for uniform scanning, suggesting that a gold-standard set of computational parameters cannot be fixed across proton therapy applications, since the impact of the parameter settings depends on the irradiation technique. We therefore conclude that parameters must be customized with reference to the optimized parameters of the corresponding irradiation technique in order to achieve artifact-free MC simulation for use in computational experiments and clinical treatments.
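
    A common way to extract the proton range compared in such studies is the distal R80 of the percentage depth dose; the sketch below uses a synthetic Bragg-peak-like curve on the study's 0.5 mm depth grid (the curve is illustrative, not simulation output).

        import numpy as np

        def r80(depth_mm, dose):
            """Depth beyond the Bragg peak where dose falls to 80% of its maximum."""
            i = int(np.argmax(dose))
            d, z = dose[i:], depth_mm[i:]
            target = 0.8 * d[0]
            j = int(np.argmax(d < target))    # first distal bin below 80%
            return z[j - 1] + (target - d[j - 1]) * (z[j] - z[j - 1]) / (d[j] - d[j - 1])

        z = np.arange(0.0, 300.0, 0.5)                  # 0.5 mm depth grid
        dose = np.exp(-((z - 160.0) / 8.0) ** 2)        # synthetic Bragg-like peak
        print(r80(z, dose))                             # ~163.8 mm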

  11. Development of a Computational Chemical Vapor Deposition Model: Applications to Indium Nitride and Dicyanovinylaniline

    NASA Technical Reports Server (NTRS)

    Cardelino, Carlos

    1999-01-01

    A computational chemical vapor deposition (CVD) model is presented, that couples chemical reaction mechanisms with fluid dynamic simulations for vapor deposition experiments. The chemical properties of the systems under investigation are evaluated using quantum, molecular and statistical mechanics models. The fluid dynamic computations are performed using the CFD-ACE program, which can simulate multispecies transport, heat and mass transfer, gas phase chemistry, chemistry of adsorbed species, pulsed reactant flow and variable gravity conditions. Two experimental setups are being studied, in order to fabricate films of: (a) indium nitride (InN) from the gas or surface phase reaction of trimethylindium and ammonia; and (b) 4-(1,1)dicyanovinyl-dimethylaminoaniline (DCVA) by vapor deposition. Modeling of these setups requires knowledge of three groups of properties: thermodynamic properties (heat capacity), transport properties (diffusion, viscosity, and thermal conductivity), and kinetic properties (rate constants for all possible elementary chemical reactions). These properties are evaluated using computational methods whenever experimental data are not available for the species or for the elementary reactions. The chemical vapor deposition model is applied to InN and DCVA. Several possible InN mechanisms are proposed and analyzed. The CVD model simulations of InN show that the deposition rate of InN is more efficient when pulsing chemistry is used under conditions of high pressure and microgravity. An analysis of the chemical properties of DCVA shows that DCVA dimers may form under certain conditions of physical vapor transport. CVD simulations of the DCVA system suggest that deposition of the DCVA dimer may play a small role in the film and crystal growth processes.

  12. Metabolic flexibility of mitochondrial respiratory chain disorders predicted by computer modelling.

    PubMed

    Zieliński, Łukasz P; Smith, Anthony C; Smith, Alexander G; Robinson, Alan J

    2016-11-01

    Mitochondrial respiratory chain dysfunction causes a variety of life-threatening diseases affecting about 1 in 4300 adults. These diseases are genetically heterogeneous, but have the same outcome: reduced activity of mitochondrial respiratory chain complexes causing decreased ATP production and potentially toxic accumulation of metabolites. Severity and tissue specificity of these effects vary between patients by unknown mechanisms, and treatment options are limited. So far, most research has focused on the complexes themselves, and the impact on overall cellular metabolism is largely unclear. To illustrate how computer modelling can be used to better understand the potential impact of these disorders and inspire new research directions and treatments, we simulated them using a computer model of human cardiomyocyte mitochondrial metabolism containing over 300 characterised reactions and transport steps with experimental parameters taken from the literature. Overall, simulations were consistent with patient symptoms, supporting their biological and medical significance. These simulations predicted: complex I deficiencies could be compensated using multiple pathways; complex II deficiencies had less metabolic flexibility due to impacting both the TCA cycle and the respiratory chain; and complex III and IV deficiencies caused greatest decreases in ATP production with metabolic consequences that parallel hypoxia. Our study demonstrates how results from computer models can be compared to a clinical phenotype and used as a tool for hypothesis generation for subsequent experimental testing. These simulations can enhance understanding of dysfunctional mitochondrial metabolism and suggest new avenues for research into treatment of mitochondrial disease and other areas of mitochondrial dysfunction. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  13. Simplified Models for Accelerated Structural Prediction of Conjugated Semiconducting Polymers

    DOE PAGES

    Henry, Michael M.; Jones, Matthew L.; Oosterhout, Stefan D.; ...

    2017-11-08

    We perform molecular dynamics simulations of poly(benzodithiophene-thienopyrrolodione) (BDT-TPD) oligomers in order to evaluate the accuracy with which unoptimized molecular models can predict experimentally characterized morphologies. The predicted morphologies are characterized using simulated grazing-incidence X-ray scattering (GIXS) and compared to the experimental scattering patterns. We find that approximating the aromatic rings in BDT-TPD with rigid bodies, rather than combinations of bond, angle, and dihedral constraints, results in 14% lower computational cost and provides nearly equivalent structural predictions compared to the flexible model case. The predicted glass transition temperature of BDT-TPD (410 ± 32 K) is found to be in agreement with experiments. Predicted morphologies demonstrate short-range structural order due to stacking of the chain backbones (π-π stacking around 3.9 Å), and long-range spatial correlations due to the self-organization of backbone stacks into 'ribbons' (lamellar ordering around 20.9 Å), representing the best computational predictions to date of the structure of complex conjugated oligomers. We find that expensive simulated annealing schedules are not needed to predict experimental structures here, with instantaneous quenches providing nearly equivalent predictions at a fraction of the computational cost of annealing. We therefore suggest utilizing rigid bodies and fast cooling schedules for high-throughput screening studies of semiflexible polymers and oligomers to exploit their significant computational benefits where appropriate.
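
    As a rough illustration of computing a scattering curve from simulated coordinates, a sketch using the isotropic Debye formula (this is not the GIXS pipeline used in the paper, and the coordinates here are random placeholders):

    import numpy as np

    def debye_intensity(q, coords):
        """Isotropic scattered intensity I(q) via the Debye formula
        for N identical point scatterers at positions coords (N x 3)."""
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        qr = q * d
        # sin(qr)/(qr), with the qr -> 0 limit handled explicitly
        sinc = np.where(qr > 1e-12, np.sin(qr) / np.maximum(qr, 1e-12), 1.0)
        return sinc.sum()

    rng = np.random.default_rng(0)
    coords = rng.uniform(0.0, 50.0, size=(200, 3))  # placeholder structure, angstroms
    for q in (2*np.pi/20.9, 2*np.pi/3.9):  # lamellar and pi-stacking peaks noted above
        print(f"q = {q:.3f} 1/A  ->  I(q) = {debye_intensity(q, coords):.1f}")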

  15. Design of Bioprosthetic Aortic Valves using biaxial test data.

    PubMed

    Dabiri, Y; Paulson, K; Tyberg, J; Ronsky, J; Ali, I; Di Martino, E; Narine, K

    2015-01-01

    Bioprosthetic Aortic Valves (BAVs) do not have the serious limitations of mechanical aortic valves in terms of thrombosis. However, the lifetime of BAVs is too short, often requiring repeated surgeries. The lifetime of BAVs might be improved by using computer simulations of the structural behavior of the leaflets. The goal of this study was to develop a numerical model applicable to the optimization of the durability of BAVs. The constitutive equations were derived from biaxial tensile tests. Using a Fung model, stress and strain data were computed from the biaxial test data. SolidWorks was used to develop the geometry of the leaflets, and the ABAQUS finite element software package was used for the finite element calculations. Results showed the model is consistent with experimental observations: reaction forces computed by the model corresponded with experimental measurements when the biaxial test was simulated, and the locations of maximum stress corresponded to the locations where BAV leaflets frequently tear. These results suggest that BAV design can be optimized with respect to durability.
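
    A minimal sketch of the Fung-model step, evaluating second Piola-Kirchhoff stresses from Green-Lagrange strains via W = (c/2)(exp(Q) - 1) with Q = a1*E11^2 + a2*E22^2 + 2*a3*E11*E22 (a standard planar Fung form; the coefficients below are illustrative, not the fitted leaflet values):

    import numpy as np

    def fung_stress(E11, E22, c, a1, a2, a3):
        """Second Piola-Kirchhoff stresses S_ij = dW/dE_ij for a 2D Fung model,
        W = (c/2) * (exp(Q) - 1),  Q = a1*E11^2 + a2*E22^2 + 2*a3*E11*E22."""
        Q = a1*E11**2 + a2*E22**2 + 2*a3*E11*E22
        S11 = c * np.exp(Q) * (a1*E11 + a3*E22)
        S22 = c * np.exp(Q) * (a2*E22 + a3*E11)
        return S11, S22

    # Equibiaxial Green-Lagrange strain sweep with placeholder coefficients:
    for E in (0.05, 0.10, 0.15):
        S11, S22 = fung_stress(E, E, c=10.0, a1=5.0, a2=3.0, a3=0.5)
        print(f"E = {E:.2f}: S11 = {S11:8.3f}, S22 = {S22:8.3f} (kPa, illustrative)")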

  16. Computational Model of Population Dynamics Based on the Cell Cycle and Local Interactions

    NASA Astrophysics Data System (ADS)

    Oprisan, Sorinel Adrian; Oprisan, Ana

    2005-03-01

    Our study bridges cellular (mesoscopic) level interactions and global population (macroscopic) dynamics of carcinoma. The morphological differences and transitions between well- and smoothly-defined benign tumors and tentacular malignant tumors suggest a theoretical analysis of tumor invasion based on the development of mathematical models exhibiting bifurcations of spatial patterns in the density of tumor cells. Our computational model views the most representative and clinically relevant features of oncogenesis as a fight between two distinct sub-systems: the immune system of the host and the neoplastic system. We implemented the neoplastic sub-system using a three-stage cell cycle: active, dormant, and necrotic. The second sub-system consists of cytotoxic active (effector) cells, ECs, with a very broad phenotype ranging from NK cells to CTL cells, macrophages, etc. Based on extensive numerical simulations, we correlated the fractal dimensions for carcinoma, which could be obtained from tumor imaging, with the malignant stage. Our computational model was also able to simulate the effects of surgical, chemotherapeutic, and radiotherapeutic treatments.

  17. A computational model of oxygen delivery by hemoglobin-based oxygen carriers in three-dimensional microvascular networks.

    PubMed

    Tsoukias, Nikolaos M; Goldman, Daniel; Vadapalli, Arjun; Pittman, Roland N; Popel, Aleksander S

    2007-10-21

    A detailed computational model is developed to simulate oxygen transport from a three-dimensional (3D) microvascular network to the surrounding tissue in the presence of hemoglobin-based oxygen carriers. The model accounts for nonlinear O2 consumption, myoglobin-facilitated diffusion and nonlinear oxyhemoglobin dissociation in the RBCs and plasma. It also includes a detailed description of intravascular resistance to O2 transport and is capable of incorporating realistic 3D microvascular network geometries. Simulations in this study were performed using a computer-generated microvascular architecture that mimics morphometric parameters for the hamster cheek pouch retractor muscle. Theoretical results are presented next to corresponding experimental data. Phosphorescence quenching microscopy provided PO2 measurements at the arteriolar and venular ends of capillaries in the hamster retractor muscle before and after isovolemic hemodilution with three different hemodiluents: a non-oxygen-carrying plasma expander and two hemoglobin solutions with different oxygen affinities. Sample results in a microvascular network show an enhancement of diffusive shunting between arterioles, venules and capillaries and a decrease in hemoglobin's effectiveness for tissue oxygenation when its affinity for O2 is decreased. Model simulations suggest that microvascular network anatomy can affect the optimal hemoglobin affinity for reducing tissue hypoxia. O2 transport simulations in realistic representations of microvascular networks should provide a theoretical framework for choosing optimal parameter values in the development of hemoglobin-based blood substitutes.
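
    The affinity effect can be illustrated with the Hill approximation of the oxyhemoglobin dissociation curve, where a lower P50 means higher O2 affinity (the model above uses a more detailed nonlinear description; the P50 and Hill coefficient values below are illustrative):

    import numpy as np

    def hill_saturation(po2, p50, n=2.7):
        """Fractional oxyhemoglobin saturation via the Hill equation."""
        return po2**n / (p50**n + po2**n)

    po2 = np.array([10.0, 20.0, 40.0, 60.0])  # mmHg
    for p50 in (20.0, 30.0):  # lower P50 = higher affinity (illustrative values)
        sats = hill_saturation(po2, p50)
        print(f"P50 = {p50} mmHg:", " ".join(f"{s:.2f}" for s in sats))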

  18. Towards reducing impact-induced brain injury: lessons from a computational study of army and football helmet pads.

    PubMed

    Moss, William C; King, Michael J; Blackman, Eric G

    2014-01-01

    We use computational simulations to compare the impact response of different football and U.S. Army helmet pad materials. We conduct experiments to characterise the material response of different helmet pads. We simulate experimental helmet impact tests performed by the U.S. Army to validate our methods. We then simulate a cylindrical impactor striking different pads. The acceleration history of the impactor is used to calculate the head injury criterion for each pad. We conduct sensitivity studies exploring the effects of pad composition, geometry and material stiffness. We find that (1) the football pad materials do not outperform the currently used military pad material in militarily relevant impact scenarios; (2) optimal material properties for a pad depend on impact energy and (3) thicker pads perform better at all velocities. Although we considered only the isolated response of pad materials, not entire helmet systems, our analysis suggests that by using larger helmet shells with correspondingly thicker pads, impact-induced traumatic brain injury may be reduced.
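
    The head injury criterion (HIC) used to score each pad has a standard windowed-integral definition, HIC = max over [t1, t2] of (t2 - t1) * [(1/(t2 - t1)) * integral of a dt]^2.5 with a in g's; a brute-force sketch over a synthetic acceleration pulse (the pulse shape and 15 ms window are placeholders, not the paper's impact data):

    import numpy as np

    def hic(t, a, max_window=0.015):
        """Brute-force head injury criterion (HIC15 by default).
        t: sample times in s, a: resultant acceleration in g's."""
        # cumulative trapezoidal integral of a(t), for fast window averages
        ca = np.concatenate(([0.0], np.cumsum(0.5*(a[1:]+a[:-1])*np.diff(t))))
        best = 0.0
        for i in range(len(t)):
            for j in range(i+1, len(t)):
                dt = t[j] - t[i]
                if dt > max_window:
                    break
                avg = (ca[j] - ca[i]) / dt
                best = max(best, dt * avg**2.5)
        return best

    # Synthetic half-sine pulse: 80 g peak, 10 ms duration (placeholder impact)
    t = np.linspace(0.0, 0.02, 401)
    a = np.where(t < 0.010, 80.0*np.sin(np.pi*t/0.010), 0.0)
    print(f"HIC15 = {hic(t, a):.0f}")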

  19. Sensitivity of gas filter correlation instrument to variations in optical balance. [computer program simulated the response of the GFCR to changing pollutant levels

    NASA Technical Reports Server (NTRS)

    Orr, H. D., III; Campbell, S. A.

    1975-01-01

    A computer program was used to simulate the response of the Gas Filter Correlation Radiometer (GFCR) to changing pollutant levels of CO, SO2, CH4, and NH3 in two model atmospheres. Positive and negative deviations of τ_α of magnitudes 0.01, 0.1, and 1 percent were imposed upon the simulation and the resulting deviations in inferred concentrations were determined. For the CO and CH4 channels and the higher pressure cell of the NH3 channel, the deviations are less than ±12 percent for deviations in τ_α of ±0.1 percent, but increase to significantly higher values for larger deviations. For the lower pressure cell of NH3 and for SO2, the deviations in inferred concentration begin to rise sharply between 0.01 and 0.1 percent deviation in τ_α, suggesting that tighter control on τ_α may be required for these channels.

  20. Computer Simulations to Study Diffraction Effects of Stacking Faults in Beta-SiC: II. Experimental Verification

    NASA Technical Reports Server (NTRS)

    Pujar, Vijay V.; Cawley, James D.; Levine, S. (Technical Monitor)

    2000-01-01

    Earlier results from computer simulation studies suggest a correlation between the spatial distribution of stacking errors in the Beta-SiC structure and features observed in X-ray diffraction (XRD) patterns of the material. Reported here are experimental results obtained from two types of nominally Beta-SiC specimens, which yield distinct XRD data. These samples were analyzed using high resolution transmission electron microscopy (HRTEM), and the stacking error distribution was directly determined. The HRTEM results compare well to those deduced by matching the XRD data with simulated spectra, confirming the hypothesis that the XRD data indicate not only the presence and density of stacking errors but also their distribution. In addition, the stacking error population in both specimens is related to their synthesis conditions, and the relation appears similar to that developed by others to explain the formation of the corresponding polytypes.

  1. Implementation of Headtracking and 3D Stereo with Unity and VRPN for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Noyes, Matthew A.

    2013-01-01

    This paper explores low-cost hardware and software methods to provide depth cues traditionally absent in monocular displays. The use of a VRPN server in conjunction with a Microsoft Kinect and/or Nintendo Wiimote to provide head tracking information to a Unity application, and NVIDIA 3D Vision for retinal disparity support, is discussed. Methods are suggested to implement this technology with NASA's EDGE simulation graphics package, along with potential caveats. Finally, future applications of this technology to astronaut crew training, particularly when combined with an omnidirectional treadmill for virtual locomotion and NASA's ARGOS system for reduced gravity simulation, are discussed.

  2. Simulation of Foam Impact Effects on Components of the Space Shuttle Thermal Protection System. Chapter 7

    NASA Technical Reports Server (NTRS)

    Fahrenthold, Eric P.; Park, Young-Keun

    2004-01-01

    A series of three-dimensional simulations has been performed to investigate analytically the effect of insulating foam impacts on ceramic tile and reinforced carbon-carbon components of the Space Shuttle thermal protection system. The simulations employed a hybrid particle-finite element method and a parallel code developed for use in spacecraft design applications. The conclusions suggested by the numerical study are in general consistent with experiment. The results emphasize the need for additional material testing work on the dynamic mechanical response of thermal protection system materials, and for additional impact experiments for use in validating computational models of impact effects.

  3. Space-filling designs for computer experiments: A review

    DOE PAGES

    Joseph, V. Roshan

    2016-01-29

    Improving the quality of a product/process using a computer simulator is a much less expensive option than real physical testing. However, simulation using computationally intensive computer models can be time-consuming, and therefore directly optimizing on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used to overcome this problem. This article reviews the experimental designs known as space-filling designs that are suitable for computer simulations. In the review, special emphasis is given to a recently developed space-filling design called the maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
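
    As a rough illustration of a space-filling design search, random Latin hypercube designs scored by the maximin-distance criterion (note the maximum projection design reviewed here optimizes a different, projection-based criterion):

    import numpy as np

    def latin_hypercube(n, d, rng):
        """Random n-point Latin hypercube design in [0, 1]^d."""
        return (rng.random((n, d)) + np.argsort(rng.random((n, d)), axis=0)) / n

    def min_pairwise_distance(x):
        """Maximin criterion: the smallest pairwise distance in the design."""
        d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
        return d[np.triu_indices(len(x), k=1)].min()

    rng = np.random.default_rng(1)
    best = max((latin_hypercube(20, 2, rng) for _ in range(200)),
               key=min_pairwise_distance)
    print(f"best maximin distance over 200 candidates: {min_pairwise_distance(best):.3f}")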

  5. Model structure identification for wastewater treatment simulation based on computational fluid dynamics.

    PubMed

    Alex, J; Kolisch, G; Krause, K

    2002-01-01

    The objective of this project is to use the results of a CFD simulation to automatically, systematically, and reliably generate an appropriate model structure for simulating the biological processes using CSTR activated sludge compartments. Models and dynamic simulation have become important tools for research, and increasingly for the design and optimisation of wastewater treatment plants. Besides the biological models, several cases are reported of applying computational fluid dynamics (CFD) to wastewater treatment plants. One aim of the presented method of deriving model structures from CFD results is to exclude the influence of empirical structure selection on the results of dynamic simulation studies of WWTPs. The second application of the approach is the analysis of badly performing treatment plants where the suspicion arises that poor flow behaviour, such as short-circuit flows, is part of the problem. The method requires as a first step the calculation of the fluid dynamics of the biological treatment step at different loading situations by means of 3-dimensional CFD simulation. This information is then used to automatically generate a suitable model structure for conventional dynamic simulation of the treatment plant, using a number of CSTR modules with a pattern of exchange flows between the tanks. The method is explained in detail and its application to the WWTP Wuppertal Buchenhofen is presented.
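
    In its simplest form, the generated structure is a chain of CSTRs with through-flow plus exchange flows between neighbouring tanks; a toy tracer wash-out simulation under assumed flows and volumes (not the Buchenhofen plant structure):

    import numpy as np

    def simulate_cstr_chain(c0, q, v, q_ex, t_end, dt=1.0):
        """Tracer concentrations in a chain of CSTRs with through-flow q,
        volumes v, and symmetric exchange flows q_ex between neighbours."""
        c = np.array(c0, dtype=float)
        n = len(c)
        for _ in range(int(t_end / dt)):
            dc = np.zeros(n)
            for i in range(n):
                inflow = q * (c[i-1] if i > 0 else 0.0)  # feed is tracer-free
                dc[i] = (inflow - q * c[i]) / v[i]
                if i > 0:
                    dc[i] += q_ex * (c[i-1] - c[i]) / v[i]
                if i < n - 1:
                    dc[i] += q_ex * (c[i+1] - c[i]) / v[i]
            c += dt * dc  # explicit Euler step
        return c

    c = simulate_cstr_chain(c0=[100.0, 0.0, 0.0, 0.0],
                            q=50.0, v=np.full(4, 1000.0), q_ex=10.0, t_end=60.0)
    print("tracer after 60 min:", np.round(c, 2))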

  6. Computer Support of Operator Training: Constructing and Testing a Prototype of a CAL (Computer Aided Learning) Supported Simulation Environment.

    ERIC Educational Resources Information Center

    Zillesen, P. G. van Schaick; And Others

    Instructional feedback given to the learners during computer simulation sessions may be greatly improved by integrating educational computer simulation programs with hypermedia-based computer-assisted learning (CAL) materials. A prototype of a learning environment of this type called BRINE PURIFICATION was developed for use in corporate training…

  7. Population Simulation, AKA: Grahz, Rahbitz and Fawkzes

    NASA Technical Reports Server (NTRS)

    Bangert, Tyler R.

    2008-01-01

    In an effort to give students a more visceral experience of science and instill a deeper working knowledge of concepts, activities that utilize hands-on, laboratory, and simulated experiences are recommended because these activities have a greater impact on student learning, especially for Native American students. Because it is not usually feasible to take large and/or multiple classes of high school science students into the field to count the organisms of a particular species, especially over a long period of time and a large area of an environment, the population simulation presented in this paper was created to help students understand population dynamics by working with a simulated environment in the classroom. Students create an environment and populate the environment with imaginary species. Then, using a sequence of "rules" that allow organisms to eat, reproduce, move, and age, students see how the population of a species changes over time. In particular, students practice collecting data, summarizing information, plotting graphs, and interpreting graphs for such information as carrying capacity, predator-prey relationships, and how specific species factors impact the population and the environment. Students draw conclusions from their results and suggest further research, which may involve changes in simulation parameters, prediction of outcomes, and testing predictions. The population simulation has demonstrated success in the above student activities using a "board game" version of the simulation. A computer version of the population simulation needs more testing, but preliminary runs are promising. A second, more complicated computer simulation will model the same processes and add simulated population genetics.
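
    As a rough sketch of the kind of per-round eat/reproduce/die rules such a simulation applies (the probabilities and carrying capacity below are invented classroom-style placeholders, not the rules from this activity):

    import random

    rng = random.Random(42)

    def year(prey, pred, capacity=500):
        """One round of simplified eat/reproduce/die rules (illustrative)."""
        born = sum(1 for _ in range(prey)
                   if rng.random() < 0.6 * (1 - prey / capacity))
        eaten = sum(1 for _ in range(pred)
                    if rng.random() < min(1.0, prey / capacity))
        prey = max(0, prey + born - eaten)
        died = sum(1 for _ in range(pred) if rng.random() < 0.2)
        pred = max(0, pred + eaten // 3 - died)
        return prey, pred

    prey, pred = 200, 20
    for y in range(10):
        prey, pred = year(prey, pred)
        print(f"year {y+1:2d}: prey = {prey:3d}, predators = {pred:2d}")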

  8. Conifer ovulate cones accumulate pollen principally by simple impaction.

    PubMed

    Cresswell, James E; Henning, Kevin; Pennel, Christophe; Lahoubi, Mohamed; Patrick, Michael A; Young, Phillipe G; Tabor, Gavin R

    2007-11-13

    In many pine species (Family Pinaceae), ovulate cones structurally resemble a turbine, which has been widely interpreted as an adaptation for improving pollination by producing complex aerodynamic effects. We tested the turbine interpretation by quantifying patterns of pollen accumulation on ovulate cones in a wind tunnel and by using simulation models based on computational fluid dynamics. We used computer-aided design and computed tomography to create computational fluid dynamics model cones. We studied three species: Pinus radiata, Pinus sylvestris, and Cedrus libani. Irrespective of the approach or species studied, we found no evidence that turbine-like aerodynamics made a significant contribution to pollen accumulation, which instead occurred primarily by simple impaction. Consequently, we suggest alternative adaptive interpretations for the structure of ovulate cones.

  10. PIC Simulations of Hypersonic Plasma Instabilities

    NASA Astrophysics Data System (ADS)

    Niehoff, D.; Ashour-Abdalla, M.; Niemann, C.; Decyk, V.; Schriver, D.; Clark, E.

    2013-12-01

    The plasma sheaths formed around hypersonic aircraft (Mach number, M > 10) are relatively unexplored and of interest today to both further the development of new technologies and solve long-standing engineering problems. Both laboratory experiments and analytical/numerical modeling are required to advance the understanding of these systems; it is advantageous to perform these tasks in tandem. There has already been some work done to study these plasmas by experiments that create a rapidly expanding plasma through ablation of a target with a laser. In combination with a preformed magnetic field, this configuration leads to a magnetic "bubble" formed behind the front as particles travel at about Mach 30 away from the target. Furthermore, the experiment was able to show the generation of fast electrons which could be due to instabilities on electron scales. To explore this, future experiments will have more accurate diagnostics capable of observing time- and length-scales below typical ion scales, but simulations are a useful tool to explore these plasma conditions theoretically. Particle in Cell (PIC) simulations are necessary when phenomena are expected to be observed at these scales, and also have the advantage of being fully kinetic with no fluid approximations. However, if the scales of the problem are not significantly below the ion scales, then the initialization of the PIC simulation must be very carefully engineered to avoid unnecessary computation and to select the minimum window where structures of interest can be studied. One method of doing this is to seed the simulation with either experiment or ion-scale simulation results. Previous experiments suggest that a useful configuration for studying hypersonic plasma configurations is a ring of particles rapidly expanding transverse to an external magnetic field, which has been simulated on the ion scale with an ion-hybrid code. This suggests that the PIC simulation should have an equivalent configuration; however, modeling a plasma expanding radially in every direction is computationally expensive. In order to reduce the computational expense, we use a radial density profile from the hybrid simulation results to seed a self-consistent PIC simulation in one direction (x), while creating a current in the direction (y) transverse to both the drift velocity and the magnetic field (z) to create the magnetic bubble observed in experiment. The simulation will be run in two spatial dimensions but retain three velocity dimensions, and the results will be used to explore the growth of micro-instabilities present in hypersonic plasmas in the high-density region as it moves through the simulation box. This will still require a significantly large box in order to compare with experiment, as the experiments are being performed over distances of 10^4 λDe and durations of 10^5 ωpe^-1.

  11. Computer Simulation in Mass Emergency and Disaster Response: An Evaluation of Its Effectiveness as a Tool for Demonstrating Strategic Competency in Emergency Department Medical Responders

    ERIC Educational Resources Information Center

    O'Reilly, Daniel J.

    2011-01-01

    This study examined the capability of computer simulation as a tool for assessing the strategic competency of emergency department nurses as they responded to authentically computer simulated biohazard-exposed patient case studies. Thirty registered nurses from a large, urban hospital completed a series of computer-simulated case studies of…

  12. Space Ultrareliable Modular Computer (SUMC) instruction simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1972-01-01

    The design principles, description, functional operation, and recommended expansion and enhancements are presented for the Space Ultrareliable Modular Computer interpretive simulator. Included as appendices are the user's manual, program module descriptions, target instruction descriptions, simulator source program listing, and a sample program printout. In discussing the design and operation of the simulator, the key problems involving host computer independence and target computer architectural scope are brought into focus.

  13. Computer-aided Instructional System for Transmission Line Simulation.

    ERIC Educational Resources Information Center

    Reinhard, Erwin A.; Roth, Charles H., Jr.

    A computer-aided instructional system has been developed which utilizes dynamic computer-controlled graphic displays and which requires student interaction with a computer simulation in an instructional mode. A numerical scheme has been developed for digital simulation of a uniform, distortionless transmission line with resistive terminations and…

  14. Verifying the Simulation Hypothesis via Infinite Nested Universe Simulacrum Loops

    NASA Astrophysics Data System (ADS)

    Sharma, Vikrant

    2017-01-01

    The simulation hypothesis proposes that local reality exists as a simulacrum within a hypothetical computer's dimension. More specifically, Bostrom's trilemma proposes that the number of simulations an advanced 'posthuman' civilization could produce makes the proposition very likely. In this paper a hypothetical method to verify the simulation hypothesis is discussed using infinite regression applied to a new type of infinite loop. Assign dimension n to any computer in our present reality, where dimension signifies the hierarchical level in nested simulations our reality exists in. A computer simulating known reality would be dimension (n-1), and likewise a computer simulating an artificial reality, such as a video game, would be dimension (n +1). In this method, among others, four key assumptions are made about the nature of the original computer dimension n. Summations show that regressing such a reality infinitely will create convergence, implying that the verification of whether local reality is a grand simulation is feasible to detect with adequate compute capability. The action of reaching said convergence point halts the simulation of local reality. Sensitivities to the four assumptions and implications are discussed.

  15. Studies with spike initiators - Linearization by noise allows continuous signal modulation in neural networks

    NASA Technical Reports Server (NTRS)

    Yu, Xiaolong; Lewis, Edwin R.

    1989-01-01

    It is shown that noise can be an important element in the translation of neuronal generator potentials (summed inputs) to neuronal spike trains (outputs), creating or expanding a range of amplitudes over which the spike rate is proportional to the generator potential amplitude. Noise converts the basically nonlinear operation of a spike initiator into a nearly linear modulation process. This linearization effect of noise is examined in a simple intuitive model of a static threshold and in a more realistic computer simulation of a spike initiator based on the Hodgkin-Huxley (HH) model. The results are qualitatively similar; in each case larger noise amplitude results in a larger range of nearly linear modulation. The computer simulation of the HH model with noise shows linear and nonlinear features that were earlier observed in spike data obtained from the VIIIth nerve of the bullfrog. This suggests that these features can be explained in terms of spike initiator properties, and it also suggests that the HH model may be useful for representing basic spike initiator properties in vertebrates.
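
    The static-threshold version of the argument is easy to reproduce: with zero-mean Gaussian noise added to a constant drive, the per-trial firing probability becomes a smooth, nearly linear function of the drive over a range that widens with the noise amplitude. A sketch with illustrative values:

    import numpy as np

    rng = np.random.default_rng(0)

    def firing_rate(drive, sigma, theta=1.0, trials=20000):
        """Fraction of trials where drive + Gaussian noise crosses a static threshold."""
        return np.mean(drive + sigma * rng.standard_normal(trials) > theta)

    for sigma in (0.01, 0.2, 0.5):  # small noise: step-like; larger noise: graded
        rates = [firing_rate(d, sigma) for d in (0.6, 0.8, 1.0, 1.2, 1.4)]
        print(f"sigma = {sigma:4.2f}:", " ".join(f"{r:.2f}" for r in rates))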

  16. Combining patient journey modelling and visual multi-agent computer simulation: a framework to improving knowledge translation in a healthcare environment.

    PubMed

    Curry, Joanne; Fitzgerald, Anneke; Prodan, Ante; Dadich, Ann; Sloan, Terry

    2014-01-01

    This article focuses on a framework that will investigate the integration of two disparate methodologies: patient journey modelling and visual multi-agent simulation, and its impact on the speed and quality of knowledge translation to healthcare stakeholders. Literature describes patient journey modelling and visual simulation as discrete activities. This paper suggests that their combination and their impact on translating knowledge to practitioners are greater than the sum of the two technologies. The test-bed is ambulatory care and the goal is to determine if this approach can improve health services delivery, workflow, and patient outcomes and satisfaction. The multidisciplinary research team is comprised of expertise in patient journey modelling, simulation, and knowledge translation.

  17. Simulation of Tip-Sample Interaction in the Atomic Force Microscope

    NASA Technical Reports Server (NTRS)

    Good, Brian S.; Banerjea, Amitava

    1994-01-01

    Recent simulations of the interaction between planar surfaces and model Atomic Force Microscope (AFM) tips have suggested that there are conditions under which the tip may become unstable and 'avalanche' toward the sample surface. Here we investigate via computer simulation the stability of a variety of model AFM tip configurations with respect to the avalanche transition for a number of fcc metals. We perform Monte-Carlo simulations at room temperature using the Equivalent Crystal Theory (ECT) of Smith and Banerjea. Results are compared with recent experimental results as well as with our earlier work on the avalanche of parallel planar surfaces. Our results on a model single-atom tip are in excellent agreement with recent experiments on tunneling through mechanically-controlled break junctions.

  18. Validation of computer simulation training for esophagogastroduodenoscopy: Pilot study.

    PubMed

    Sedlack, Robert E

    2007-08-01

    Little is known regarding the value of esophagogastroduodenoscopy (EGD) simulators in education. The purpose of the present paper was to validate the use of computer simulation in novice EGD training. In phase 1, expert endoscopists evaluated various aspects of simulation fidelity as compared to live endoscopy. Additionally, computer-recorded performance metrics were assessed by comparing the recorded scores from users of three different experience levels. In phase 2, the transfer of simulation-acquired skills to the clinical setting was assessed in a two-group, randomized pilot study. The setting was a large gastroenterology (GI) fellowship training program; in phase 1, 21 subjects (seven each of expert, intermediate, and novice endoscopists) made up the three experience groups. In phase 2, eight novice GI fellows were involved in the two-group, randomized portion of the study examining the transfer of simulation skills to the clinical setting. During the initial validation phase, each of the 21 subjects completed two standardized EGD scenarios on a computer simulator and their performance scores were recorded for seven parameters. Following this, staff participants completed a questionnaire evaluating various aspects of the simulator's fidelity. Finally, four novice GI fellows were randomly assigned to receive 6 h of simulator-augmented training (SAT group) in EGD prior to beginning 1 month of patient-based EGD training. The remaining fellows received 1 month of patient-based training alone (PBT group). Results for the seven measured performance parameters were compared between the three groups of varying experience using a Wilcoxon rank sum test. The staff's simulator fidelity survey used a 7-point Likert scale (1, very unrealistic; 4, neutral; 7, very realistic) for each of the parameters examined. During the second phase of this study, supervising staff rated both SAT and PBT fellows' patient-based performance daily. Scoring in each skill was completed using a 7-point Likert scale (1, strongly disagree; 4, neutral; 7, strongly agree). Median scores were compared between groups using the Wilcoxon rank sum test. Staff evaluations of fidelity found that only two of the parameters examined (anatomy and scope maneuverability) had a significant degree of realism; the remaining areas were felt to be limited in their fidelity. Of the computer-recorded performance scores, only the novice group could be reliably distinguished from the other two experience groups. In the clinical application phase, the median Patient Discomfort ratings were superior in the PBT group (6; interquartile range [IQR], 5-6) as compared to the SAT group (5; IQR, 4-6; P = 0.015). PBT fellows' ratings were also superior in Sedation, Patient Discomfort, Independence and Competence during various phases of the evaluation. At no point were SAT fellows rated higher than the PBT group in any of the parameters examined. This EGD simulator has limitations in its degree of fidelity and can differentiate only novice endoscopists from other levels of experience. Finally, skills learned during EGD simulation training do not appear to translate well into patient-based endoscopy skills. These findings argue against a key element of validity for the use of this computer simulator in novice EGD training.
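
    For the group comparisons described (7-point Likert ratings compared with a Wilcoxon rank sum test), a minimal sketch with invented ratings using scipy:

    from scipy.stats import ranksums

    # Hypothetical daily Likert ratings (1-7) for the two training groups:
    sat_group = [5, 4, 6, 5, 4, 5, 6, 4]
    pbt_group = [6, 6, 5, 7, 6, 5, 6, 6]

    stat, p = ranksums(sat_group, pbt_group)
    print(f"Wilcoxon rank-sum statistic = {stat:.2f}, p = {p:.3f}")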

  19. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    USDA-ARS?s Scientific Manuscript database

    When the news media talk about models, they could be talking about role models, fashion models, conceptual models like those the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  20. A meta-analysis of outcomes from the use of computer-simulated experiments in science education

    NASA Astrophysics Data System (ADS)

    Lejeune, John Van

    The purpose of this study was to synthesize the findings from existing research on the effects of computer-simulated experiments on students in science education. Results from 40 reports were integrated by the process of meta-analysis to examine the effect of computer-simulated experiments and interactive videodisc simulations on student achievement and attitudes. Findings indicated significant positive differences in both low-level and high-level achievement for students who used computer-simulated experiments and interactive videodisc simulations as compared to students who used more traditional learning activities. No significant differences were found in retention, student attitudes toward the subject, or student attitudes toward the educational method. Based on the findings of this study, computer-simulated experiments and interactive videodisc simulations should be used to enhance students' learning in science, especially in cases where the use of traditional laboratory activities is expensive, dangerous, or impractical.

  1. Computational Investigations of Trichoderma Reesei Cel7A Suggest New Routes for Enzyme Activity Improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckham, G. T.; Payne, C. M.; Bu, L.

    2012-01-01

    The Trichoderma reesei Family 7 cellulase (Cel7A) is a key industrial enzyme in the production of biofuels from lignocellulosic biomass. It is a multi-modular enzyme with a Family 1 carbohydrate-binding module, a flexible O-glycosylated linker, and a large catalytic domain. We have used simulation to elucidate new functions for the three sub-domains, which suggests new routes to increase the activity of this central enzyme. These findings include new roles for glycosylation, which we have shown can be used to tune the binding affinity. We have also examined the structures of the catalytically-active complex of Cel7A and its non-processive counterpart, Cel7B, engaged on cellulose, which suggests allosteric mechanisms involved in chain binding when these cellulases are complexed on cellulose. Our computational results also suggest that product inhibition varies significantly between Cel7A and Cel7B, and we offer a molecular-level explanation for this observation. Finally, we discuss simulations of the absolute and relative binding free energy of cellulose ligands and various mutations along the CD tunnel, which will affect processivity and the ability of Cel7A (and related enzymes) to digest cellulose. These results highlight new considerations in protein engineering for processive and non-processive cellulases for production of lignocellulosic biofuels.

  2. Survival outcomes after radiation therapy for stage III non-small-cell lung cancer after adoption of computed tomography-based simulation.

    PubMed

    Chen, Aileen B; Neville, Bridget A; Sher, David J; Chen, Kun; Schrag, Deborah

    2011-06-10

    Technical studies suggest that computed tomography (CT)-based simulation improves the therapeutic ratio for thoracic radiation therapy (TRT), although few studies have evaluated its use or impact on outcomes. We used the Surveillance, Epidemiology and End Results (SEER)-Medicare linked data to identify CT-based simulation for TRT among Medicare beneficiaries diagnosed with stage III non-small-cell lung cancer (NSCLC) between 2000 and 2005. Demographic and clinical factors associated with use of CT simulation were identified, and the impact of CT simulation on survival was analyzed by using Cox models and propensity score analysis. The proportion of patients treated with TRT who had CT simulation increased from 2.4% in 1994 to 34.0% in 2000 to 77.6% in 2005. Of the 5,540 patients treated with TRT from 2000 to 2005, 60.1% had CT simulation. Geographic variation was seen in rates of CT simulation, with lower rates in rural areas and in the South and West compared with those in the Northeast and Midwest. Patients treated with chemotherapy were more likely to have CT simulation (65.2% v 51.2%; adjusted odds ratio, 1.67; 95% CI, 1.48 to 1.88; P < .01), although there was no significant association between use of surgery and CT simulation. Controlling for demographic and clinical characteristics, CT simulation was associated with lower risk of death (adjusted hazard ratio, 0.77; 95% CI, 0.73 to 0.82; P < .01) compared with conventional simulation. CT-based simulation has been widely, although not uniformly, adopted for the treatment of stage III NSCLC and is associated with higher survival among patients receiving TRT.

  3. Near real-time traffic routing

    NASA Technical Reports Server (NTRS)

    Yang, Chaowei (Inventor); Xie, Jibo (Inventor); Zhou, Bin (Inventor); Cao, Ying (Inventor)

    2012-01-01

    A near real-time physical transportation network routing system comprising: a traffic simulation computing grid and a dynamic traffic routing service computing grid. The traffic simulator produces traffic network travel time predictions for a physical transportation network using a traffic simulation model and common input data. The physical transportation network is divided into multiple sections; each section has a primary zone and a buffer zone. The traffic simulation computing grid includes multiple traffic simulation computing nodes. The common input data include static network characteristics, an origin-destination data table, dynamic traffic information data, and historical traffic data. The dynamic traffic routing service computing grid includes multiple dynamic traffic routing computing nodes and generates traffic route(s) using the traffic network travel time predictions.

  4. Radiotherapy Monte Carlo simulation using cloud computing technology.

    PubMed

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
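
    The reported cost behaviour (completion time ~ 1/n, cost optimal when n is a factor of the total simulation hours) follows directly if each machine is billed in whole machine-hours; a sketch under that assumed billing model:

    import math

    def cloud_cost(total_cpu_hours, n_machines, rate_per_machine_hour=1.0):
        """Wall time and cost when work parallelizes ideally across n machines
        and each machine is billed in whole hours (assumed billing model)."""
        wall = total_cpu_hours / n_machines
        cost = n_machines * math.ceil(wall) * rate_per_machine_hour
        return wall, cost

    # For a 24 CPU-hour simulation, cost is minimal when n divides 24:
    for n in (1, 2, 3, 4, 5, 6, 8, 12):
        wall, cost = cloud_cost(24.0, n)
        print(f"n = {n:2d}: wall = {wall:5.2f} h, cost = {cost:5.1f}")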

  5. On efficiency of fire simulation realization: parallelization with greater number of computational meshes

    NASA Astrophysics Data System (ADS)

    Valasek, Lukas; Glasa, Jan

    2017-12-01

    Current fire simulation systems are capable of exploiting the advantages of available high-performance computing (HPC) platforms to model fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC cluster is discussed. The parallel MPI version of the Fire Dynamics Simulator is used to test the efficiency of selected strategies for allocating the cluster's computational resources using a greater number of computational cores. Simulation results indicate that if the number of cores used is not equal to a multiple of the total number of cluster node cores, there are allocation strategies that provide more efficient calculations.

  6. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.

  7. A Parallel Sliding Region Algorithm to Make Agent-Based Modeling Possible for a Large-Scale Simulation: Modeling Hepatitis C Epidemics in Canada.

    PubMed

    Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla

    2016-11-01

    Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, the main limitation of ABMs is the high computational cost for a large-scale simulation. To improve the computational efficiency for large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABM and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on the real demographic data from Saskatchewan, Canada. The first simulation used the SRA that processed on each postal code subregion subsequently. The second simulation processed the entire population simultaneously. It was concluded that the parallelizable SRA showed computational time saving with comparable results in a province-wide simulation. Using the same method, SRA can be generalized for performing a country-wide simulation. Thus, this parallel algorithm enables the possibility of using ABM for large-scale simulation with limited computational resources.
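
    The SRA partitions the population by postal-code subregion so that the subregions can be processed independently; a minimal sketch of that parallel-over-subregions pattern using Python's multiprocessing (the per-region agent update is a placeholder, not the hepatitis C model):

    from multiprocessing import Pool

    def simulate_subregion(args):
        """Placeholder agent update for one postal-code subregion."""
        region_id, agents = args
        flagged = sum(1 for a in agents if a % 7 == 0)  # stand-in dynamics
        return region_id, flagged

    if __name__ == "__main__":
        regions = [(i, list(range(i * 1000, i * 1000 + 1000))) for i in range(8)]
        with Pool(processes=4) as pool:
            for region_id, flagged in pool.map(simulate_subregion, regions):
                print(f"region {region_id}: {flagged} flagged agents")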

  8. An analysis of the 70-meter antenna hydrostatic bearing by means of computer simulation

    NASA Technical Reports Server (NTRS)

    Bartos, R. D.

    1993-01-01

    Recently, the computer program 'A Computer Solution for Hydrostatic Bearings with Variable Film Thickness,' used to design the hydrostatic bearing of the 70-meter antennas, was modified to improve the accuracy with which the program predicts the film height profile and oil pressure distribution between the hydrostatic bearing pad and the runner. This article presents a description of the modified computer program, the theory upon which its computations are based, the computer simulation results, and a discussion of those results.

  10. Structure and dynamics of aqueous solutions from PBE-based first-principles molecular dynamics simulations

    DOE PAGES

    Pham, Tuan Anh; Ogitsu, Tadashi; Lau, Edmond Y.; ...

    2016-10-17

    Establishing an accurate and predictive computational framework for the description of complex aqueous solutions is an ongoing challenge for density functional theory based first-principles molecular dynamics (FPMD) simulations. In this context, important advances have been made in recent years, including the development of sophisticated exchange-correlation functionals. On the other hand, simulations based on simple generalized gradient approximation (GGA) functionals remain an active field, particularly in the study of complex aqueous solutions, due to a good balance between accuracy, computational expense, and applicability to a wide range of systems. Such simulations are often performed at elevated temperatures to artificially “correct” for GGA inaccuracies in the description of liquid water; however, a detailed understanding of how the choice of temperature affects the structure and dynamics of other components, such as solvated ions, is largely lacking. In order to address this question, we carried out a series of FPMD simulations at temperatures ranging from 300 to 460 K for liquid water and three representative aqueous solutions containing solvated Na+, K+, and Cl- ions. We show that simulations at 390–400 K with the Perdew-Burke-Ernzerhof (PBE) exchange-correlation functional yield water structure and dynamics in good agreement with experiments at ambient conditions. Simultaneously, this computational setup provides ion solvation structures and ion effects on water dynamics consistent with experiments. These results suggest that an elevated temperature around 390–400 K with the PBE functional can be used for the description of structural and dynamical properties of liquid water and complex solutions with solvated ions at ambient conditions.

  11. Comparative Implementation of High Performance Computing for Power System Dynamic Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Huang, Zhenyu; Diao, Ruisheng

    Dynamic simulation for transient stability assessment is one of the most important, but intensive, computations for power system planning and operation. Present commercial software is mainly designed for sequential computation to run a single simulation, which is very time-consuming with a single processor. The application of High Performance Computing (HPC) to dynamic simulations is very promising in accelerating the computing process by parallelizing its kernel algorithms while maintaining the same level of computational accuracy. This paper describes the comparative implementation of four parallel dynamic simulation schemes in two state-of-the-art HPC environments: Message Passing Interface (MPI) and Open Multi-Processing (OpenMP). These implementations serve to match the application with dedicated multi-processor computing hardware and maximize the utilization and benefits of HPC during the development process.

  12. The effectiveness of interactive computer simulations on college engineering student conceptual understanding and problem-solving ability related to circular motion

    NASA Astrophysics Data System (ADS)

    Chien, Cheng-Chih

    Over the past thirty years, individual studies have found the effectiveness of computer-assisted learning to vary. Today, with drastic technical improvement, computers have been widely spread in schools and used in a variety of ways. In this study, a design model involving educational technology, pedagogy, and content domain is proposed for effective use of computers in learning. Computer simulation, constructivist and Vygotskian perspectives, and circular motion are the three elements of the specific Chain Model for instructional design. The goal of the physics course is to help students remove ideas that are not consistent with those of the physics community and rebuild new knowledge. To achieve the learning goal, the strategies of using conceptual conflicts and using language to internalize specific tasks into mental functions were included. Computer simulations and accompanying worksheets were used to help students explore their own ideas and to generate questions for discussions. Using animated images to describe the dynamic processes involved in circular motion may reduce the complexity and possible miscommunications resulting from verbal explanations. The effectiveness of the instructional material on student learning is evaluated. The results of the problem-solving activities show that students using computer simulations had significantly higher scores than students not using computer simulations. For conceptual understanding, on the pretest students in the non-simulation group had significantly higher scores than students in the simulation group; there was no significant difference observed between the two groups on the posttest. The relations of gender, prior physics experience, and frequency of computer use outside the course to student achievement were also studied. There were fewer female students than male students and fewer students using computer simulations than students not using computer simulations; these characteristics affect the statistical power for detecting differences. In future research, more simulation interventions may be introduced to explore the potential of computer simulation to help students learn. A test for conceptual understanding with more problems and an appropriate difficulty level may be needed.

  13. A Two-Step Method to Select Major Surge-Producing Extratropical Cyclones from a 10,000-Year Stochastic Catalog

    NASA Astrophysics Data System (ADS)

    Keshtpoor, M.; Carnacina, I.; Yablonsky, R. M.

    2016-12-01

    Extratropical cyclones (ETCs) are the primary driver of storm surge events along the UK and northwest mainland Europe coastlines. In an effort to evaluate the storm surge risk in coastal communities in this region, a stochastic catalog is developed by perturbing the historical storm seeds of European ETCs to account for 10,000 years of possible ETCs. Numerical simulation of the storm surge generated by the full 10,000-year stochastic catalog, however, is computationally expensive and may take several months to complete with available computational resources. A new statistical regression model is developed to select the major surge-generating events from the stochastic ETC catalog. This regression model is based on the maximum storm surge, obtained via numerical simulations using a calibrated version of the Delft3D-FM hydrodynamic model with a relatively coarse mesh, of 1750 historical ETC events that occurred over the past 38 years in Europe. These numerically-simulated surge values were regressed against the local sea level pressure and the U and V components of the wind field at the locations of 196 tide gauge stations near the UK and northwest mainland Europe coastal areas. The regression model suggests that storm surge values in the area of interest are highly correlated with the U- and V-components of wind speed, as well as the sea level pressure. Based on these correlations, the regression model was then used to select surge-generating storms from the 10,000-year stochastic catalog. Results suggest that roughly 105,000 of the 480,000 stochastic storms are surge-generating events and need to be considered for numerical simulation using a hydrodynamic model. The selected stochastic storms were then simulated in Delft3D-FM, and the final refinement of the storm population was performed based on return period analysis of the 1750 historical event simulations at each of the 196 tide gauges, in preparation for Delft3D-FM fine mesh simulations.
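
    A minimal sketch of the two-step idea: ordinary least squares of peak surge on the wind components and pressure, then thresholding the predictions to screen the catalog. All data here are synthetic placeholders, and the 0.5 m cutoff is an assumed illustration, not the study's criterion:

    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic training set: columns are [U wind, V wind, pressure anomaly]
    X = rng.normal(size=(1750, 3)) * [10.0, 10.0, 15.0]
    surge = 0.04*X[:, 0] + 0.06*X[:, 1] - 0.03*X[:, 2] + rng.normal(0, 0.1, 1750)

    A = np.column_stack([X, np.ones(len(X))])  # add an intercept column
    coef, *_ = np.linalg.lstsq(A, surge, rcond=None)

    # Screen a (synthetic) stochastic catalog: keep predicted surge > 0.5 m
    catalog = rng.normal(size=(480_000, 3)) * [10.0, 10.0, 15.0]
    pred = np.column_stack([catalog, np.ones(len(catalog))]) @ coef
    print(f"selected {np.sum(pred > 0.5)} of {len(catalog)} stochastic events")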

  14. High performance transcription factor-DNA docking with GPU computing

    PubMed Central

    2012-01-01

    Background Protein-DNA docking is a very challenging problem in structural bioinformatics and has important implications in a number of applications, such as structure-based prediction of transcription factor binding sites and rational drug design. Protein-DNA docking is very computationally demanding due to the high cost of energy calculation and the statistical nature of conformational sampling algorithms. More importantly, experiments show that the docking quality depends on the coverage of the conformational sampling space. It is therefore desirable to accelerate the computation of the docking algorithm, not only to reduce computing time, but also to improve docking quality. Methods In an attempt to accelerate the sampling process and to improve the docking performance, we developed a graphics processing unit (GPU)-based protein-DNA docking algorithm. The algorithm employs a potential-based energy function to describe the binding affinity of a protein-DNA pair, and integrates Monte-Carlo simulation and a simulated annealing method to search through the conformational space. Algorithmic techniques were developed to improve the computation efficiency and scalability on GPU-based high performance computing systems. Results The effectiveness of our approach is tested on a non-redundant set of 75 TF-DNA complexes and a newly developed TF-DNA docking benchmark. We demonstrated that the GPU-based docking algorithm can significantly accelerate the simulation process and thereby improve the chance of finding near-native TF-DNA complex structures. This study also suggests that further improvement in protein-DNA docking research would require efforts from two integral aspects: improvements in computational efficiency and in energy function design. Conclusions We present a high performance computing approach for improving the prediction accuracy of protein-DNA docking. The GPU-based docking algorithm accelerates the search of the conformational space and thus increases the chance of finding more near-native structures. To the best of our knowledge, this is the first effort to apply GPUs or GPU clusters to the protein-DNA docking problem. PMID:22759575
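
    The sampling engine combines Monte-Carlo moves, the Metropolis acceptance rule, and a simulated annealing schedule; a generic one-dimensional sketch (the energy function below is a stand-in for the potential-based affinity score, not the paper's function):

    import math
    import random

    rng = random.Random(7)

    def energy(x):
        """Placeholder 1D energy landscape standing in for the docking score."""
        return (x**2 - 1.0)**2 + 0.3*math.sin(8*x)

    def anneal(x, t0=2.0, cooling=0.95, sweeps=200, step=0.5):
        t = t0
        for _ in range(sweeps):
            trial = x + rng.uniform(-step, step)     # random Monte-Carlo move
            dE = energy(trial) - energy(x)
            if dE < 0 or rng.random() < math.exp(-dE / t):  # Metropolis criterion
                x = trial
            t *= cooling                              # geometric cooling schedule
        return x

    x = anneal(rng.uniform(-2, 2))
    print(f"final pose coordinate: {x:.3f}, energy: {energy(x):.3f}")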

  15. Paper simulation techniques in user requirements analysis for interactive computer systems

    NASA Technical Reports Server (NTRS)

    Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.

    1979-01-01

    This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist, but is simulated by the experimenters. This allows simulated problem solving early in the design effort, and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.

  16. Computational neurobiology is a useful tool in translational neurology: the example of ataxia

    PubMed Central

    Brown, Sherry-Ann; McCullough, Louise D.; Loew, Leslie M.

    2014-01-01

    Hereditary ataxia, or motor incoordination, affects approximately 150,000 Americans and hundreds of thousands of individuals worldwide, with onset from as early as mid-childhood. Affected individuals exhibit dysarthria, dysmetria, action tremor, and dysdiadochokinesia. In this review, we consider an array of computational studies derived from experimental observations relevant to human neuropathology. A survey of related studies illustrates the impact of integrating clinical evidence with data from mouse models and computational simulations. Results from these studies may help explain findings in mice, and after extensive laboratory study, may ultimately be translated to ataxic individuals. This inquiry lays a foundation for using computation to understand the neurobiochemical and electrophysiological pathophysiology of spinocerebellar ataxias and may contribute to the development of therapeutics. The interdisciplinary analysis suggests that computational neurobiology can be an important tool for translational neurology. PMID:25653585

  17. Acceleration of the matrix multiplication of Radiance three phase daylighting simulations with parallel computing on heterogeneous hardware of personal computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael

    2013-05-23

    Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configuration, the simulation can take hours or even days using a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by conducting parallel computing on the heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.
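
    For orientation, the three-phase method expresses interior illuminance as a chain of matrix products, commonly written i = V T D s, where V is the view matrix, T the window transmission (BSDF) matrix, D the daylight matrix, and s a sky vector; this matrix chain is the portion offloaded to the CPU/GPU in the paper. The sketch below evaluates the chain with numpy; the matrix dimensions are illustrative assumptions (145 resembles a Klems basis, 2306 a subdivided sky), not values from the paper.

        # Three-phase illuminance as a matrix chain, i = V @ T @ D @ s.
        import numpy as np

        rng = np.random.default_rng(0)
        n_sensors, n_window, n_sky = 1000, 145, 2306   # illustrative sizes

        V = rng.random((n_sensors, n_window))   # sensors <- window directions
        T = rng.random((n_window, n_window))    # window BSDF
        D = rng.random((n_window, n_sky))       # window <- sky patches
        s = rng.random(n_sky)                   # sky radiance, one timestep

        # Multiplying right-to-left keeps intermediates vector-sized instead
        # of forming large dense matrix products.
        i = V @ (T @ (D @ s))
        print(i.shape)                          # (1000,) illuminance values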

  18. Shock-Wave/Boundary-Layer Interactions in Hypersonic Low Density Flows

    NASA Technical Reports Server (NTRS)

    Moss, James N.; Olejniczak, Joseph

    2004-01-01

    Results of numerical simulations of Mach 10 air flow over a hollow cylinder-flare and a double-cone are presented where viscous effects are significant. The flow phenomena include shock-shock and shock/boundary-layer interactions with accompanying flow separation, recirculation, and reattachment. The purpose of this study is to promote an understanding of the fundamental gas dynamics resulting from such complex interactions and to clarify the requirements for meaningful simulations of such flows when using the direct simulation Monte Carlo (DSMC) method. Particular emphasis is placed on the sensitivity of computed results to grid resolution. Comparisons of the DSMC results for the hollow cylinder-flare (30 deg.) configuration are made with the results of experimental measurements conducted in the ONERA R5Ch wind tunnel for heating, pressure, and the extent of separation. Agreement between computations and measurements for various quantities is good except for pressure. For the same flow conditions, the double-cone geometry (25 deg.-65 deg.) produces much stronger interactions, and these interactions are investigated numerically using both DSMC and Navier-Stokes codes. For the double-cone computations, a two-orders-of-magnitude variation in free-stream density (with Reynolds numbers from 247 to 24,719) is investigated using both computational methods. For this range of flow conditions, the computational results are in qualitative agreement for the extent of separation, with the DSMC method always predicting a smaller separation region. Results from the Navier-Stokes calculations suggest that the flow for the highest density double-cone case may be unsteady; however, the DSMC solution does not show evidence of unsteadiness.

  19. Researcher's guide to the NASA Ames Flight Simulator for Advanced Aircraft (FSAA)

    NASA Technical Reports Server (NTRS)

    Sinacori, J. B.; Stapleford, R. L.; Jewell, W. F.; Lehman, J. M.

    1977-01-01

    Performance, limitations, supporting software, and current checkout and operating procedures are presented for the flight simulator, in terms useful to the researcher who intends to use it. Suggestions to help the researcher prepare the experimental plan are also given. The FSAA's central computer, cockpit, and visual and motion systems are addressed individually, but their interaction is considered as well. Data required, available options, user responsibilities, and occupancy procedures are given in a form that facilitates the initial communication required with the NASA operations group.

  20. Computational composite mechanics for aerospace propulsion structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    Specialty methods are presented for the computational simulation of specific composite behavior. These methods encompass all aspects of composite mechanics, impact, progressive fracture and component specific simulation. Some of these methods are structured to computationally simulate, in parallel, the composite behavior and history from the initial fabrication through several missions and even to fracture. Select methods and typical results obtained from such simulations are described in detail in order to demonstrate the effectiveness of computationally simulating (1) complex composite structural behavior in general and (2) specific aerospace propulsion structural components in particular.

  1. Computational composite mechanics for aerospace propulsion structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1987-01-01

    Specialty methods are presented for the computational simulation of specific composite behavior. These methods encompass all aspects of composite mechanics, impact, progressive fracture and component specific simulation. Some of these methods are structured to computationally simulate, in parallel, the composite behavior and history from the initial fabrication through several missions and even to fracture. Select methods and typical results obtained from such simulations are described in detail in order to demonstrate the effectiveness of computationally simulating: (1) complex composite structural behavior in general, and (2) specific aerospace propulsion structural components in particular.

  2. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  3. Kinetic Monte Carlo Simulation of Oxygen Diffusion in Ytterbium Disilicate

    NASA Astrophysics Data System (ADS)

    Good, Brian

    2015-03-01

    Ytterbium disilicate is of interest as a potential environmental barrier coating for aerospace applications, notably for use in next generation jet turbine engines. In such applications, the diffusion of oxygen and water vapor through these coatings is undesirable if high temperature corrosion is to be avoided. In an effort to understand the diffusion process in these materials, we have performed kinetic Monte Carlo simulations of vacancy-mediated oxygen diffusion in ytterbium disilicate. Oxygen vacancy site energies and diffusion barrier energies are computed using density functional theory. We find that many potential diffusion paths involve large barrier energies, but some paths have barrier energies smaller than one electron volt. However, computed vacancy formation energies suggest that the intrinsic vacancy concentration is small in the pure material, with the result that the material is unlikely to exhibit significant oxygen permeability.
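
    The basic kinetic Monte Carlo step behind such a study is standard: hop rates follow an Arrhenius form k = nu * exp(-Eb / (kB*T)), one event is chosen with probability proportional to its rate, and the clock advances by an exponentially distributed residence time. The sketch below shows only that step; the attempt frequency and barrier energies are illustrative assumptions, not the DFT-computed values of the paper.

        # One rejection-free kinetic Monte Carlo step with Arrhenius rates.
        import math
        import random

        KB = 8.617e-5      # Boltzmann constant [eV/K]
        NU = 1.0e13        # attempt frequency [1/s], a typical assumed value

        def kmc_step(barriers_ev, temperature_k, rng):
            rates = [NU * math.exp(-eb / (KB * temperature_k)) for eb in barriers_ev]
            total = sum(rates)
            # Choose event i with probability rates[i] / total.
            r = rng.random() * total
            acc = 0.0
            for i, k in enumerate(rates):
                acc += k
                if r <= acc:
                    chosen = i
                    break
            dt = -math.log(rng.random()) / total   # residence time [s]
            return chosen, dt

        rng = random.Random(42)
        barriers = [0.8, 1.2, 0.95]    # hypothetical hop barriers [eV]
        event, dt = kmc_step(barriers, 1500.0, rng)
        print(event, dt)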

  4. Enterprise virtual private network (VPN) with dense wavelength division multiplexing (DWDM) design

    NASA Astrophysics Data System (ADS)

    Carranza, Aparicio

    An innovative computer simulation and modeling tool for metropolitan area optical data communication networks is presented. These models address the unique requirements of Virtual Private Networks for enterprise data centers, which may comprise a mixture of protocols including ESCON, FICON, Fibre Channel, Sysplex protocols (ETR, CLO, ISC), and other links interconnected over dark fiber using Dense Wavelength Division Multiplexing (DWDM). Our models can design a network from minimal inputs, compute optical link budgets, suggest alternative configurations, and optimize the design based on user-defined performance metrics. The models make use of Time Division Multiplexing (TDM) wherever possible for lower-data-rate traffic. Simulation results for several configurations are presented, and they have been validated by means of experiments conducted on the IBM enterprise network testbed in Poughkeepsie, N.Y.
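
    An optical link budget of the kind such a tool computes is simple arithmetic: subtract fiber attenuation and component insertion losses from the launch power and compare against receiver sensitivity. The function name and all numeric defaults below are illustrative assumptions, not parameters from the paper.

        # Hedged sketch of a DWDM link-budget check (all values illustrative).
        def link_margin_db(tx_dbm, km, fiber_db_per_km=0.25,
                           dwdm_insertion_db=5.0, connectors_db=1.0,
                           rx_sensitivity_dbm=-28.0):
            rx_dbm = (tx_dbm - km * fiber_db_per_km
                      - dwdm_insertion_db - connectors_db)
            return rx_dbm - rx_sensitivity_dbm

        # A 60 km dark-fiber span at 0 dBm launch power:
        margin = link_margin_db(tx_dbm=0.0, km=60.0)
        print(f"link margin: {margin:.1f} dB")   # positive margin -> feasible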

  5. MAGIC Computer Simulation. Volume 2: Analyst Manual, Part 1

    DTIC Science & Technology

    1971-05-01

    A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army. The MAGIC computer simulation generates target description data consisting of item-by-item listings of the target's components and air

  6. Computational simulation of progressive fracture in fiber composites

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    Computational methods for simulating and predicting progressive fracture in fiber composite structures are presented. These methods are integrated into a computer code of modular form. The modules include composite mechanics, finite element analysis, and fracture criteria. The code is used to computationally simulate progressive fracture in composite laminates with and without defects. The simulation tracks the fracture progression in terms of modes initiating fracture, damage growth, and imminent global (catastrophic) laminate fracture.

  7. Using Microcomputers Simulations in the Classroom: Examples from Undergraduate and Faculty Computer Literacy Courses.

    ERIC Educational Resources Information Center

    Hart, Jeffrey A.

    1985-01-01

    Presents a discussion of how computer simulations are used in two undergraduate social science courses and a faculty computer literacy course on simulations and artificial intelligence. Includes a list of 60 simulations for use on mainframes and microcomputers. Entries include type of hardware required, publisher's address, and cost. Sample…

  8. Learning Oceanography from a Computer Simulation Compared with Direct Experience at Sea

    ERIC Educational Resources Information Center

    Winn, William; Stahr, Frederick; Sarason, Christian; Fruland, Ruth; Oppenheimer, Peter; Lee, Yen-Ling

    2006-01-01

    Considerable research has compared how students learn science from computer simulations with how they learn from "traditional" classes. Little research has compared how students learn science from computer simulations with how they learn from direct experience in the real environment on which the simulations are based. This study compared two…

  9. Effect of Computer Simulations at the Particulate and Macroscopic Levels on Students' Understanding of the Particulate Nature of Matter

    ERIC Educational Resources Information Center

    Tang, Hui; Abraham, Michael R.

    2016-01-01

    Computer-based simulations can help students visualize chemical representations and understand chemistry concepts, but simulations at different levels of representation may vary in effectiveness on student learning. This study investigated the influence of computer activities that simulate chemical reactions at different levels of representation…

  10. Developability assessment of clinical drug products with maximum absorbable doses.

    PubMed

    Ding, Xuan; Rose, John P; Van Gelder, Jan

    2012-05-10

    Maximum absorbable dose refers to the maximum amount of an orally administered drug that can be absorbed in the gastrointestinal tract. Maximum absorbable dose, or D(abs), has proved to be an important parameter for quantifying the absorption potential of drug candidates. The purpose of this work is to validate the use of D(abs) in a developability assessment context, and to establish an appropriate protocol and interpretation criteria for this application. Three methods for calculating D(abs) were compared by assessing how well the methods predicted the absorption limit for a set of real clinical candidates. D(abs) was calculated for these clinical candidates by means of a simple equation and two computer simulation programs: GastroPlus and a program developed at Eli Lilly and Company. Results from single dose escalation studies in Phase I clinical trials were analyzed to identify the maximum absorbable doses for these compounds. Compared to the clinical results, the equation and both simulation programs provide conservative estimates of D(abs), but in general the D(abs) values from the computer simulations are more accurate, an obvious advantage for the simulations in developability assessment. Computer simulations also revealed the complex behavior associated with absorption saturation and suggested that in most cases the D(abs) limit is not likely to be reached in a typical clinical dose range. On the basis of the validation findings, an approach is proposed for assessing absorption potential, and best practices are discussed for the use of D(abs) estimates to inform clinical formulation development strategies. Copyright © 2012 Elsevier B.V. All rights reserved.
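
    The "simple equation" for maximum absorbable dose is commonly written in the developability literature as Dabs = S * ka * SIV * SITT (solubility times absorption rate constant times small-intestinal volume times transit time); whether the paper uses exactly this form is not stated, and the parameter values below are typical literature defaults used purely for illustration.

        # Hedged worked example of the commonly cited Dabs equation.
        def d_abs_mg(solubility_mg_per_ml, ka_per_min,
                     siv_ml=250.0, sitt_min=270.0):
            # solubility [mg/mL] * ka [1/min] * intestinal volume [mL]
            # * transit time [min] -> absorbable amount [mg]
            return solubility_mg_per_ml * ka_per_min * siv_ml * sitt_min

        # Hypothetical compound: 0.05 mg/mL solubility, ka = 0.02 /min
        print(f"Dabs ~ {d_abs_mg(0.05, 0.02):.0f} mg")   # ~68 mg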

  11. Momentum Distribution as a Fingerprint of Quantum Delocalization in Enzymatic Reactions: Open-Chain Path-Integral Simulations of Model Systems and the Hydride Transfer in Dihydrofolate Reductase.

    PubMed

    Engel, Hamutal; Doron, Dvir; Kohen, Amnon; Major, Dan Thomas

    2012-04-10

    The inclusion of nuclear quantum effects such as zero-point energy and tunneling is of great importance in studying condensed phase chemical reactions involving the transfer of protons, hydrogen atoms, and hydride ions. In the current work, we derive an efficient quantum simulation approach for the computation of the momentum distribution in condensed phase chemical reactions. The method is based on a quantum-classical approach wherein quantum and classical simulations are performed separately. The classical simulations use standard sampling techniques, whereas the quantum simulations employ an open polymer chain path integral formulation which is computed using an efficient Monte Carlo staging algorithm. The approach is validated by applying it to a one-dimensional harmonic oscillator and symmetric double-well potential. Subsequently, the method is applied to the dihydrofolate reductase (DHFR) catalyzed reduction of 7,8-dihydrofolate by nicotinamide adenine dinucleotide phosphate hydride (NADPH) to yield S-5,6,7,8-tetrahydrofolate and NADP(+). The key chemical step in the catalytic cycle of DHFR involves a stereospecific hydride transfer. In order to estimate the amount of quantum delocalization, we compute the position and momentum distributions for the transferring hydride ion in the reactant state (RS) and transition state (TS) using a recently developed hybrid semiempirical quantum mechanics-molecular mechanics potential energy surface. Additionally, we examine the effect of compression of the donor-acceptor distance (DAD) in the TS on the momentum distribution. The present results suggest differential quantum delocalization in the RS and TS, as well as reduced tunneling upon DAD compression.

  12. Understanding resonance graphs using Easy Java Simulations (EJS) and why we use EJS

    NASA Astrophysics Data System (ADS)

    Wee, Loo Kang; Lee, Tat Leong; Chew, Charles; Wong, Darren; Tan, Samuel

    2015-03-01

    This paper reports a computer model simulation created using Easy Java Simulation (EJS) for learners to visualize how the steady-state amplitude of a driven oscillating system varies with the frequency of the periodic driving force. The simulation shows (N = 100) identical spring-mass systems being subjected to (1) a periodic driving force of equal amplitude but different driving frequencies, and (2) different amounts of damping. The simulation aims to create a visually intuitive way of understanding how the series of amplitude versus driving frequency graphs are obtained, by showing how the displacement of the system changes over time as it transits from the transient to the steady state. A suggested 'how to use' guide is added to help educators and students in their teaching and learning, in which we explain the theoretical steady-state equation, the time conditions under which the model begins recording maximum amplitudes so that they closely match the theoretical equation, and the steps to collect runs with different degrees of damping. We also discuss two of the design features in our computer model: displaying the instantaneous oscillation together with the achieved steady-state amplitudes, and the explicit overlay of the world view with the scientific representation for runs with different degrees of damping. Three advantages of using EJS include: (1) open source code and creative commons attribution licenses for scaling up of interactively engaging educational practices; (2) the models made can run on almost any device, including Android and iOS; and (3) it allows the redefinition of physics educational practices through computer modeling.
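
    The steady-state curve the EJS model visualizes has the standard driven-damped-oscillator closed form A(w) = (F0/m) / sqrt((w0^2 - w^2)^2 + (b*w/m)^2). The short sketch below evaluates that formula for several damping levels; all parameter values are illustrative, not taken from the EJS model.

        # Steady-state amplitude of a driven, damped spring-mass system.
        import numpy as np

        def steady_state_amplitude(w, w0=2.0, f0_over_m=1.0, b_over_m=0.2):
            return f0_over_m / np.sqrt((w0**2 - w**2)**2 + (b_over_m * w)**2)

        w = np.linspace(0.1, 4.0, 200)
        for b_over_m in (0.1, 0.5, 1.0):       # increasing damping
            a = steady_state_amplitude(w, b_over_m=b_over_m)
            w_peak = w[np.argmax(a)]
            print(f"b/m = {b_over_m}: peak {a.max():.2f} at w = {w_peak:.2f}")

    Running the loop shows the resonance peak dropping and shifting below the natural frequency as damping grows, which is exactly the family of amplitude-versus-driving-frequency graphs the simulation displays.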

  13. Computational design of a thermostable mutant of cocaine esterase via molecular dynamics simulations.

    PubMed

    Huang, Xiaoqin; Gao, Daquan; Zhan, Chang-Guo

    2011-06-07

    Cocaine esterase (CocE) has been known as the most efficient native enzyme for metabolizing naturally occurring cocaine. A major obstacle to the clinical application of CocE is the thermoinstability of native CocE with a half-life of only ∼11 min at physiological temperature (37 °C). It is highly desirable to develop a thermostable mutant of CocE for therapeutic treatment of cocaine overdose and addiction. To establish a structure-thermostability relationship, we carried out molecular dynamics (MD) simulations at 400 K on wild-type CocE and previously known thermostable mutants, demonstrating that the thermostability of the active form of the enzyme correlates with the fluctuation (characterized as the root-mean square deviation and root-mean square fluctuation of atomic positions) of the catalytic residues (Y44, S117, Y118, H287, and D259) in the simulated enzyme. In light of the structure-thermostability correlation, further computational modelling including MD simulations at 400 K predicted that the active site structure of the L169K mutant should be more thermostable. The prediction has been confirmed by wet experimental tests showing that the active form of the L169K mutant had a half-life of 570 min at 37 °C, which is significantly longer than those of the wild-type and previously known thermostable mutants. The encouraging outcome suggests that the high-temperature MD simulations and the structure-thermostability relationship may be considered as a valuable tool for the computational design of thermostable mutants of an enzyme.
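
    The thermostability metric described, positional fluctuation of the catalytic residues across a high-temperature MD trajectory, characterized as RMSD and RMSF, is straightforward bookkeeping once a trajectory is aligned. The numpy sketch below shows that bookkeeping only; the array shapes and random "coordinates" are illustrative stand-ins, not data from the simulations.

        # RMSF per atom and RMSD per frame for an already-aligned trajectory.
        import numpy as np

        rng = np.random.default_rng(0)
        n_frames, n_atoms = 500, 40        # e.g. atoms of the catalytic residues
        traj = rng.normal(0.0, 0.3, (n_frames, n_atoms, 3))   # fake coords [nm]

        mean_pos = traj.mean(axis=0)       # time-averaged position per atom

        # RMSF: RMS deviation of each atom from its mean position over time.
        rmsf = np.sqrt(((traj - mean_pos) ** 2).sum(axis=2).mean(axis=0))

        # RMSD: per-frame RMS deviation from the mean structure.
        rmsd = np.sqrt(((traj - mean_pos) ** 2).sum(axis=2).mean(axis=1))

        print(f"mean RMSF {rmsf.mean():.3f} nm, max RMSD {rmsd.max():.3f} nm")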

  14. Computational Models of Laryngeal Aerodynamics: Potentials and Numerical Costs.

    PubMed

    Sadeghi, Hossein; Kniesburges, Stefan; Kaltenbacher, Manfred; Schützenberger, Anne; Döllinger, Michael

    2018-02-07

    Human phonation is based on the interaction between tracheal airflow and laryngeal dynamics. This fluid-structure interaction is based on the energy exchange between airflow and vocal folds. Major challenges in analyzing the phonatory process in vivo are the small dimensions and the poor accessibility of the region of interest. For improved analysis of the phonatory process, numerical simulations of the airflow and the vocal fold dynamics have been suggested. Even though most of the models reproduce the phonatory process fairly well, development of comprehensive larynx models is still a subject of research. In the context of clinical application, physiological accuracy and computational model efficiency are of great interest. In this study, a simple numerical larynx model is introduced that incorporates the laryngeal fluid flow. It is based on a synthetic experimental model with silicone vocal folds. The degree of realism was successively increased in separate computational models, and each model was simulated for 10 oscillation cycles. Results show that relevant features of the laryngeal flow field, such as glottal jet deflection, develop even when applying rather simple static models with oscillating flow rates. Including further phonatory components such as vocal fold motion, mucosal wave propagation, and ventricular folds, the simulations show phonatory key features like intraglottal flow separation and increased flow rate in the presence of ventricular folds. The simulation time on 100 CPU cores ranged between 25 and 290 hours, currently restricting clinical application of these models. Nevertheless, the results show the high potential of numerical simulations for better understanding of the phonatory process. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  15. Large-Scale Brain Simulation and Disorders of Consciousness. Mapping Technical and Conceptual Issues.

    PubMed

    Farisco, Michele; Kotaleski, Jeanette H; Evers, Kathinka

    2018-01-01

    Modeling and simulations have gained a leading position in contemporary attempts to describe, explain, and quantitatively predict the human brain's operations. Computer models are highly sophisticated tools developed to achieve an integrated knowledge of the brain with the aim of overcoming the actual fragmentation resulting from different neuroscientific approaches. In this paper we investigate the plausibility of simulation technologies for emulation of consciousness and the potential clinical impact of large-scale brain simulation on the assessment and care of disorders of consciousness (DOCs), e.g., Coma, Vegetative State/Unresponsive Wakefulness Syndrome, Minimally Conscious State. Notwithstanding their technical limitations, we suggest that simulation technologies may offer new solutions to old practical problems, particularly in clinical contexts. We take DOCs as an illustrative case, arguing that the simulation of neural correlates of consciousness is potentially useful for improving treatments of patients with DOCs.

  16. Large-Scale Brain Simulation and Disorders of Consciousness. Mapping Technical and Conceptual Issues

    PubMed Central

    Farisco, Michele; Kotaleski, Jeanette H.; Evers, Kathinka

    2018-01-01

    Modeling and simulations have gained a leading position in contemporary attempts to describe, explain, and quantitatively predict the human brain’s operations. Computer models are highly sophisticated tools developed to achieve an integrated knowledge of the brain with the aim of overcoming the actual fragmentation resulting from different neuroscientific approaches. In this paper we investigate the plausibility of simulation technologies for emulation of consciousness and the potential clinical impact of large-scale brain simulation on the assessment and care of disorders of consciousness (DOCs), e.g., Coma, Vegetative State/Unresponsive Wakefulness Syndrome, Minimally Conscious State. Notwithstanding their technical limitations, we suggest that simulation technologies may offer new solutions to old practical problems, particularly in clinical contexts. We take DOCs as an illustrative case, arguing that the simulation of neural correlates of consciousness is potentially useful for improving treatments of patients with DOCs. PMID:29740372

  17. Skin hydration analysis by experiment and computer simulations and its implications for diapered skin.

    PubMed

    Saadatmand, M; Stone, K J; Vega, V N; Felter, S; Ventura, S; Kasting, G; Jaworska, J

    2017-11-01

    Experimental work on skin hydration is technologically challenging, and mostly limited to observations where environmental conditions are constant. In some cases, like diapered baby skin, such work is practically unfeasible, yet it is important to understand potential effects of diapering on skin condition. To overcome this challenge, in part, we developed a computer simulation model of reversible transient skin hydration effects. The skin hydration model of Li et al. (Chem Eng Sci, 138, 2015, 164) was further developed to simulate transient exposure conditions where relative humidity (RH), wind velocity, and air and skin temperature can be any function of time. Computer simulations of evaporative water loss (EWL) decay after different occlusion times were compared with experimental data to calibrate the model. Next, we used the model to investigate EWL and SC thickness in different diapering scenarios. Key results from the experimental work were: (1) for occlusions by RH=100% and free water longer than 30 minutes, the absorbed amount of water is almost the same; (2) longer occlusion times result in higher water absorption by the SC. The EWL decay and skin water content predictions were in agreement with experimental data. Simulations also revealed that skin under occlusion hydrates mainly because the outflux is blocked, not because it absorbs water from the environment. Further, simulations demonstrated that hydration level is sensitive to time, RH, and/or free water on skin. In simulated diapering scenarios, skin maintained hydration content very close to the baseline conditions without a diaper for the entire duration of a 24-hour period. Different diapers and diaper technologies are known to have different profiles in terms of their ability to provide wetness protection, which can result in consumer-noticeable differences in wetness. Simulation results based on published literature using data from a number of different diapers suggest that diapered skin hydrates within ranges considered reversible. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. The use of computer simulations in whole-class versus small-group settings

    NASA Astrophysics Data System (ADS)

    Smetana, Lara Kathleen

    This study explored the use of computer simulations in a whole-class as compared to small-group setting. Specific consideration was given to the nature and impact of classroom conversations and interactions when computer simulations were incorporated into a high school chemistry course. This investigation fills a need for qualitative research that focuses on the social dimensions of actual classrooms. Participants included a novice chemistry teacher experienced in the use of educational technologies and two honors chemistry classes. The study was conducted in a rural school in the south-Atlantic United States at the end of the fall 2007 semester. The study took place during one instructional unit on atomic structure. Data collection allowed for triangulation of evidence from a variety of sources: approximately 24 hours of video- and audio-taped classroom observations, supplemented with the researcher's field notes and analytic journal; miscellaneous classroom artifacts such as class notes, worksheets, and assignments; open-ended pre- and post-assessments; student exit interviews; and teacher entrance, exit, and informal interviews. Four web-based simulations were used, three of which were from the ExploreLearning collection. Assessments were analyzed using descriptive statistics, and classroom observations, artifacts, and interviews were analyzed using Erickson's (1986) guidelines for analytic induction. Conversational analysis was guided by methods outlined by Erickson (1982). Findings indicated that (a) the teacher effectively incorporated simulations in both settings; (b) students in both groups significantly improved their understanding of the chemistry concepts; (c) there was no statistically significant difference between the groups' achievement; (d) there was more frequent exploratory talk in the whole-class group; (e) there were more frequent and meaningful teacher-student interactions in the whole-class group; (f) additional learning experiences not measured on the assessment resulted from conversations and interactions in the whole-class setting; and (g) the potential benefits of exploratory talk in the whole-class setting were not fully realized. These findings suggest that both whole-class and small-group settings are appropriate for using computer simulations in science. The effective incorporation of simulations into whole-class instruction may provide a solution to the dilemma of technology penetration versus integration in today's classrooms.

  19. A generalized methodology for identification of threshold for HRU delineation in SWAT model

    NASA Astrophysics Data System (ADS)

    M, J.; Sudheer, K.; Chaubey, I.; Raj, C.

    2016-12-01

    The Soil and Water Assessment Tool (SWAT) is a comprehensive distributed hydrological model widely used to support a variety of decisions. The simulation accuracy of a distributed hydrological model depends on the mechanism used to subdivide the watershed. SWAT subdivides the watershed and its sub-basins into small computing units known as hydrologic response units (HRUs). HRUs are delineated based on unique combinations of land use, soil type, and slope within the sub-watersheds, and are not spatially contiguous. Computations in SWAT are performed at the HRU level and then aggregated up to the sub-basin outlet, which is routed through the stream system. Generally, HRUs are delineated using threshold percentages of land use, soil, and slope specified by the modeler to decrease the computation time of the model. The thresholds constrain the minimum area for constructing an HRU. In the current HRU delineation practice in SWAT, any land use, soil, or slope class within a sub-basin that covers less than the predefined threshold is subsumed by the dominant class, which introduces some ambiguity into the process simulations in the form of an inappropriate representation of the area. The loss of information due to variation in the threshold values depends highly on the purpose of the study. This research therefore studies the effects of HRU delineation threshold values on SWAT sediment simulations and suggests guidelines for selecting appropriate threshold values with respect to sediment simulation accuracy. A preliminary study was done on the Illinois watershed by assigning different thresholds for land use and soil. A general methodology is proposed for identifying an appropriate threshold for HRU delineation in the SWAT model that considers both computational time and simulation accuracy. The methodology can be adopted for identifying an appropriate threshold for SWAT model simulation in any watershed with a single simulation of the model at a zero-zero threshold.
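
    Mechanically, the HRU threshold drops sub-threshold classes within a sub-basin and redistributes their area proportionally over the classes that remain, which is the source of the representation error discussed above. The sketch below illustrates that step for land use; the class names, fractions, and threshold are illustrative assumptions.

        # Apply an HRU area threshold within one sub-basin: drop classes
        # below the threshold and renormalize the remaining fractions.
        def apply_threshold(fractions, threshold):
            kept = {k: v for k, v in fractions.items() if v >= threshold}
            total = sum(kept.values())
            # Removed area is redistributed proportionally over kept classes.
            return {k: v / total for k, v in kept.items()}

        landuse = {"forest": 0.55, "pasture": 0.30, "urban": 0.10, "wetland": 0.05}
        print(apply_threshold(landuse, threshold=0.10))
        # wetland (5%) disappears; forest/pasture/urban absorb its area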

  20. Macrosegregation Resulting from Directional Solidification Through an Abrupt Change in Cross-Sections

    NASA Technical Reports Server (NTRS)

    Lauer, M.; Poirier, D. R.; Ghods, M.; Tewari, S. N.; Grugel, R. N.

    2017-01-01

    Simulations of the directional solidification of two hypoeutectic alloys (Al-7Si and Al-19Cu) and the resulting macrosegregation patterns are presented. The casting geometries include abrupt changes in cross-section, from a larger width of 9.5 mm to a narrower 3.2 mm width, and then through an expansion back to a width of 9.5 mm. The alloys were chosen as model alloys because they have similar solidification shrinkages, but the effect of Cu on changing the density of the liquid alloy is about an order of magnitude greater than that of Si. The simulations compare well with experimental castings that were directionally solidified in a graphite mold in a Bridgman furnace. In addition to the simulations of the directional solidification in graphite molds, some simulations were performed for solidification in an alumina mold. This study showed that the mold must be included in numerical simulations of directional solidification because of its effect on the temperature field and solidification. For the model alloys used in the study, the simulations clearly show the interaction of the convection field with the solidifying alloys to produce a macrosegregation pattern known as "steepling" in sections with a uniform width. Details of the complex convection and segregation patterns at both the contraction and expansion of the cross-sectional area are revealed by the computer simulations. The convection and solidification through the expansions suggest a possible mechanism for the formation of stray grains. The computer simulations and the experimental castings have been part of on-going ground-based research with the goal of providing necessary background for eventual experiments aboard the ISS. For casting practitioners, the results demonstrate that computer simulations should be applied to reveal interactions between alloy solidification properties, solidification conditions, and mold geometries on macrosegregation. The simulations also present the possibility of engineering the mold material to avoid, or mitigate, the effects of thermosolutal convection and macrosegregation by selecting a mold material with suitable thermal properties, especially its thermal conductivity.

  1. Computational and experimental analysis of short peptide motifs for enzyme inhibition.

    PubMed

    Fu, Jinglin; Larini, Luca; Cooper, Anthony J; Whittaker, John W; Ahmed, Azka; Dong, Junhao; Lee, Minyoung; Zhang, Ting

    2017-01-01

    The metabolism of living systems involves many enzymes that play key roles as catalysts and are essential to biological function. Searching for ligands with the ability to modulate enzyme activities is central to diagnosis and therapeutics. Peptides represent a promising class of potential enzyme modulators due to their large chemical diversity and well-established methods for library synthesis. Peptides and their derivatives are found to play critical roles in modulating enzymes and mediating cellular uptake, which is increasingly valuable in therapeutics. We present a methodology that uses molecular dynamics (MD) and point-variant screening to identify short peptide motifs that are critical for inhibiting β-galactosidase (β-Gal). MD was used to simulate the conformations of peptides and to suggest short motifs that were most populated in the simulated conformations. The function of the simulated motifs was further validated by experimental point-variant screening as critical segments for inhibiting the enzyme. Based on the validated motifs, we eventually identified a 7-mer short peptide that inhibits the enzyme with a low μM IC50. The advantage of our methodology is a relatively simple simulation that is informative enough to identify the critical sequence of a peptide inhibitor, with a precision comparable to truncation and alanine scanning experiments. Our combined experimental and computational approach does not rely on a detailed understanding of mechanistic and structural details. The MD simulation suggests the populated motifs that are consistent with the results of the experimental alanine and truncation scanning. This approach appears to be applicable to both natural and artificial peptides. As more short motifs are discovered in the future, they could be exploited for modulating biocatalysis and for developing new medicines.

  2. Computer-based simulation training in emergency medicine designed in the light of malpractice cases.

    PubMed

    Karakuş, Akan; Duran, Latif; Yavuz, Yücel; Altintop, Levent; Calişkan, Fatih

    2014-07-27

    Using computer-based simulation systems in medical education is becoming more and more common. Although the benefits of practicing with these systems in medical education have been demonstrated, the advantages of using computer-based simulation in emergency medicine education are less validated. The aim of the present study was to assess the success rates of final year medical students in providing emergency medical treatment and to evaluate the effectiveness of computer-based simulation training in improving final year medical students' knowledge. Twenty-four students trained with computer-based simulation and completed at least 4 hours of simulation-based education between Feb 1, 2010 and May 1, 2010. A control group (traditionally trained, n = 24) was also chosen. After the end of training, students completed an examination covering 5 randomized medical simulation cases. Across the 5 cases, an average of 3.9 correct medical approaches was carried out by the computer-based simulation trained students, versus an average of 2.8 correct medical approaches by the traditionally trained group (t = 3.90, p < 0.005). We found that the success of simulation-trained students in cases requiring a complicated medical approach was statistically higher than that of students who did not receive simulation training (p ≤ 0.05). Computer-based simulation training appears to be significantly effective for learning medical treatment algorithms. These programs may improve the success rate of students, especially in delivering an adequate medical approach to complex emergency cases.

  3. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience

    PubMed Central

    Stockton, David B.; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175

  4. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  5. From biological neural networks to thinking machines: Transitioning biological organizational principles to computer technology

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.

    1991-01-01

    The three-dimensional organization of the vestibular macula is under study by computer assisted reconstruction and simulation methods as a model for more complex neural systems. One goal of this research is to transition knowledge of biological neural network architecture and functioning to computer technology, to contribute to the development of thinking computers. Maculas are organized as weighted neural networks for parallel distributed processing of information. The network is characterized by non-linearity of its terminal/receptive fields. Wiring appears to develop through constrained randomness. A further property is the presence of two main circuits, highly channeled and distributed modifying, that are connected through feedforward-feedback collaterals and a biasing subcircuit. Computer simulations demonstrate that differences in geometry of the feedback (afferent) collaterals affect the timing and the magnitude of voltage changes delivered to the spike initiation zone. Feedforward (efferent) collaterals act as voltage followers and likely inhibit neurons of the distributed modifying circuit. These results illustrate the importance of feedforward-feedback loops, of timing, and of inhibition in refining neural network output. They also suggest that it is the distributed modifying network that is most involved in adaptation, memory, and learning. Tests of macular adaptation, through hyper- and microgravitational studies, support this hypothesis, since synapses in the distributed modifying circuit, but not the channeled circuit, are altered. Transitioning knowledge of biological systems to computer technology, however, remains problematical.

  6. Methodology of modeling and measuring computer architectures for plasma simulations

    NASA Technical Reports Server (NTRS)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology, SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  7. A Computer-Based Simulation of an Acid-Base Titration

    ERIC Educational Resources Information Center

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

  8. Software for Brain Network Simulations: A Comparative Study

    PubMed Central

    Tikidji-Hamburyan, Ruben A.; Narayana, Vikram; Bozkus, Zeki; El-Ghazawi, Tarek A.

    2017-01-01

    Numerical simulations of brain networks are a critical part of our efforts in understanding brain functions under pathological and normal conditions. For several decades, the community has developed many software packages and simulators to accelerate research in computational neuroscience. In this article, we select the three most popular simulators, as determined by the number of models in the ModelDB database, namely NEURON, GENESIS, and BRIAN, and perform an independent evaluation of these simulators. In addition, we study NEST, one of the lead simulators of the Human Brain Project. First, we study them based on one of the most important characteristics, the range of supported models. Our investigation reveals that brain network simulators may be biased toward supporting a specific set of models. However, all simulators tend to expand the supported range of models by providing a universal environment for the computational study of individual neurons and brain networks. Next, our investigations on the characteristics of computational architecture and efficiency indicate that all simulators compile the most computationally intensive procedures into binary code, with the aim of maximizing their computational performance. However, not all simulators provide the simplest method for module development and/or guarantee efficient binary code. Third, a study of their amenability for high-performance computing reveals that NEST can almost transparently map an existing model on a cluster or multicore computer, while NEURON requires code modification if the model developed for a single computer has to be mapped on a computational cluster. Interestingly, parallelization is the weakest characteristic of BRIAN, which provides no support for cluster computations and limited support for multicore computers. Fourth, we identify the level of user support and frequency of usage for all simulators. Finally, we carry out an evaluation using two case studies: a large network with simplified neural and synaptic models and a small network with detailed models. These two case studies allow us to avoid any bias toward a particular software package. The results indicate that BRIAN provides the most concise language for both cases considered. Furthermore, as expected, NEST mostly favors large network models, while NEURON is better suited for detailed models. Overall, the case studies reinforce our general observation that simulators have a bias in the computational performance toward specific types of the brain network models. PMID:28775687

  9. Bounding the electrostatic free energies associated with linear continuum models of molecular solvation.

    PubMed

    Bardhan, Jaydeep P; Knepley, Matthew G; Anitescu, Mihai

    2009-03-14

    The importance of electrostatic interactions in molecular biology has driven extensive research toward the development of accurate and efficient theoretical and computational models. Linear continuum electrostatic theory has been surprisingly successful, but the computational costs associated with solving the associated partial differential equations (PDEs) preclude the theory's use in most dynamical simulations. Modern generalized-Born models for electrostatics can reproduce PDE-based calculations to within a few percent and are extremely computationally efficient but do not always faithfully reproduce interactions between chemical groups. Recent work has shown that a boundary-integral-equation formulation of the PDE problem leads naturally to a new approach called boundary-integral-based electrostatics estimation (BIBEE) to approximate electrostatic interactions. In the present paper, we prove that the BIBEE method can be used to rigorously bound the actual continuum-theory electrostatic free energy. The bounds are validated using a set of more than 600 proteins. Detailed numerical results are presented for structures of the peptide met-enkephalin taken from a molecular-dynamics simulation. These bounds, in combination with our demonstration that the BIBEE methods accurately reproduce pairwise interactions, suggest a new approach toward building a highly accurate yet computationally tractable electrostatic model.

  10. Bounding the electrostatic free energies associated with linear continuum models of molecular solvation

    NASA Astrophysics Data System (ADS)

    Bardhan, Jaydeep P.; Knepley, Matthew G.; Anitescu, Mihai

    2009-03-01

    The importance of electrostatic interactions in molecular biology has driven extensive research toward the development of accurate and efficient theoretical and computational models. Linear continuum electrostatic theory has been surprisingly successful, but the computational costs associated with solving the associated partial differential equations (PDEs) preclude the theory's use in most dynamical simulations. Modern generalized-Born models for electrostatics can reproduce PDE-based calculations to within a few percent and are extremely computationally efficient but do not always faithfully reproduce interactions between chemical groups. Recent work has shown that a boundary-integral-equation formulation of the PDE problem leads naturally to a new approach called boundary-integral-based electrostatics estimation (BIBEE) to approximate electrostatic interactions. In the present paper, we prove that the BIBEE method can be used to rigorously bound the actual continuum-theory electrostatic free energy. The bounds are validated using a set of more than 600 proteins. Detailed numerical results are presented for structures of the peptide met-enkephalin taken from a molecular-dynamics simulation. These bounds, in combination with our demonstration that the BIBEE methods accurately reproduce pairwise interactions, suggest a new approach toward building a highly accurate yet computationally tractable electrostatic model.
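
    For context on the generalized-Born models the abstract contrasts with PDE-based continuum theory, the standard Still et al. pairwise form is Ggb = -1/2 (1/eps_in - 1/eps_out) sum_ij qi qj / fgb(rij), with fgb = sqrt(r^2 + ai aj exp(-r^2 / (4 ai aj))). The sketch below evaluates that textbook formula; the charges, Born radii, and coordinates are made-up inputs, and this is not the BIBEE method of the paper.

        # Still et al. generalized-Born solvation energy (illustrative inputs).
        import numpy as np

        COULOMB = 332.06    # kcal*angstrom / (mol*e^2)

        def gb_energy(coords, charges, born_radii, eps_in=1.0, eps_out=80.0):
            pref = -0.5 * COULOMB * (1.0 / eps_in - 1.0 / eps_out)
            e = 0.0
            n = len(charges)
            for i in range(n):        # double sum over ordered pairs; the
                for j in range(n):    # i == j terms give the Born self-energy
                    r2 = float(np.sum((coords[i] - coords[j]) ** 2))
                    aiaj = born_radii[i] * born_radii[j]
                    fgb = np.sqrt(r2 + aiaj * np.exp(-r2 / (4.0 * aiaj)))
                    e += pref * charges[i] * charges[j] / fgb
            return e

        coords = np.array([[0.0, 0.0, 0.0], [4.0, 0.0, 0.0]])   # angstroms
        charges = np.array([0.5, -0.5])                         # e
        born = np.array([1.5, 1.7])                             # angstroms
        print(f"GB solvation energy: {gb_energy(coords, charges, born):.2f} kcal/mol")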

  11. Computational knee ligament modeling using experimentally determined zero-load lengths.

    PubMed

    Bloemker, Katherine H; Guess, Trent M; Maletsky, Lorin; Dodd, Kevin

    2012-01-01

    This study presents a subject-specific method of determining the zero-load lengths of the cruciate and collateral ligaments in computational knee modeling. Three cadaver knees were tested in a dynamic knee simulator. The cadaver knees also underwent manual envelope-of-motion testing to find their passive range of motion in order to determine the zero-load lengths for each ligament bundle. Computational multibody knee models were created for each knee, and model kinematics were compared to experimental kinematics for a simulated walk cycle. One-dimensional non-linear spring-damper elements were used to represent cruciate and collateral ligament bundles in the knee models. This study found that knee kinematics were highly sensitive to alterations of the zero-load length. The results also suggest optimal methods for defining each of the ligament bundle zero-load lengths, regardless of the subject. These results verify the importance of the zero-load length when modeling the knee joint and verify that manual envelope-of-motion measurements can be used to determine the passive range of motion of the knee joint. It is also believed that the method described here for determining zero-load length can be used for in vitro or in vivo subject-specific computational models.
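
    The sensitivity to the zero-load length L0 is easy to see in the kind of piecewise nonlinear ligament spring commonly used in such models (a Blankevoort-type element with a quadratic toe region and a linear region); this sketch is a generic illustration, not the specific element of the paper, and the stiffness, strain limit, and lengths are made up.

        # Blankevoort-type ligament-bundle force as a function of strain
        # relative to the zero-load length L0 (all values illustrative).
        def ligament_force(length, zero_load_length, k=2000.0, eps_l=0.03):
            eps = (length - zero_load_length) / zero_load_length  # strain
            if eps <= 0.0:
                return 0.0                          # slack bundle, no load
            if eps <= 2.0 * eps_l:
                return 0.25 * k * eps * eps / eps_l # nonlinear toe region
            return k * (eps - eps_l)                # linear region

        # A 1 mm error in the zero-load length changes the force markedly:
        for L0 in (49.0, 50.0):                     # mm, hypothetical
            print(L0, round(ligament_force(52.0, L0), 1))
        # -> roughly 62 N vs 27 N for the same 52 mm bundle length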

  12. Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmalz, Mark S

    2011-07-24

    Statement of Problem - The Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G' for a high-performance architecture. Key computational and data movement kernels of the application were analyzed and optimized for parallel execution using the mapping G → G', which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in the solution of problems related to efficient parallel computation of particle and fluid dynamics simulations. These problems occur throughout DOE, military, and commercial sectors; the potential payoff is high. We plan to license or sell the solution to contractors for military and domestic applications such as disaster simulation (aerodynamic and hydrodynamic), Government agencies (hydrological and environmental simulations), and medical applications (e.g., tomographic image reconstruction). Keywords - High-performance Computing, Graphics Processing Unit, Fluid/Particle Simulation. Summary for Members of Congress - The Department of Energy has many simulation codes that must compute faster to be effective. The Phase I research parallelized particle/fluid simulations of rocket combustion for high-performance computing systems.

  13. Implementation of Parallel Dynamic Simulation on Shared-Memory vs. Distributed-Memory Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Chen, Yousu; Wu, Di

    2015-12-09

    Power system dynamic simulation computes the system response to a sequence of large disturbances, such as sudden changes in generation or load, or a network short circuit followed by protective branch switching operations. It consists of a large set of differential and algebraic equations, which is computationally intensive and challenging to solve using a single-processor-based dynamic simulation solution. High-performance computing (HPC) based parallel computing is a very promising technology to speed up the computation and facilitate the simulation process. This paper presents two different parallel implementations of power grid dynamic simulation using Open Multi-processing (OpenMP) on a shared-memory platform, and Message Passing Interface (MPI) on distributed-memory clusters, respectively. The differences between the parallel simulation algorithms and architectures of the two HPC technologies are illustrated, and their performances for running parallel dynamic simulation are compared and demonstrated.
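
    To make the contrast concrete, the sketch below mimics the two decompositions in Python: a multiprocessing pool stands in for the OpenMP shared-memory case and mpi4py for the distributed-memory MPI case. integrate_bus is a hypothetical per-component integration kernel, not code from the paper.

        from multiprocessing import Pool

        def integrate_bus(state):
            # placeholder for one differential/algebraic update of a grid component
            return 0.99 * state

        def shared_memory_step(states):
            # OpenMP-like: workers on one node operate on a shared list of states
            with Pool() as pool:
                return pool.map(integrate_bus, states)

        def distributed_step(states):
            # MPI-like: states are scattered across nodes and gathered afterwards
            from mpi4py import MPI  # requires launching under mpirun/mpiexec
            comm = MPI.COMM_WORLD
            chunks = ([states[i::comm.size] for i in range(comm.size)]
                      if comm.rank == 0 else None)
            local = comm.scatter(chunks, root=0)
            local = [integrate_bus(s) for s in local]
            return comm.gather(local, root=0)

    The shared-memory version avoids explicit communication but is limited to one node's cores and memory; the MPI version scales out at the cost of the scatter/gather traffic visible above.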

  14. Molecular determinants for the thermodynamic and functional divergence of uniporter GLUT1 and proton symporter XylE

    PubMed Central

    Ke, Meng; Jiang, Xin; Yan, Nieng

    2017-01-01

    GLUT1 facilitates the down-gradient translocation of D-glucose across the cell membrane in mammals. XylE, an Escherichia coli homolog of GLUT1, utilizes a proton gradient as an energy source to drive uphill D-xylose transport. Previous studies of XylE and GLUT1 suggest that the variation between an acidic residue (Asp27 in XylE) and a neutral one (Asn29 in GLUT1) is a key element for their mechanistic divergence. In this work, we combined computational and biochemical approaches to investigate the mechanism of proton coupling by XylE and the functional divergence between GLUT1 and XylE. Using molecular dynamics simulations, we evaluated the free energy profiles of the transition between inward- and outward-facing conformations for the apo proteins. Our results revealed a correlation between the protonation state and conformational preference in XylE, which is supported by the crystal structures. In addition, our simulations suggested a thermodynamic difference between XylE and GLUT1 that cannot be explained by the single residue variation at the protonation site. To understand the molecular basis, we applied Bayesian network models to analyze the alteration in the architecture of the hydrogen bond networks during the conformational transition. The models and subsequent experimental validation suggest that multiple residue substitutions are required to produce the thermodynamic and functional distinction between XylE and GLUT1. Despite the lack of simulation studies with substrates, these computational and biochemical characterizations provide unprecedented insight into the mechanistic difference between proton symporters and uniporters. PMID:28617850

  15. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    PubMed

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented here to perform ultra-fast MC calculations in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results are aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by the single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed-up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
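
    Since each particle history is independent, the master/worker decomposition described above is embarrassingly parallel. The sketch below shows the pattern with mpi4py; run_histories is a placeholder for the EGS5 transport kernel, not code from the study.

        from mpi4py import MPI
        import numpy as np

        def run_histories(n, seed):
            # placeholder "transport kernel": deposits random scores instead of
            # simulating real photon/electron physics
            rng = np.random.default_rng(seed)
            return rng.random(n).sum()

        comm = MPI.COMM_WORLD
        n_total = 1_000_000                    # as in the 1-million-electron example
        n_local = n_total // comm.size         # independent histories split evenly
        tally = run_histories(n_local, seed=comm.rank)
        total = comm.reduce(tally, op=MPI.SUM, root=0)
        if comm.rank == 0:
            print("aggregate tally:", total)

    The reported scaling follows directly from this independence: 2.58 h is about 155 min, and 155/3.3 ≈ 47, the quoted speed-up on 100 nodes.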

  16. Development of digital phantoms based on a finite element model to simulate low-attenuation areas in CT imaging for pulmonary emphysema quantification.

    PubMed

    Diciotti, Stefano; Nobis, Alessandro; Ciulli, Stefano; Landini, Nicholas; Mascalchi, Mario; Sverzellati, Nicola; Innocenti, Bernardo

    2017-09-01

    To develop an innovative finite element (FE) model of lung parenchyma which simulates pulmonary emphysema on CT imaging. The model aims to generate a set of digital phantoms of low-attenuation area (LAA) images with different grades of emphysema severity. Four individual parameter configurations simulating different grades of emphysema severity were utilized to generate 40 FE models, using ten randomizations for each setting. We compared two measures of emphysema severity (the relative area (RA) and the exponent D of the cumulative distribution function of LAA cluster sizes) between the simulated LAA images and those computed directly on the models' output (considered as the reference). The LAA images obtained from our model output can simulate CT-LAA images in subjects with different grades of emphysema severity. Both RA and D computed on simulated LAA images were underestimated compared to those calculated on the models' output, suggesting that measurements in CT imaging may not be accurate in the assessment of real emphysema extent. Our model is able to mimic the cluster size distribution of LAA on CT imaging of subjects with pulmonary emphysema. The model could be useful for generating standard test images and for designing physical phantoms of LAA images to assess the accuracy of indexes for the radiologic quantitation of emphysema.
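
    For reference, the two indices compared above can be computed from a thresholded LAA mask roughly as follows. The -950 HU threshold and the power-law form Y(s) ~ s^(-D) for the cumulative cluster-size distribution are conventional choices in the emphysema literature; the snippet is a generic sketch, not the authors' implementation.

        import numpy as np
        from scipy import ndimage

        def emphysema_indices(ct_slice, lung_mask, threshold=-950.0):
            laa = (ct_slice < threshold) & lung_mask
            ra = 100.0 * laa.sum() / lung_mask.sum()          # relative area, %

            labels, n = ndimage.label(laa)                    # connected LAA clusters
            sizes = ndimage.sum(laa, labels, index=range(1, n + 1))
            s = np.sort(np.unique(sizes))
            y = np.array([(sizes >= si).mean() for si in s])  # cumulative fraction
            d_slope, _ = np.polyfit(np.log(s), np.log(y), 1)  # log-log fit
            return ra, -d_slope                               # D = -slope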

  17. Estimating patient-specific soft-tissue properties in a TKA knee.

    PubMed

    Ewing, Joseph A; Kaufman, Michelle K; Hutter, Erin E; Granger, Jeffrey F; Beal, Matthew D; Piazza, Stephen J; Siston, Robert A

    2016-03-01

    Surgical technique is one factor that has been identified as critical to the success of total knee arthroplasty. Researchers have shown that computer simulations can aid in determining how decisions in the operating room generally affect post-operative outcomes. However, to use simulations to make clinically relevant predictions about knee forces and motions for a specific total knee patient, patient-specific models are needed. This study introduces a methodology for estimating the knee soft-tissue properties of an individual total knee patient. A custom surgical navigation system and stability device were used to measure the force-displacement relationship of the knee. Soft-tissue properties were estimated using a parameter optimization that matched simulated tibiofemoral kinematics with experimental tibiofemoral kinematics. Simulations using optimized ligament properties had an average root mean square error of 3.5° across all tests, while simulations using generic ligament properties taken from the literature had an average root mean square error of 8.4°. Specimens showed large variability among ligament properties regardless of similarities in prosthetic component alignment and measured knee laxity. These results demonstrate the importance of soft-tissue properties in determining knee stability, and suggest that to make clinically relevant predictions of post-operative knee motions and forces using computer simulations, patient-specific soft-tissue properties are needed.
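
    The parameter optimization described above can be sketched as a least-squares fit: ligament properties are adjusted until simulated kinematics reproduce the measured ones. simulate_kinematics below is a hypothetical stand-in for the authors' multibody knee simulation, and the synthetic data exist only to make the sketch runnable.

        import numpy as np
        from scipy.optimize import minimize

        def simulate_kinematics(params, loads):
            # placeholder forward model; in the study this is the full multibody
            # knee simulation driven by the applied laxity loads
            return loads @ params

        def rmse_cost(params, loads, measured):
            sim = simulate_kinematics(params, loads)
            return np.sqrt(np.mean((sim - measured) ** 2))

        loads = np.random.default_rng(0).random((20, 8))   # synthetic laxity tests
        measured = loads @ np.full(8, 0.5)                 # synthetic "experiment"
        x0 = np.ones(8)                                    # generic starting guess
        fit = minimize(rmse_cost, x0, args=(loads, measured), method="Nelder-Mead")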

  18. Finite-Size Effects of Binary Mutual Diffusion Coefficients from Molecular Dynamics

    PubMed Central

    2018-01-01

    Molecular dynamics simulations were performed to predict the finite-size effects of Maxwell-Stefan diffusion coefficients of molecular mixtures and a wide variety of binary Lennard-Jones systems. A strong dependency of computed diffusivities on the system size was observed. Computed diffusivities were found to increase with the number of molecules. We propose a correction for the extrapolation of Maxwell-Stefan diffusion coefficients to the thermodynamic limit, based on the study by Yeh and Hummer (J. Phys. Chem. B, 2004, 108, 15873-15879). The proposed correction is a function of the viscosity of the system, the size of the simulation box, and the thermodynamic factor, which is a measure of the nonideality of the mixture. Verification is carried out for more than 200 distinct binary Lennard-Jones systems, as well as 9 binary systems of methanol, water, ethanol, acetone, methylamine, and carbon tetrachloride. Significant deviations between finite-size Maxwell-Stefan diffusivities and the corresponding diffusivities at the thermodynamic limit were found for mixtures close to demixing. In these cases, the finite-size correction can be even larger than the simulated (finite-size) Maxwell-Stefan diffusivity. Our results show that considering these finite-size effects is crucial and that the suggested correction allows for reliable computations. PMID:29664633
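
    For self-diffusion, the Yeh-Hummer correction adds xi*kB*T/(6*pi*eta*L) to the finite-box value, with xi ≈ 2.837297 for a cubic box. Following the abstract, the Maxwell-Stefan version additionally involves the thermodynamic factor Gamma; the sketch below assumes the correction term is divided by Gamma, which matches the published form of this correction but should be checked against the paper.

        import math

        KB = 1.380649e-23   # Boltzmann constant, J/K
        XI = 2.837297       # Yeh-Hummer constant for a cubic periodic box

        def ms_diffusivity_infinite(d_ms_box, box_length, viscosity,
                                    temperature, gamma):
            """Extrapolate a finite-box Maxwell-Stefan diffusivity (m^2/s) to the
            thermodynamic limit. gamma is the thermodynamic factor (1 for an
            ideal mixture); viscosity in Pa*s, box_length in m, temperature in K."""
            correction = XI * KB * temperature / (6.0 * math.pi *
                                                  viscosity * box_length)
            return d_ms_box + correction / gamma

    Because Gamma tends to zero close to demixing, the correction grows without bound there, consistent with the observation above that it can exceed the simulated finite-size diffusivity.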

  19. Older People's Perceptions of Pedestrian Friendliness and Traffic Safety: An Experiment Using Computer-Simulated Walking Environments.

    PubMed

    Kahlert, Daniela; Schlicht, Wolfgang

    2015-08-21

    Traffic safety and pedestrian friendliness are considered to be important conditions for older people's motivation to walk through their environment. This study uses an experimental study design with computer-simulated living environments to investigate the effect of micro-scale environmental factors (parking spaces and green verges with trees) on older people's perceptions of both motivational antecedents (dependent variables). Seventy-four consecutively recruited older people were randomly assigned to watch one of two scenarios (independent variable) on a computer screen. The scenarios simulated a stroll on a sidewalk, as is 'typical' of a German city. In version 'A,' the subjects take a fictive walk along a sidewalk on which a number of cars are partially parked. In version 'B,' cars are in parking spaces separated from the sidewalk by grass verges and trees. Subjects assessed their impressions of both dependent variables. A multivariate analysis of covariance showed that subjects' ratings of perceived traffic safety and pedestrian friendliness were higher for version 'B' than for version 'A.' Cohen's d indicates medium (d = 0.73) and large (d = 1.23) effect sizes for traffic safety and pedestrian friendliness, respectively. The study suggests that elements of the built environment might affect motivational antecedents of older people's walking behavior.

  20. Assessment of Near-Field Sonic Boom Simulation Tools

    NASA Technical Reports Server (NTRS)

    Casper, J. H.; Cliff, S. E.; Thomas, S. D.; Park, M. A.; McMullen, M. S.; Melton, J. E.; Durston, D. A.

    2008-01-01

    A recent study for the Supersonics Project, within the National Aeronautics and Space Administration, has been conducted to assess current in-house capabilities for the prediction of near-field sonic boom. Such capabilities are required to simulate the highly nonlinear flow near an aircraft, wherein a sonic-boom signature is generated. There are many available computational fluid dynamics codes that could be used to provide the near-field flow for a sonic boom calculation. However, such codes have typically been developed for applications involving aerodynamic configuration, for which an efficiently generated computational mesh is usually not optimum for a sonic boom prediction. Preliminary guidelines are suggested to characterize a state-of-the-art sonic boom prediction methodology. The available simulation tools that are best suited to incorporate into that methodology are identified; preliminary test cases are presented in support of the selection. During this phase of process definition and tool selection, parallel research was conducted in an attempt to establish criteria that link the properties of a computational mesh to the accuracy of a sonic boom prediction. Such properties include sufficient grid density near shocks and within the zone of influence, which are achieved by adaptation and mesh refinement strategies. Prediction accuracy is validated by comparison with wind tunnel data.

  1. A Computational Model for Path Loss in Wireless Sensor Networks in Orchard Environments

    PubMed Central

    Anastassiu, Hristos T.; Vougioukas, Stavros; Fronimos, Theodoros; Regen, Christian; Petrou, Loukas; Zude, Manuela; Käthner, Jana

    2014-01-01

    A computational model for radio wave propagation through tree orchards is presented. Trees are modeled as collections of branches, geometrically approximated by cylinders, whose dimensions are determined on the basis of measurements in a cherry orchard. Tree canopies are modeled as dielectric spheres of appropriate size. A single row of trees was modeled by creating copies of a representative tree model positioned on top of a rectangular, lossy dielectric slab that simulated the ground. The complete scattering model, including soil and trees, enhanced by periodicity conditions corresponding to the array, was characterized via a commercial computational software tool that simulates wave propagation by means of the Finite Element Method. The attenuation of the simulated signal was compared to measurements taken in the cherry orchard, using two ZigBee receiver-transmitter modules. Near the top of the tree canopies (at 3 m), the predicted attenuation was close to the measured one, only slightly underestimated. However, at 1.5 m the solver underestimated the measured attenuation significantly, especially when leaves were present and as distances grew longer. This suggests that the effects of scattering from neighboring tree rows need to be incorporated into the model. However, complex geometries result in ill-conditioned linear systems that affect the solver's convergence. PMID:24625738

  2. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.

    PubMed

    Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui

    2017-01-08

    Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered a comprehensive data-intensive and computing-intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, a pattern that greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves a 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy improves performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing-intensive and data-intensive issues in SAR raw data simulation, and is easily extended to large-scale computing to achieve higher acceleration.
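
    The accumulation pattern that motivates the MapReduce design can be sketched as follows: each map task computes the partial echo contributed by a chunk of scatterers, and the reduce step coherently sums the complex returns. The geometry and signal model are toy placeholders, not the paper's kernels, and a real deployment would express map_chunk/reduce_echoes as Hadoop jobs over HDFS.

        import numpy as np
        from functools import reduce

        N_PULSES, N_BINS = 64, 128

        def map_chunk(scatterers):
            # map task: partial raw-data matrix from one chunk of scatterers
            partial = np.zeros((N_PULSES, N_BINS), dtype=complex)
            for amp, phase, pulse, rbin in scatterers:
                partial[pulse, rbin] += amp * np.exp(1j * phase)
            return partial

        def reduce_echoes(a, b):
            return a + b   # coherent accumulation of partial echoes

        rng = np.random.default_rng(1)
        chunks = [[(rng.random(), rng.random() * 2 * np.pi,
                    rng.integers(N_PULSES), rng.integers(N_BINS))
                   for _ in range(100)] for _ in range(8)]
        raw_data = reduce(reduce_echoes, (map_chunk(c) for c in chunks))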

  3. Evaluation of the impact of carotid artery bifurcation angle on hemodynamics by use of computational fluid dynamics: a simulation and volunteer study.

    PubMed

    Saho, Tatsunori; Onishi, Hideo

    2016-07-01

    In this study, we evaluated the hemodynamics of carotid artery bifurcations with various geometries using simulated and volunteer models based on magnetic resonance imaging (MRI). Computational fluid dynamics (CFD) was analyzed by use of OpenFOAM. The velocity distribution, streamlines, and wall shear stress (WSS) were evaluated in a simulated model with known bifurcation angles (30°, 40°, 50°, 60°, derived from patients' data) and in three-dimensional (3D) healthy volunteer models. Separated flow was observed at the outer side of the bifurcation, and models with larger bifurcation angles showed upstream migration of the separation point. Local WSS values at the outer bifurcation [both simulated (<30 Pa) and volunteer (<50 Pa) models] were lower than those in the inner region (>100 Pa). The bifurcation angle had a significant negative correlation with the WSS value (p<0.05). The results of this study show that the carotid artery bifurcation angle is related to the WSS value. This suggests that hemodynamic stress can be estimated based on the carotid artery geometry. The construction of a clinical database for estimating the risk of developing atherosclerosis is warranted.

  4. Free-energy landscape of intrinsically disordered proteins investigated by all-atom multicanonical molecular dynamics.

    PubMed

    Higo, Junichi; Umezawa, Koji

    2014-01-01

    We introduce computational studies on intrinsically disordered proteins (IDPs). In particular, we present our multicanonical molecular dynamics (McMD) simulations of two IDP-partner systems: NRSF-mSin3 and pKID-KIX. McMD is an enhanced conformational sampling method useful for biomolecular systems. An IDP adopts a specific tertiary structure upon binding to its partner molecule, although it is unstructured in the unbound (i.e., free) state. This IDP-specific property is called "coupled folding and binding". The McMD simulation treats the biomolecules with an all-atom model immersed in an explicit solvent. In the initial configuration of the simulation, the IDP and its partner molecules are set far apart from each other, and the IDP conformation is disordered. The computationally obtained free-energy landscape for coupled folding and binding has shown that native- and non-native-complex clusters are distributed in a complicated manner across the conformational space. The all-atom simulations suggest that induced folding and population selection are intricately intertwined in coupled folding and binding. Further analyses have shown that the conformational fluctuations (dynamical flexibility) in the bound and unbound states are essentially important for characterizing IDP functioning.

  5. pH during non-synaptic epileptiform activity-computational simulations.

    PubMed

    Rodrigues, Antônio Márcio; Santos, Luiz Eduardo Canton; Covolan, Luciene; Hamani, Clement; de Almeida, Antônio-Carlos Guimarães

    2015-09-02

    The excitability of neuronal networks is strongly modulated by changes in pH. The origin of these changes, however, is still under debate. The high complexity of neural systems justifies the use of computational simulation to investigate mechanisms that are possibly involved. Simulated neuronal activity includes non-synaptic epileptiform events (NEA) induced in hippocampal slices perfused with high-K+ and zero-Ca2+ solutions, therefore in the absence of the synaptic circuitry. A network of functional units composes the NEA model. Each functional unit represents one interface of neuronal/extracellular-space/glial segments. Each interface contains transmembrane ionic transports, such as ionic channels, cotransporters, exchangers and pumps. Neuronal interconnections are mediated by gap junctions, electric field effects and extracellular ionic fluctuations modulated by extracellular electrodiffusion. The mechanisms investigated are those that change intracellular and extracellular ionic concentrations and are able to affect [H+]. Our simulations suggest that the intense fluctuations in intra- and extracellular concentrations of Na+, K+ and Cl- that accompany NEA are able to affect the combined action of the Na+/H+ exchanger (NHE), the HCO3-/Cl- exchanger (HCE), the H+ pump and the catalytic activity of intra- and extracellular carbonic anhydrase. Cellular volume changes and extracellular electrodiffusion are responsible for modulating pH.

  6. Kinetic Monte Carlo simulations for transient thermal fields: Computational methodology and application to the submicrosecond laser processes in implanted silicon.

    PubMed

    Fisicaro, G; Pelaz, L; Lopez, P; La Magna, A

    2012-09-01

    Pulsed laser irradiation of damaged solids promotes ultrafast nonequilibrium kinetics, on the submicrosecond scale, leading to microscopic modifications of the material state. Reliable theoretical predictions of this evolution can be achieved only by simulating particle interactions in the presence of large and transient gradients of the thermal field. We propose a kinetic Monte Carlo (KMC) method for the simulation of damaged systems in the extremely far-from-equilibrium conditions caused by laser irradiation. The reference systems are nonideal crystals containing point defect excesses, an order of magnitude larger than the equilibrium density, due to a preirradiation ion implantation process. The thermal problem, including eventual melting, is solved within the phase-field methodology, and the numerical solutions for the space- and time-dependent thermal field are then dynamically coupled to the KMC code. The formalism, implementation, and related tests of our computational code are discussed in detail. As an application example, we analyze the evolution of the defect system caused by P ion implantation in Si under nanosecond pulsed irradiation. The simulation results suggest a significant annihilation of the implantation damage, which can be well controlled by the laser fluence.
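
    The coupling described above can be sketched as a KMC step whose Arrhenius rates are evaluated against the transient temperature field returned by the phase-field solver. The attempt frequency, barrier, and hop length below are illustrative values, and temperature_at stands in for the interpolated phase-field solution.

        import math, random

        KB_EV = 8.617333262e-5     # Boltzmann constant, eV/K

        def kmc_step(defects, temperature_at, t, nu0=1e13, ea=1.5):
            # Arrhenius rates evaluated at the local, time-dependent temperature
            rates = [nu0 * math.exp(-ea / (KB_EV * temperature_at(d["x"], t)))
                     for d in defects]
            r_total = sum(rates)
            dt = -math.log(1.0 - random.random()) / r_total   # residence time
            target, acc = random.random() * r_total, 0.0
            for d, r in zip(defects, rates):                  # pick one event
                acc += r
                if acc >= target:
                    d["x"] += random.choice((-1.0, 1.0)) * 1e-9   # ~1 nm hop
                    break
            return t + dt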

  7. Investigating European genetic history through computer simulations.

    PubMed

    Currat, Mathias; Silva, Nuno M

    2013-01-01

    The genetic diversity of Europeans has been shaped by various evolutionary forces, including their demographic history. Genetic data can thus be used to draw inferences on the population history of Europe using appropriate statistical methods such as computer simulation, which constitutes a powerful tool to study complex models. Here, we focus on spatially explicit simulation, a method which takes population movements over space and time into account. We present its main principles and then describe a series of studies using this approach that we consider particularly significant in the context of European prehistory. All simulation studies agree that ancient demographic events played a significant role in the establishment of the European gene pool; but while earlier works support a major genetic input from the Near East during the Neolithic transition, the most recent ones positively re-evaluate the contribution of pre-Neolithic hunter-gatherers and suggest a possible impact of very ancient demographic events. This finding of substantial genetic continuity from pre-Neolithic times to the present challenges some recent studies analyzing ancient DNA. We discuss the possible reasons for this discrepancy and identify future lines of investigation in order to get a better understanding of European evolution.

  8. Creating, documenting and sharing network models.

    PubMed

    Crook, Sharon M; Bednar, James A; Berger, Sandra; Cannon, Robert; Davison, Andrew P; Djurfeldt, Mikael; Eppler, Jochen; Kriener, Birgit; Furber, Steve; Graham, Bruce; Plesser, Hans E; Schwabe, Lars; Smith, Leslie; Steuber, Volker; van Albada, Sacha

    2012-01-01

    As computational neuroscience matures, many simulation environments are available that are useful for neuronal network modeling. However, methods for successfully documenting models for publication and for exchanging models and model components among these projects are still under development. Here we briefly review existing software and applications for network model creation, documentation and exchange. Then we discuss a few of the larger issues facing the field of computational neuroscience regarding network modeling and suggest solutions to some of these problems, concentrating in particular on standardized network model terminology, notation, and descriptions and explicit documentation of model scaling. We hope this will enable and encourage computational neuroscientists to share their models more systematically in the future.

  9. Computational Fluid Dynamics Demonstration of Rigid Bodies in Motion

    NASA Technical Reports Server (NTRS)

    Camarena, Ernesto; Vu, Bruce T.

    2011-01-01

    The Design Analysis Branch (NE-M1) at the Kennedy Space Center has not had the ability to accurately couple Rigid Body Dynamics (RBD) and Computational Fluid Dynamics (CFD). OVERFLOW-D is a flow solver that has been developed by NASA to have the capability to analyze and simulate dynamic motions with up to six Degrees of Freedom (6-DOF). Two simulations were prepared over the course of the internship to demonstrate 6-DOF motion of rigid bodies under aerodynamic loading. The geometries in the simulations were based on a conceptual Space Launch System (SLS). The first simulation that was prepared and computed was the motion of a Solid Rocket Booster (SRB) as it separates from its core stage. To reduce computational time during the development of the simulation, only half of the physical domain with respect to the symmetry plane was simulated. Then a full solution was prepared and computed. The second simulation was a model of the SLS as it departs from a launch pad under a 20-knot crosswind. This simulation was reduced to two dimensions (2D) to reduce both preparation and computation time. By allowing 2-DOF for translations and 1-DOF for rotation, the simulation predicted unrealistic rotation. The simulation was then constrained to only allow translations.

  10. Characterizing rare-event property distributions via replicate molecular dynamics simulations of proteins.

    PubMed

    Krishnan, Ranjani; Walton, Emily B; Van Vliet, Krystyn J

    2009-11-01

    As computational resources increase, molecular dynamics simulations of biomolecules are becoming an increasingly informative complement to experimental studies. In particular, it has now become feasible to use multiple initial molecular configurations to generate an ensemble of replicate production-run simulations that allows for more complete characterization of rare events such as ligand-receptor unbinding. However, there are currently no explicit guidelines for selecting an ensemble of initial configurations for replicate simulations. Here, we use clustering analysis and steered molecular dynamics simulations to demonstrate that the configurational changes accessible in molecular dynamics simulations of biomolecules do not necessarily correlate with observed rare-event properties. This informs selection of a representative set of initial configurations. We also employ statistical analysis to identify the minimum number of replicate simulations required to sufficiently sample a given biomolecular property distribution. Together, these results suggest a general procedure for generating an ensemble of replicate simulations that will maximize accurate characterization of rare-event property distributions in biomolecules.

  11. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  12. Processing for spaceborne synthetic aperture radar imagery

    NASA Technical Reports Server (NTRS)

    Lybanon, M.

    1973-01-01

    The data handling and processing involved in using synthetic aperture radar as a satellite-borne earth resources remote sensor are considered. The discussion covers the nature of the problem, the theory, both conventional and potential advanced processing techniques, and a complete computer simulation. Digital processing is shown to be a real possibility, and some future directions for research are suggested.

  13. Simulating Single Word Processing in the Classic Aphasia Syndromes Based on the Wernicke-Lichtheim-Geschwind Theory

    ERIC Educational Resources Information Center

    Weems, Scott A.; Reggia, James A.

    2006-01-01

    The Wernicke-Lichtheim-Geschwind (WLG) theory of the neurobiological basis of language is of great historical importance, and it continues to exert a substantial influence on most contemporary theories of language in spite of its widely recognized limitations. Here, we suggest that neurobiologically grounded computational models based on the WLG…

  14. Structuring Assignments to Improve Understanding and Presentation Skills: Experiential Learning in the Capstone Strategic Management Team Presentation

    ERIC Educational Resources Information Center

    Helms, Marilyn M.; Whitesell, Melissa

    2017-01-01

    In the strategic management course, students select, analyze, and present viable future alternatives based on information provided in cases or computer simulations. Rather than understanding the entire process, the student's focus is on the final presentation. Chickering's (1977) research on active learning suggests students learn more effectively…

  15. Managing emergency department overcrowding via ambulance diversion: a discrete event simulation model.

    PubMed

    Lin, Chih-Hao; Kao, Chung-Yao; Huang, Chong-Ye

    2015-01-01

    Ambulance diversion (AD) is considered one of the possible solutions to relieve emergency department (ED) overcrowding. Study of the effectiveness of various AD strategies is a prerequisite for policy-making. Our aim is to develop a tool that quantitatively evaluates the effectiveness of various AD strategies. A simulation model and a computer simulation program were developed. Three sets of simulations were executed to evaluate AD initiating criteria, patient-blocking rules, and AD intervals, respectively. The crowdedness index, the patient waiting time for service, and the percentage of adverse patients were assessed to determine the effect of various AD policies. Simulation results suggest that, in a certain setting, the best timing for implementing AD is when the crowdedness index reaches the critical value, 1.0, an indicator that the ED is operating at its maximal capacity. The strategy of diverting all patients transported by ambulance is more effective than diverting either high-acuity patients only or low-acuity patients only. Given a total allowable AD duration, implementing AD multiple times with short intervals generally has a better effect than having a single AD with the maximal allowable duration. An input-throughput-output simulation model is proposed for simulating ED operation. The effectiveness of several AD strategies in relieving ED overcrowding was assessed via computer simulations based on this model. By appropriate parameter settings, the model can represent medical resource providers of different scales. It is also feasible to expand the simulations to evaluate the effect of AD strategies on a community basis. The results may offer insights for making effective AD policies.
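
    A minimal discrete-event sketch of such an input-throughput-output model, with the AD initiating criterion tied to the crowdedness index reaching 1.0, might look as follows; the arrival and service rates and the capacity are invented for illustration.

        import heapq, random

        def simulate(t_end=10_000.0, capacity=20,
                     arrival_rate=0.1, service_rate=0.006):
            t, occupancy, diverted, treated = 0.0, 0, 0, 0
            events = [(random.expovariate(arrival_rate), "arrival")]
            while events:
                t, kind = heapq.heappop(events)
                if t > t_end:
                    break
                if kind == "arrival":
                    heapq.heappush(events, (t + random.expovariate(arrival_rate),
                                            "arrival"))
                    if occupancy / capacity >= 1.0:   # AD initiating criterion
                        diverted += 1                 # ambulance is diverted
                    else:
                        occupancy += 1
                        heapq.heappush(events,
                                       (t + random.expovariate(service_rate),
                                        "departure"))
                else:
                    occupancy -= 1
                    treated += 1
            return treated, diverted

    Alternative policies (acuity-based blocking rules, fixed AD intervals) would replace the single occupancy test with the corresponding rule, which is essentially what the three simulation sets above vary.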

  16. Simulation Accelerator

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under a NASA SBIR (Small Business Innovative Research) contract (NAS5-30905), EAI Simulation Associates, Inc., developed a new digital simulation computer, Starlight™. With an architecture based on the analog model of computation, Starlight™ outperforms all other computers on a wide range of continuous system simulation. This system is used in a variety of applications, including aerospace, automotive, electric power and chemical reactors.

  17. Using Computational Simulations to Confront Students' Mental Models

    ERIC Educational Resources Information Center

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  18. The Role of Computer Simulation in an Inquiry-Based Learning Environment: Reconstructing Geological Events as Geologists

    ERIC Educational Resources Information Center

    Lin, Li-Fen; Hsu, Ying-Shao; Yeh, Yi-Fen

    2012-01-01

    Several researchers have investigated the effects of computer simulations on students' learning. However, few have focused on how simulations with authentic contexts influence students' inquiry skills. Therefore, for the purposes of this study, we developed a computer simulation (FossilSim) embedded in an authentic inquiry lesson. FossilSim…

  19. Computer-generated forces in distributed interactive simulation

    NASA Astrophysics Data System (ADS)

    Petty, Mikel D.

    1995-04-01

    Distributed Interactive Simulation (DIS) is an architecture for building large-scale simulation models from a set of independent simulator nodes communicating via a common network protocol. DIS is most often used to create a simulated battlefield for military training. Computer Generated Forces (CGF) systems control large numbers of autonomous battlefield entities in a DIS simulation using computer equipment and software rather than humans in simulators. CGF entities serve as both enemy forces and supplemental friendly forces in a DIS exercise. Research into various aspects of CGF systems is ongoing. Several CGF systems have been implemented.

  20. MEFA (multiepitope fusion antigen)-Novel Technology for Structural Vaccinology, Proof from Computational and Empirical Immunogenicity Characterization of an Enterotoxigenic Escherichia coli (ETEC) Adhesin MEFA

    PubMed Central

    Duan, Qiangde; Lee, Kuo Hao; Nandre, Rahul M; Garcia, Carolina; Chen, Jianhan; Zhang, Weiping

    2017-01-01

    Vaccine development often encounters the challenge of virulence heterogeneity. Enterotoxigenic Escherichia coli (ETEC) bacteria producing immunologically heterogeneous virulence factors are a leading cause of children's diarrhea and travelers' diarrhea. Currently, we do not have licensed vaccines against ETEC bacteria. While conventional methods continue to make progress, they encounter challenges; new computational and structure-based approaches are being explored to accelerate ETEC vaccine development. In this study, we applied a structural vaccinology concept to construct a structure-based multiepitope fusion antigen (MEFA) carrying representative epitopes of the seven most important ETEC adhesins [CFA/I, CFA/II (CS1–CS3), CFA/IV (CS4–CS6)], simulated the antigenic structure of the CFA/I/II/IV MEFA with computational atomistic modeling and simulation, characterized its immunogenicity in mouse immunization, and examined the potential of structure-informed vaccine design for ETEC vaccine development. A tag-less recombinant MEFA protein (CFA/I/II/IV MEFA) was effectively expressed and extracted. Molecular dynamics simulations indicated that this MEFA immunogen maintained a stable secondary structure and presented epitopes on the protein surface. Empirical data showed that mice immunized with the tag-less CFA/I/II/IV MEFA developed strong antigen-specific antibody responses, and mouse serum antibodies significantly inhibited in vitro adherence of bacteria expressing these seven adhesins. These results revealed congruence of antigen immunogenicity between computational simulation and empirical mouse immunization and indicated that this tag-less CFA/I/II/IV MEFA is potentially an antigen for a broadly protective ETEC vaccine, suggesting a potential application of MEFA-based structural vaccinology to vaccine design against ETEC and likely other pathogens. PMID:28944092

  1. A virtual surgical training system that simulates cutting of soft tissue using a modified pre-computed elastic model.

    PubMed

    Toe, Kyaw Kyar; Huang, Weimin; Yang, Tao; Duan, Yuping; Zhou, Jiayin; Su, Yi; Teo, Soo-Kng; Kumar, Selvaraj Senthil; Lim, Calvin Chi-Wan; Chui, Chee Kong; Chang, Stephen

    2015-08-01

    This work presents a surgical training system that incorporates cutting operations on soft tissue, simulated based on a modified pre-computed linear elastic model in the Simulation Open Framework Architecture (SOFA) environment. A pre-computed linear elastic model used for the simulation of soft-tissue deformation involves computing the compliance matrix a priori based on the topological information of the mesh. While this process may require from a few minutes to several hours, depending on the number of vertices in the mesh, it needs to be computed only once and allows real-time computation of the subsequent soft-tissue deformation. However, as the compliance matrix is based on the initial topology of the mesh, it does not allow any topological changes during simulation, such as cutting or tearing of the mesh. This work proposes a way to modify the pre-computed data by correcting the topological connectivity in the compliance matrix, without re-computing the compliance matrix, which is computationally expensive.
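
    The precomputed-compliance idea can be sketched as follows: deformation is a single matrix-vector product u = C·f, and the proposed topology correction is approximated here by zeroing the compliance couplings between the two vertices of a cut edge instead of rebuilding C. This is a deliberately simplified reading of the authors' correction, with a hypothetical 3-DOF-per-vertex layout assumed.

        import numpy as np

        def deform(compliance, forces):
            # real-time step: displacements from one matrix-vector product
            return compliance @ forces

        def cut_edge(compliance, i, j, dof=3):
            # sever the coupling between vertices i and j after a cut,
            # leaving the rest of the precomputed matrix untouched
            c = compliance.copy()
            si = slice(dof * i, dof * i + dof)
            sj = slice(dof * j, dof * j + dof)
            c[si, sj] = 0.0
            c[sj, si] = 0.0
            return c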

  2. Pathways for virus assembly around nucleic acids

    PubMed Central

    Perlmutter, Jason D; Perkett, Matthew R

    2014-01-01

    Understanding the pathways by which viral capsid proteins assemble around their genomes could identify key intermediates as potential drug targets. In this work we use computer simulations to characterize assembly over a wide range of capsid protein-protein interaction strengths and solution ionic strengths. We find that assembly pathways can be categorized into two classes, in which intermediates are either predominantly ordered or disordered. Our results suggest that estimating the protein-protein and the protein-genome binding affinities may be sufficient to predict which pathway occurs. Furthermore, the calculated phase diagrams suggest that knowledge of the dominant assembly pathway and its relationship to control parameters could identify optimal strategies to thwart or redirect assembly to block infection. Finally, analysis of simulation trajectories suggests that the two classes of assembly pathways can be distinguished in single molecule fluorescence correlation spectroscopy or bulk time resolved small angle x-ray scattering experiments. PMID:25036288

  3. Are false-positive rates leading to an overestimation of noise-induced hearing loss?

    PubMed

    Schlauch, Robert S; Carney, Edward

    2011-04-01

    To estimate false-positive rates for rules proposed to identify early noise-induced hearing loss (NIHL) using the presence of notches in audiograms. Audiograms collected from school-age children in a national survey of health and nutrition (the Third National Health and Nutrition Examination Survey [NHANES III]; National Center for Health Statistics, 1994) were examined using published rules for identifying noise notches at various pass-fail criteria. These results were compared with computer-simulated "flat" audiograms. The proportion of these identified as having a noise notch is an estimate of the false-positive rate for a particular rule. Audiograms from the NHANES III for children 6-11 years of age yielded notched audiograms at rates consistent with simulations, suggesting that this group does not have significant NIHL. Further, pass-fail criteria for rules suggested by expert clinicians, applied to NHANES III audiometric data, yielded unacceptably high false-positive rates. Computer simulations provide an effective method for estimating false-positive rates for protocols used to identify notched audiograms. Audiometric precision could possibly be improved by (a) eliminating systematic calibration errors, including a possible problem with reference levels for TDH-style earphones; (b) repeating and averaging threshold measurements; and (c) using earphones that yield lower variability for 6.0 and 8.0 kHz, two frequencies critical for identifying noise notches.
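
    The simulation logic described above is easy to reproduce in outline: draw "flat" audiograms whose only variation is test-retest measurement error, apply a notch rule, and count the hits. The 10-dB criterion below is a hypothetical stand-in for the published rules, and the 5-dB test-retest standard deviation is illustrative.

        import numpy as np

        FREQS = [0.5, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0]   # kHz

        def false_positive_rate(n=100_000, sd=5.0,
                                rng=np.random.default_rng(0)):
            # true hearing is flat (0 dB HL); only measurement noise varies
            audiograms = rng.normal(0.0, sd, size=(n, len(FREQS)))
            # hypothetical notch rule: 4 kHz at least 10 dB worse than
            # both 2 kHz and 8 kHz
            notch = ((audiograms[:, 4] - audiograms[:, 2] >= 10.0) &
                     (audiograms[:, 4] - audiograms[:, 6] >= 10.0))
            return notch.mean()

    Any audiogram this procedure flags is by construction a false positive, so the flagged fraction directly estimates the rule's false-positive rate.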

  4. The Apparent Critical Decay Index at the Onset of Solar Prominence Eruptions

    NASA Astrophysics Data System (ADS)

    Zuccarello, F. P.; Aulanier, G.; Gilchrist, S. A.

    2016-04-01

    A magnetic flux rope (MFR) embedded in a line-tied external magnetic field that decreases with height as z^(-n) is unstable to perturbations if the decay index of the field n is larger than a critical value. The onset of this instability, called torus instability, is one of the main mechanisms that can initiate coronal mass ejections. Since flux ropes often possess magnetic dips that can support prominence plasma, this is also a valuable mechanism to trigger prominence eruptions. Magnetohydrodynamic (MHD) simulations of the formation and/or emergence of MFRs suggest a critical value for the onset of the instability in the range [1.4-2]. However, detailed observations of prominences suggest a value in the range [0.9-1.1]. In this Letter, by using a set of MHD simulations, we show why the large discrepancy between models and observations is only apparent. Our simulations indeed show that the critical decay index at the onset of the eruption is n = 1.4 ± 0.1 when computed at the apex of the flux rope axis, while it is n = 1.1 ± 0.1 when it is computed at the altitude of the topmost part of the distribution of magnetic dips. The discrepancy only arises because weakly twisted curved flux ropes do not have dips up to the altitude of their axis.
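
    For reference, the decay index used above is n(z) = -d ln B / d ln z of the external field; a numerical sketch on sampled field values:

        import numpy as np

        def decay_index(z, b):
            """n(z) from sampled field strength b(z) along height z (both > 0)."""
            return -np.gradient(np.log(b), np.log(z))

        z = np.linspace(1.0, 10.0, 100)
        n = decay_index(z, z ** -1.5)   # a pure power law B ~ z^-1.5 gives n = 1.5

    Evaluating n at the flux rope apex versus at the topmost magnetic dips is exactly what separates the two critical values quoted above.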

  5. Evolutionary Games of Multiplayer Cooperation on Graphs

    PubMed Central

    Arranz, Jordi; Traulsen, Arne

    2016-01-01

    There has been much interest in studying evolutionary games in structured populations, often modeled as graphs. However, most analytical results so far have only been obtained for two-player or linear games, while the study of more complex multiplayer games has usually been tackled by computer simulations. Here we investigate evolutionary multiplayer games on graphs updated with a Moran death-Birth process. For cycles, we obtain an exact analytical condition for cooperation to be favored by natural selection, given in terms of the payoffs of the game and a set of structure coefficients. For regular graphs of degree three and larger, we estimate this condition using a combination of pair approximation and diffusion approximation. For a large class of cooperation games, our approximations suggest that graph-structured populations are stronger promoters of cooperation than populations lacking spatial structure. Computer simulations validate our analytical approximations for random regular graphs and cycles, but show systematic differences for graphs with many loops such as lattices. In particular, our simulation results show that these kinds of graphs can even lead to more stringent conditions for the evolution of cooperation than well-mixed populations. Overall, we provide evidence suggesting that the complexity arising from many-player interactions and spatial structure can be captured by pair approximation in the case of random graphs, but that it needs to be handled with care for graphs with high clustering. PMID:27513946

  6. All Roads Lead to Computing: Making, Participatory Simulations, and Social Computing as Pathways to Computer Science

    ERIC Educational Resources Information Center

    Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri

    2017-01-01

    Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…

  7. Computing the Rotational Diffusion of Biomolecules via Molecular Dynamics Simulation and Quaternion Orientations.

    PubMed

    Chen, Po-Chia; Hologne, Maggy; Walker, Olivier

    2017-03-02

    Rotational diffusion (Drot) is a fundamental property of biomolecules that contains information about molecular dimensions and solute-solvent interactions. While ab initio Drot prediction can be achieved by explicit all-atom molecular dynamics simulations, this is hindered by both computational expense and limitations in water models. We propose coarse-grained force fields as a complementary solution, and show that the MARTINI force field with elastic networks is sufficient to compute Drot in >10 proteins spanning 5-157 kDa. We also adopt a quaternion-based approach that computes Drot orientation directly from autocorrelations of best-fit rotations as used in, e.g., RMSD algorithms. Over 2 μs trajectories, isotropic MARTINI+EN tumbling replicates experimental values to within 10-20%, with convergence analyses suggesting a minimum sampling of >50 × τ_theor to achieve sufficient precision. Transient fluctuations in anisotropic tumbling cause decreased precision in predictions of axisymmetric anisotropy and rhombicity, the latter of which cannot be precisely evaluated within 2000 × τ_theor for GB3. Thus, we encourage reporting of axial decompositions D_x, D_y, D_z to ease comparability between experiment and simulation. Where protein disorder is absent, we observe close replication of MARTINI+EN Drot orientations versus CHARMM22*/TIP3P and experimental data. This work anticipates the ab initio prediction of NMR relaxation by combining coarse-grained global motions with all-atom local motions.
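
    The quaternion approach described above can be outlined as follows: best-fit rotations from an RMSD alignment carry a body-fixed axis through the trajectory, and for isotropic tumbling the P2 autocorrelation of that axis decays as exp(-6·Drot·t). This is a generic reconstruction assuming scalar-last quaternions, not the authors' code.

        import numpy as np
        from scipy.spatial.transform import Rotation

        def isotropic_drot(quats, dt, max_lag=200):
            """Drot from a series of best-fit rotations (scalar-last quaternions)
            sampled every dt, assuming isotropic tumbling."""
            axis = np.array([0.0, 0.0, 1.0])
            u = Rotation.from_quat(quats).apply(axis)    # body axis per frame
            lags = np.arange(1, max_lag)
            # P2 autocorrelation: P2(x) = 1.5 x^2 - 0.5 for unit-vector dots
            c = np.array([np.mean(1.5 * np.einsum("ij,ij->i",
                                                  u[:-k], u[k:]) ** 2 - 0.5)
                          for k in lags])
            # isotropic tumbling: C(t) = exp(-6 D t); fit the log-decay
            slope = np.polyfit(lags * dt, np.log(np.clip(c, 1e-8, None)), 1)[0]
            return -slope / 6.0

    Anisotropic tumbling would require tracking all three body axes and fitting the decomposition D_x, D_y, D_z recommended above, rather than a single exponential.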

  8. A Comparative Study of Simulated and Measured Main Landing Gear Noise for Large Civil Transports

    NASA Technical Reports Server (NTRS)

    Konig, Benedikt; Fares, Ehab; Ravetta, Patricio; Khorrami, Mehdi R.

    2017-01-01

    Computational results for the NASA 26%-scale model of a six-wheel main landing gear with and without a toboggan-shaped noise reduction fairing are presented. The model is a high-fidelity representation of a Boeing 777-200 aircraft main landing gear. A lattice Boltzmann method was used to simulate the unsteady flow around the model in isolation. The computations were conducted in free-air at a Mach number of 0.17, matching a recent acoustic test of the same gear model in the Virginia Tech Stability Wind Tunnel in its anechoic configuration. Results obtained on a set of grids with successively finer spatial resolution demonstrate the challenge in resolving/capturing the flow field for the smaller components of the gear and their associated interactions, and the resulting effects on the high-frequency segment of the farfield noise spectrum. Farfield noise spectra were computed based on an FWH integral approach, with simulated pressures on the model solid surfaces or flow-field data extracted on a set of permeable surfaces enclosing the model as input. Comparison of these spectra with microphone array measurements obtained in the tunnel indicated that, for the present complex gear model, the permeable surfaces provide a more accurate representation of farfield noise, suggesting that volumetric effects are not negligible. The present study also demonstrates that good agreement between simulated and measured farfield noise can be achieved if consistent post-processing is applied to both physical and synthetic pressure records at array microphone locations.

  9. An Intelligent Computer-aided Training System (CAT) for Diagnosing Adult Illiterates: Integrating NASA Technology into Workplace Literacy

    NASA Technical Reports Server (NTRS)

    Yaden, David B., Jr.

    1991-01-01

    An important part of NASA's mission involves the secondary application of its technologies in the public and private sectors. One current application being developed is The Adult Literacy Evaluator, a simulation-based diagnostic tool designed to assess the operant literacy abilities of adults having difficulties in learning to read and write. Using Intelligent Computer-Aided Training (ICAT) system technology in addition to speech recognition, closed-captioned television (CCTV), live video and other state-of-the-art graphics and storage capabilities, this project attempts to overcome the negative effects of adult literacy assessment by allowing the client to interact with an intelligent computer system which simulates real-life literacy activities and materials and which measures literacy performance in the actual context of its use. The specific objectives of the project are as follows: (1) to develop a simulation-based diagnostic tool to assess adults' prior knowledge about reading and writing processes in actual contexts of application; (2) to provide a profile of readers' strengths and weaknesses; and (3) to suggest instructional strategies and materials which can be used as a beginning point for remediation. In the first, developmental phase of the project, descriptions of literacy events and environments are being written and functional literacy documents analyzed for their components. From these descriptions, scripts are being generated which define the interaction between the student, an on-screen guide and the simulated literacy environment.

  10. Effect of bulk modulus on deformation of the brain under rotational accelerations

    NASA Astrophysics Data System (ADS)

    Ganpule, S.; Daphalapurkar, N. P.; Cetingul, M. P.; Ramesh, K. T.

    2018-01-01

    Traumatic brain injury, such as that developed as a consequence of blast, is a complex injury with a broad range of symptoms and disabilities. Computational models of brain biomechanics hold promise for illuminating the mechanics of traumatic brain injury and for developing preventive devices. However, reliable material parameters are needed for models to be predictive. Unfortunately, the properties of human brain tissue are difficult to measure, and the bulk modulus of brain tissue in particular is not well characterized. Thus, a wide range of bulk modulus values are used in computational models of brain biomechanics, spanning up to three orders of magnitude in the differences between values. However, the sensitivity of computational predictions to these variations is not known. In this work, we study the sensitivity of a 3D computational human head model to various bulk modulus values. A subject-specific human head model was constructed from T1-weighted MRI images at 2-mm³ voxel resolution. Diffusion tensor imaging provided data on the spatial distribution and orientation of axonal fiber bundles for modeling white matter anisotropy. Non-injurious, full-field brain deformations in a human volunteer were used to assess the simulated predictions. The comparison suggests that a bulk modulus value on the order of GPa gives the best agreement with experimentally measured in vivo deformations in the human brain. Further, simulations of injurious loading suggest that bulk modulus values on the order of GPa provide the closest match with the clinical findings in terms of predicted injured regions and extent of injury.

  11. Towards a comprehensive framework for cosimulation of dynamic models with an emphasis on time stepping

    NASA Astrophysics Data System (ADS)

    Hoepfer, Matthias

    Over the last two decades, computer modeling and simulation have evolved as the tools of choice for the design and engineering of dynamic systems. With increased system complexities, modeling and simulation become essential enablers for the design of new systems. Some of the advantages that modeling and simulation-based system design allows for are the replacement of physical tests to ensure product performance, reliability and quality, the shortening of design cycles due to the reduced need for physical prototyping, the design for mission scenarios, the invoking of currently nonexisting technologies, and the reduction of technological and financial risks. Traditionally, dynamic systems are modeled in a monolithic way. Such monolithic models include all the data, relations and equations necessary to represent the underlying system. With increased complexity of these models, the monolithic model approach reaches certain limits regarding, for example, model handling and maintenance. Furthermore, while the available computer power has been steadily increasing according to Moore's Law (roughly a doubling in computational power every two years), the ever-increasing complexities of new models have negated the increased resources available. Lastly, modern systems and design processes are interdisciplinary, necessitating more flexible models that can incorporate different modeling and design approaches. A way to bypass the shortcomings of monolithic models is co-simulation. In a very general sense, co-simulation addresses the issue of linking together different dynamic sub-models into a model which represents the overall, integrated dynamic system. It is therefore an important enabler for the design of interdisciplinary, interconnected, highly complex dynamic systems. While a basic co-simulation setup can be very easy, complications can arise when sub-models display behaviors such as algebraic loops, singularities, or constraints. This work frames the co-simulation approach to modeling and simulation. It lays out the general approach to dynamic system co-simulation, and gives a comprehensive overview of what co-simulation is and what it is not. It creates a taxonomy of the requirements and limits of co-simulation, and of the issues arising when co-simulating sub-models. Possible solutions towards resolving the stated problems are investigated in some depth. A particular focus is given to the issue of time stepping. It will be shown that for dynamic models, the selection of the simulation time step is a crucial issue with respect to computational expense, simulation accuracy, and error control. The reasons for this are discussed in depth, and a time stepping algorithm for co-simulation with unknown dynamic sub-models is proposed. Motivations and suggestions for the further treatment of selected issues are presented.
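
    A common shape for such a co-simulation loop with adaptive macro-stepping is sketched below: two black-box sub-models exchange coupling variables once per macro step, and the step size is controlled by a standard step-doubling error estimate. The sub-model interface (state/step methods) is hypothetical, not the dissertation's algorithm.

        def cosimulate(model_a, model_b, t_end, h=0.01, tol=1e-4):
            t, ya, yb = 0.0, model_a.state(), model_b.state()
            while t < t_end:
                # step-doubling error estimate: one full step vs two half steps
                full_a = model_a.step(ya, yb, h)
                half_a = model_a.step(model_a.step(ya, yb, h / 2), yb, h / 2)
                err = abs(full_a - half_a)
                if err > tol:
                    h *= 0.5            # reject the macro step and refine
                    continue
                ya = half_a             # accept the finer result
                yb = model_b.step(yb, ya, h)   # exchange coupling data
                t += h
                if err < tol / 4:
                    h *= 2.0            # error comfortably small: relax the step
            return ya, yb

    Holding the coupling variables constant within a macro step is exactly where co-simulation error accumulates, which is why the step-size control above matters for accuracy and computational expense alike.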

  12. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    ERIC Educational Resources Information Center

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  13. Large eddy simulation in a rotary blood pump: Viscous shear stress computation and comparison with unsteady Reynolds-averaged Navier-Stokes simulation.

    PubMed

    Torner, Benjamin; Konnigk, Lucas; Hallier, Sebastian; Kumar, Jitendra; Witte, Matthias; Wurm, Frank-Hendrik

    2018-06-01

    Numerical flow analysis (computational fluid dynamics) in combination with the prediction of blood damage is an important procedure for investigating the hemocompatibility of a blood pump, since blood trauma due to shear stresses remains a problem in these devices. Today, numerical damage prediction is conducted using unsteady Reynolds-averaged Navier-Stokes simulations. Investigations with large eddy simulations are rarely performed for blood pumps. Hence, the aim of this study is to examine the viscous shear stresses of a large eddy simulation in a blood pump and compare the results with an unsteady Reynolds-averaged Navier-Stokes simulation. The simulations were carried out at two operating points of a blood pump. The flow was simulated on a 100M-element mesh for the large eddy simulation and a 20M-element mesh for the unsteady Reynolds-averaged Navier-Stokes simulation. As a first step, the large eddy simulation was verified by analyzing internal dissipative losses within the pump. Then, the pump characteristics and the mean and turbulent viscous shear stresses were compared between the two simulation methods. The verification showed that the large eddy simulation is able to reproduce the significant portion of dissipative losses, which is a global indication that the equivalent viscous shear stresses are adequately resolved. The comparison with the unsteady Reynolds-averaged Navier-Stokes simulation revealed that the hydraulic parameters were in agreement, but differences in the shear stresses were found. The results show the potential of the large eddy simulation as a high-quality comparative case to check the suitability of a chosen Reynolds-averaged Navier-Stokes setup and turbulence model. Furthermore, the results suggest that large eddy simulations are superior to unsteady Reynolds-averaged Navier-Stokes simulations when instantaneous stresses are applied for blood damage prediction.

  14. Aerothermal Analysis of the Project Fire II Afterbody Flow

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; Loomis, Mark; Papadopoulos, Periklis; Arnold, James O. (Technical Monitor)

    2001-01-01

    Computational fluid dynamics (CFD) is used to simulate the wake flow and afterbody heating of the Project Fire II ballistic reentry to Earth at 11.4 km/sec. Laminar results are obtained over a portion of the trajectory between the initial heat pulse and peak afterbody heating. Although non-catalytic forebody convective heating results are in excellent agreement with previous computations, initial predictions of afterbody heating were about a factor of two below the experimental values. Further analysis suggests that significant catalysis may be occurring on the afterbody heat shield. Computations including finite-rate catalysis on the afterbody surface are in good agreement with the data over the early portion of the trajectory, but are conservative near the peak afterbody heating point, especially on the rear portion of the conical frustum. Further analysis of the flight data from Fire II shows that peak afterbody heating occurs before peak forebody heating, a result that contradicts computations and flight data from other entry vehicles. This result suggests that another mechanism, possibly pyrolysis, may be occurring during the later portion of the trajectory, resulting in less total heat transfer than the current predictions.

  15. Encapsulating model complexity and landscape-scale analyses of state-and-transition simulation models: an application of ecoinformatics and juniper encroachment in sagebrush steppe ecosystems

    USGS Publications Warehouse

    O'Donnell, Michael

    2015-01-01

    State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need to model larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel execution of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) showed a decrease in computing time of approximately 96.6%. With a single multicore compute node (bottom result), computing time decreased by 81.8% relative to serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
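
    As a concrete illustration of the embarrassingly parallel structure exploited above, the sketch below distributes independent Monte Carlo realizations of a toy two-state (sagebrush/juniper) transition model across local cores. The model, its transition probabilities, and the function names are hypothetical stand-ins for a SyncroSim run; in a high-throughput setting, a distributed scheduler would replace the local process pool.

        import random
        from concurrent.futures import ProcessPoolExecutor

        def one_realization(seed, n_steps=100):
            """One independent Monte Carlo realization of a toy
            state-and-transition model (hypothetical probabilities)."""
            rng = random.Random(seed)
            state = "sagebrush"
            for _ in range(n_steps):
                if state == "sagebrush" and rng.random() < 0.02:
                    state = "juniper"        # encroachment
                elif state == "juniper" and rng.random() < 0.01:
                    state = "sagebrush"      # disturbance (e.g., fire) resets the cell
            return state

        if __name__ == "__main__":
            # Each realization is independent, so they map cleanly onto
            # separate processes (or, at larger scale, separate nodes).
            with ProcessPoolExecutor() as pool:
                final_states = list(pool.map(one_realization, range(1000)))
            print(sum(s == "juniper" for s in final_states) / len(final_states))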

  16. Methods for Computationally Efficient Structured CFD Simulations of Complex Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    Herrick, Gregory P.; Chen, Jen-Ping

    2012-01-01

    This research presents more efficient computational methods by which to perform multi-block structured Computational Fluid Dynamics (CFD) simulations of turbomachinery, thus facilitating higher-fidelity solutions of complicated geometries and their associated flows. This computational framework offers flexibility in allocating resources to balance process count and wall-clock computation time, while facilitating research interests of simulating axial compressor stall inception with more complete gridding of the flow passages and rotor tip clearance regions than is typically practiced with structured codes. The paradigm presented herein facilitates CFD simulation of previously impractical geometries and flows. These methods are validated and demonstrate improved computational efficiency when applied to complicated geometries and flows.

  17. The effectiveness of using multimedia computer simulations coupled with social constructivist pedagogy in a college introductory physics classroom

    NASA Astrophysics Data System (ADS)

    Chou, Chiu-Hsiang

    Electricity and magnetism (E & M) is legendarily considered a subject incomprehensible to students at the introductory college level. From a social constructivist perspective, learners are encouraged to assess the quantity and quality of their prior knowledge in a subject domain and to co-construct shared knowledge and understanding by implementing and building on each other's ideas. Challenged by new data and perspectives, they are stimulated to reconceptualize their knowledge and to engage actively in discovering new meanings based on experiences grounded in the real-world phenomena they are expected to learn. This process characterizes a conceptual change learning environment and can facilitate the learning of E & M. Computer simulations are an excellent tool to assist the teacher and learner in achieving these goals and were used in this study. This study examined the effectiveness of computer simulations within a conceptual change learning environment and compared it to more lecture-centered, traditional ways of teaching E & M. An experimental and a control group were compared, and statistical analyses were done with ANOVA (F-test). The results indicated that the treatment group significantly outperformed the control group on the achievement test, F(1,54) = 12.34, p < .05, and that the treatment group had a higher rate of improvement than the control group on two subscales: Isolation of Variables and Abstract Transformation. The results from the Maryland Physics Expectations Survey (MPEX) showed that the treatment students became more field independent and more aware of the fundamental role played by physics concepts in complex problem solving. The protocol analysis of structured interviews revealed that students in the treatment group tended to visualize a problem from different aspects and articulated their thinking in a more scientific manner. Responses to the instructional evaluation questionnaire indicated overwhelmingly positive ratings of the appropriateness and instructional effectiveness of computer simulation instruction. In conclusion, the computer simulation instruction (CSI) developed and evaluated in this study provided opportunities for students to refine their preconceptions and to practice using new understandings, and it suggests substantial promise for computer simulation in the classroom environment.

  18. Simulation Framework for Intelligent Transportation Systems

    DOT National Transportation Integrated Search

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System. The simulator is designed for running on parallel computers and distributed (networked) computer systems, but ca...

  19. Thermodynamic and transport properties of nitrogen fluid: Molecular theory and computer simulations

    NASA Astrophysics Data System (ADS)

    Eskandari Nasrabad, A.; Laghaei, R.

    2018-04-01

    Computer simulations and various theories are applied to compute the thermodynamic and transport properties of nitrogen fluid. To model the nitrogen interaction, an existing potential in the literature is modified to obtain a close agreement between the simulation results and experimental data for the orthobaric densities. We use the Generic van der Waals theory to calculate the mean free volume and apply the results within the modified Cohen-Turnbull relation to obtain the self-diffusion coefficient. Compared to experimental data, excellent results are obtained via computer simulations for the orthobaric densities, the vapor pressure, the equation of state, and the shear viscosity. We analyze the results of the theory and computer simulations for the various thermophysical properties.
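
    For reference, the standard Cohen-Turnbull free-volume expression relates the self-diffusion coefficient to the mean free volume, as written below in LaTeX. The exact form of the modified relation used in the study is not given in this abstract, so this is the textbook version, with \bar{v}_f the mean free volume per molecule (supplied here by the Generic van der Waals theory), v^* a critical volume for diffusion, \gamma an overlap factor, and A a system-dependent prefactor.

        D = A \exp\!\left( - \frac{\gamma\, v^{*}}{\bar{v}_{f}} \right)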

  20. A mechanistic pan-cancer pathway model informed by multi-omics data interprets stochastic cell fate responses to drugs and mitogens

    PubMed Central

    Bouhaddou, Mehdi; Koch, Rick J.; DiStefano, Matthew S.; Tan, Annie L.; Mertz, Alex E.

    2018-01-01

    Most cancer cells harbor multiple drivers whose epistasis and interactions with expression context cloud drug and drug-combination sensitivity prediction. We constructed a mechanistic computational model that is context-tailored by omics data to capture regulation of stochastic proliferation and death by pan-cancer driver pathways. Simulations and experiments explore how the coordinated dynamics of RAF/MEK/ERK and PI-3K/AKT kinase activities in response to synergistic mitogen or drug combinations control cell fate in a specific cellular context. In this MCF10A cell context, simulations suggest that synergistic ERK and AKT inhibitor-induced death is likely mediated by BIM rather than BAD, which is supported by prior experimental studies. AKT dynamics explain S-phase entry synergy between EGF and insulin, but simulations suggest that stochastic ERK, not AKT, dynamics drive cell-to-cell proliferation variability, which in simulations is predictable from pre-stimulus fluctuations in C-Raf/B-Raf levels. Simulations suggest MEK alteration negligibly influences transformation, consistent with clinical data. Tailoring the model to an alternate cell expression and mutation context, a glioma cell line, allows prediction of increased sensitivity of cell death to AKT inhibition. Our model mechanistically interprets context-specific landscapes between driver pathways and cell fates, providing a framework for designing more rational cancer combination therapy. PMID:29579036

  1. Development of an Output-based Adaptive Method for Multi-Dimensional Euler and Navier-Stokes Simulations

    NASA Technical Reports Server (NTRS)

    Darmofal, David L.

    2003-01-01

    The use of computational simulations in the prediction of complex aerodynamic flows is becoming increasingly prevalent in the design process within the aerospace industry. Continuing advancements in both computing technology and algorithmic development are ultimately leading to attempts at simulating ever-larger, more complex problems. However, by increasing the reliance on computational simulations in the design cycle, we must also increase the accuracy of these simulations in order to maintain or improve the reliability and safety of the resulting aircraft. At the same time, large-scale computational simulations must be made more affordable so that their potential benefits can be fully realized within the design cycle. Thus, a continuing need exists for increasing the accuracy and efficiency of computational algorithms such that computational fluid dynamics can become a viable tool in the design of more reliable, safer aircraft. The objective of this research was the development of an error estimation and grid adaptation strategy for reducing simulation errors in integral outputs (functionals), such as lift or drag, from multi-dimensional Euler and Navier-Stokes simulations. In this final report, we summarize our work during this grant.

  2. Dynamics of the EAG1 K+ channel selectivity filter assessed by molecular dynamics simulations.

    PubMed

    Bernsteiner, Harald; Bründl, Michael; Stary-Weinzinger, Anna

    2017-02-26

    EAG1 channels belong to the KCNH family of voltage-gated potassium channels. They are expressed in several brain regions, and increased expression is linked to certain cancer types. Recent cryo-EM structure determination finally revealed the structure of these channels in atomic detail, allowing computational investigations. In this study, we performed molecular dynamics simulations to investigate the ion binding sites and the dynamical behavior of the selectivity filter. Our simulations suggest that sites S2 and S4 form stable ion binding sites, while ions placed at sites S1 and S3 rapidly switched to sites S2 and S4. Further, ions tended to dissociate away from S0 within less than 20 ns, due to increased filter flexibility. This was followed by water influx from the extracellular side, leading to a widening of the filter in this region and likely non-conductive filter configurations. Simulations with the inactivation-enhancing mutant Y464A or with Na+ ions led to trapped water molecules behind the selectivity filter, suggesting that these simulations captured early conformational changes linked to C-type inactivation. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Falls Risk and Simulated Driving Performance in Older Adults

    PubMed Central

    Gaspar, John G.; Neider, Mark B.; Kramer, Arthur F.

    2013-01-01

    Declines in executive function and dual-task performance have been related to falls in older adults, and recent research suggests that older adults at risk for falls also show impairments on real-world tasks, such as crossing a street. The present study examined whether falls risk was associated with driving performance in a high-fidelity simulator. Participants were classified as high or low falls risk using the Physiological Profile Assessment and completed a number of challenging simulated driving assessments in which they responded quickly to unexpected events. High falls risk drivers had slower response times (~2.1 seconds) to unexpected events compared to low falls risk drivers (~1.7 seconds). Furthermore, when asked to perform a concurrent cognitive task while driving, high falls risk drivers showed greater costs to secondary task performance than did low falls risk drivers, and low falls risk older adults also outperformed high falls risk older adults on a computer-based measure of dual-task performance. Our results suggest that attentional differences between high and low falls risk older adults extend to simulated driving performance. PMID:23509627

  4. Simulating Laboratory Procedures.

    ERIC Educational Resources Information Center

    Baker, J. E.; And Others

    1986-01-01

    Describes the use of computer assisted instruction in a medical microbiology course. Presents examples of how computer assisted instruction can present case histories in which the laboratory procedures are simulated. Discusses an authoring system used to prepare computer simulations and provides one example of a case history dealing with fractured…

  5. Inflight IFR procedures simulator

    NASA Technical Reports Server (NTRS)

    Parker, L. C. (Inventor)

    1984-01-01

    An inflight IFR procedures simulator for generating signals and commands to the conventional instruments provided in an airplane is described. The simulator includes a signal synthesizer which, upon being activated, generates predetermined simulated signals corresponding to signals normally received from remote sources. A computer is connected to the signal synthesizer and causes it to produce simulated signals responsive to programs fed into the computer. A switching network is connected to the signal synthesizer, the antenna of the aircraft, and the navigational instruments and communication devices, for selectively connecting the instruments and devices to the synthesizer and disconnecting the antenna from them. Pressure transducers are connected to the altimeter and speed indicator to supply electrical signals to the computer indicating the altitude and speed of the aircraft. A compass is connected to supply electrical signals to the computer indicating the heading of the airplane. The computer, upon receiving signals from the pressure transducers and compass, computes the signals that are fed to the signal synthesizer, which, in turn, generates the simulated navigational signals.

  6. Experimental and computational investigation of lateral gauge response in polycarbonate

    NASA Astrophysics Data System (ADS)

    Eliot, Jim; Harris, Ernest Joseph; Hazell, Paul; Appleby-Thomas, Gareth James; Winter, Ron; Wood, David Christopher

    2012-03-01

    The shock behaviour of polycarbonate is of interest due to its extensive use in defence applications. Interestingly, lateral manganin stress gauges embedded in polycarbonate have shown gradients behind incident shocks, suggestive of increasing shear strength. However, such gauges are commonly embedded in a central epoxy interlayer, an inherently invasive approach. Recently, research has suggested that in such systems the interlayer/target impedance may contribute to the observed gradients in lateral stress. Here, experimental T-gauge (Vishay Micro-Measurements® type J2M-SS-580SF-025) traces from polycarbonate targets are compared to computational simulations. The effects of the gauge environment are investigated by examining the response of lateral gauges with both the standard "glued-joint" encapsulation and a "dry-joint" encapsulation, in which no encapsulating medium is employed.

  7. Computational structural mechanics engine structures computational simulator

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1989-01-01

    The Computational Structural Mechanics (CSM) program at Lewis encompasses: (1) fundamental aspects for formulating and solving structural mechanics problems, and (2) development of integrated software systems to computationally simulate the performance/durability/life of engine structures.

  8. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing

    PubMed Central

    Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui

    2017-01-01

    Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, making raw data simulation a data-intensive and computing-intensive problem. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computation and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed to handle the irregular parallel accumulation in raw data simulation, which otherwise greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward, addressing the programming model, the HDFS configuration, and scheduling. The experimental results show that the cloud computing based algorithm achieves a 4× speedup over the baseline serial approach in an 8-node cloud environment, and that each optimization strategy contributes an improvement of about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing-intensive and data-intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration. PMID:28075343
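
    The irregular accumulation mentioned above can be pictured as a map step that emits (cell, contribution) pairs and a reduce step that sums whatever lands in the same raw-data cell. The toy point-scatterer signal model and all names below are hypothetical illustrations only; in the paper's setting, the two steps would run as Hadoop MapReduce jobs over HDFS rather than in-process.

        import cmath
        from collections import defaultdict

        WAVELENGTH = 0.03  # assumed X-band wavelength in meters (illustrative)

        def map_scatterer(scatterer, pulses):
            """Map: emit ((pulse, range_bin), complex echo) pairs for one
            point scatterer; pulses are platform (x, y) positions."""
            x, y, reflectivity = scatterer
            for p, (px, py) in enumerate(pulses):
                r = ((x - px) ** 2 + (y - py) ** 2) ** 0.5
                phase = -4.0 * cmath.pi * r / WAVELENGTH
                yield (p, int(r)), reflectivity * cmath.exp(1j * phase)

        def reduce_raw_data(scatterers, pulses):
            """Reduce: sum all contributions landing in the same cell.
            Different scatterers hit the same cells irregularly, which is
            the accumulation pattern the MapReduce model absorbs."""
            raw = defaultdict(complex)
            for s in scatterers:
                for key, value in map_scatterer(s, pulses):
                    raw[key] += value
            return raw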

  9. Computational modelling of cellular level metabolism

    NASA Astrophysics Data System (ADS)

    Calvetti, D.; Heino, J.; Somersalo, E.

    2008-07-01

    The steady and stationary state inverse problems consist of estimating the reaction and transport fluxes, blood concentrations, and possibly the rates of change of some of the concentrations based on data that are often scarce, noisy, and sampled over a population. The Bayesian framework provides a natural setting for the solution of this inverse problem, because a priori knowledge about the system itself and the unknown reaction fluxes and transport rates can compensate for the insufficiency of measured data, provided that the computational costs do not become prohibitive. This article identifies the computational challenges that have to be met when analyzing the steady and stationary states of a multicompartment model for cellular metabolism and suggests stable and efficient ways to handle the computations. The outline of a computational tool based on the Bayesian paradigm for the simulation and analysis of complex cellular metabolic systems is also presented.

  10. How Effective Is Instructional Support for Learning with Computer Simulations?

    ERIC Educational Resources Information Center

    Eckhardt, Marc; Urhahne, Detlef; Conrad, Olaf; Harms, Ute

    2013-01-01

    The study examined the effects of two different instructional interventions as support for scientific discovery learning using computer simulations. In two well-known categories of difficulty, data interpretation and self-regulation, instructional interventions for learning with computer simulations on the topic "ecosystem water" were developed…

  11. SIGMA--A Graphical Approach to Teaching Simulation.

    ERIC Educational Resources Information Center

    Schruben, Lee W.

    1992-01-01

    SIGMA (Simulation Graphical Modeling and Analysis) is a computer graphics environment for building, testing, and experimenting with discrete event simulation models on personal computers. It uses symbolic representations (computer animation) to depict the logic of large, complex discrete event systems for easier understanding and has proven itself…

  12. Ant system: optimization by a colony of cooperating agents.

    PubMed

    Dorigo, M; Maniezzo, V; Colorni, A

    1996-01-01

    An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call the ant system (AS). We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation, and the use of a constructive greedy heuristic. Positive feedback accounts for rapid discovery of good solutions, distributed computation avoids premature convergence, and the greedy heuristic helps find acceptable solutions in the early stages of the search process. We apply the proposed methodology to the classical traveling salesman problem (TSP) and report simulation results. We also discuss parameter selection and the early setups of the model, and compare it with tabu search and simulated annealing on the TSP. To demonstrate the robustness of the approach, we show how the AS can be applied to other optimization problems such as the asymmetric traveling salesman problem, the quadratic assignment problem, and job-shop scheduling. Finally, we discuss the salient characteristics of the AS: global data-structure revision, distributed communication, and probabilistic transitions.
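
    For readers unfamiliar with the AS, the sketch below is a compact rendering of its core loop for the symmetric TSP: probabilistic transitions biased by pheromone (tau) and visibility (eta), evaporation, and positive-feedback deposits. The hyperparameter values are illustrative defaults, not the ones tuned by Dorigo et al.; `dist` is assumed to be an (n, n) NumPy array of pairwise distances.

        import numpy as np

        def ant_system_tsp(dist, n_ants=20, n_iters=100, alpha=1.0, beta=5.0,
                           rho=0.5, q=100.0, seed=0):
            """Minimal Ant System for the symmetric TSP (illustrative values)."""
            rng = np.random.default_rng(seed)
            n = len(dist)
            eta = 1.0 / (dist + np.eye(n))          # visibility; eye avoids /0 on the diagonal
            tau = np.ones((n, n))                   # pheromone trails (global data structure)
            best_tour, best_len = None, np.inf
            for _ in range(n_iters):
                tours = []
                for _ in range(n_ants):
                    tour = [int(rng.integers(n))]
                    unvisited = set(range(n)) - {tour[0]}
                    while unvisited:
                        i = tour[-1]
                        cand = np.array(sorted(unvisited))
                        w = (tau[i, cand] ** alpha) * (eta[i, cand] ** beta)
                        nxt = int(rng.choice(cand, p=w / w.sum()))  # probabilistic transition
                        tour.append(nxt)
                        unvisited.remove(nxt)
                    tours.append(tour)
                tau *= 1.0 - rho                    # evaporation
                for tour in tours:
                    length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
                    if length < best_len:
                        best_tour, best_len = tour, length
                    for k in range(n):              # positive feedback: shorter tours deposit more
                        a, b = tour[k], tour[(k + 1) % n]
                        tau[a, b] += q / length
                        tau[b, a] += q / length
            return best_tour, best_len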

  13. The Role of Rendering in the Competence Project in Measurement Science for Optical Reflection and Scattering

    PubMed Central

    Westlund, Harold B.; Meyer, Gary W.; Hunt, Fern Y.

    2002-01-01

    Computer rendering is used to simulate the appearance of lighted objects for applications in architectural design, for animation and simulation in the entertainment industry, and for display and design in the automobile industry. Rapid advances in computer graphics technology suggest that in the near future it will be possible to produce photorealistic images of coated surfaces from scattering data. This could enable the identification of important parameters in the coatings manufacturing process that lead to desirable appearance, and to the design of virtual surfaces by visualizing prospective coating formulations once their optical properties are known. Here we report the results of our work to produce visually and radiometrically accurate renderings of selected appearance attributes of sample coated surfaces. It required changes in the rendering programs, which in general are not designed to accept high quality optical and material measurements, and changes in the optical measurement protocols. An outcome of this research is that some current ASTM standards can be replaced or enhanced by computer based standards of appearance. PMID:27446729

  14. Application of technology developed for flight simulation at NASA. Langley Research Center

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1991-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations including mathematical model computation and data input/output to the simulators must be deterministic and be completed in as short a time as possible. Personnel at NASA's Langley Research Center are currently developing the use of supercomputers for simulation mathematical model computation for real-time simulation. This, coupled with the use of an open systems software architecture, will advance the state-of-the-art in real-time flight simulation.

  15. Structural anomaly and dynamic heterogeneity in cycloether/water binary mixtures: Signatures from composition dependent dynamic fluorescence measurements and computer simulations

    NASA Astrophysics Data System (ADS)

    Indra, Sandipa; Guchhait, Biswajit; Biswas, Ranjit

    2016-03-01

    We have performed steady state UV-visible absorption and time-resolved fluorescence measurements and computer simulations to explore the cosolvent mole fraction induced changes in the structural and dynamical properties of water/dioxane (Diox) and water/tetrahydrofuran (THF) binary mixtures. Diox is a quadrupolar solvent whereas THF is a dipolar one, although both are cyclic molecules and represent cycloethers. The focus here is on whether these cycloethers can induce stiffening and transition of the water H-bond network structure and, if they do, whether such structural modification differentiates the chemical nature (dipolar or quadrupolar) of the cosolvent molecules. Composition dependent measured fluorescence lifetimes and rotation times of a dissolved dipolar solute (Coumarin 153, C153) suggest a cycloether mole-fraction (X_THF/Diox) induced structural transition for both of these aqueous binary mixtures in the 0.1 ≤ X_THF/Diox ≤ 0.2 regime, with no specific dependence on the chemical nature. Interestingly, absorption measurements reveal stiffening of the water H-bond structure in the presence of both cycloethers at a nearly equal mole fraction, X_THF/Diox ≈ 0.05. Measurements near the critical solution temperature or concentration indicate no role for solution criticality in the anomalous structural changes. Evidence for cycloether aggregation at very dilute concentrations has been found. Simulated radial distribution functions reflect abrupt changes in the respective peak heights at those mixture compositions around which fluorescence measurements revealed the structural transition. Simulated water coordination numbers (for a dissolved C153) and numbers of H-bonds also exhibit minima around these cosolvent concentrations. In addition, several dynamic heterogeneity parameters have been simulated for both mixtures to explore the effects of the structural transition and the chemical nature of the cosolvent on the heterogeneous dynamics of these systems. The simulated four-point dynamic susceptibility suggests the formation of clusters inducing local heterogeneity in the solution structure.

  16. Neurosurgical Virtual Reality Simulation for Brain Tumor Using High-definition Computer Graphics: A Review of the Literature.

    PubMed

    Kin, Taichi; Nakatomi, Hirofumi; Shono, Naoyuki; Nomura, Seiji; Saito, Toki; Oyama, Hiroshi; Saito, Nobuhito

    2017-10-15

    Simulation and planning of surgery using a virtual reality model is becoming common with advances in computer technology. In this study, we conducted a literature search to find trends in virtual simulation of surgery for brain tumors. A MEDLINE search for "neurosurgery AND (simulation OR virtual reality)" retrieved a total of 1,298 articles published in the past 10 years. After eliminating studies designed solely for education and training purposes, 28 articles about clinical application remained. The finding that the vast majority of the articles were about education and training rather than clinical applications suggests that several issues need to be addressed before surgical simulation is applied clinically. In addition, 10 of the 28 articles were from Japanese groups. In general, the 28 articles demonstrated clinical benefits of virtual surgical simulation. Simulation was particularly useful for better understanding complicated spatial relations of anatomical landmarks and for examining surgical approaches. In some studies, virtual reality models were used with either a surgical navigation system or augmented reality technology, which projects virtual reality images onto the operating field. Reported problems were difficulties in the standardized, objective evaluation of surgical simulation systems; the inability to respond to tissue deformation caused by surgical maneuvers; the absence of system functionality to reflect features of tissue (e.g., hardness and adhesion); and many problems with image processing. The descriptions of image processing tended to be insufficient, indicating that the level of evidence, risk of bias, precision, and reproducibility need to be addressed for further advances and ultimately for full clinical application.

  17. Criticality of the random field Ising model in and out of equilibrium: A nonperturbative functional renormalization group description

    NASA Astrophysics Data System (ADS)

    Balog, Ivan; Tarjus, Gilles; Tissier, Matthieu

    2018-03-01

    We show that, contrary to previous suggestions based on computer simulations or erroneous theoretical treatments, the critical points of the random-field Ising model out of equilibrium, when quasistatically changing the applied source at zero temperature, and in equilibrium are not in the same universality class below some critical dimension d_DR ≈ 5.1. We demonstrate this by implementing a nonperturbative functional renormalization group for the associated dynamical field theory. Above d_DR, the avalanches, which characterize the evolution of the system at zero temperature, become irrelevant at large distance, and hysteresis and equilibrium critical points are then controlled by the same fixed point. We explain how to use computer simulation and finite-size scaling to check the correspondence between in- and out-of-equilibrium criticality in a far less ambiguous way than done so far.

  18. Computer simulations of structural transitions in large ferrofluid aggregates

    NASA Astrophysics Data System (ADS)

    Yoon, Mina; Tomanek, David

    2003-03-01

    We have developed a quaternion molecular dynamics formalism to study structural transitions in systems of ferrofluid particles in colloidal suspensions. Our approach takes advantage of the viscous damping provided by the surrounding liquid and enables us to study the time evolution of these systems over millisecond time periods as a function of the number of particles, initial geometry, and an externally applied magnetic field. Our computer simulations for aggregates containing tens to hundreds of ferrofluid particles suggest that these systems relax to the globally optimal structure in a step-wise manner. During the relaxation process, the potential energy decreases by two mechanisms, which occur on different time scales. Short time periods associated with structural relaxations within a given morphology are followed by much slower processes that generally lead to a simpler morphology. We discuss possible applications of these externally driven structural transitions for targeted medication delivery.

  19. Influence of system size on the properties of a fluid adsorbed in a nanopore: Physical manifestations and methodological consequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Puibasset, Joël, E-mail: puibasset@cnrs-orleans.fr; Kierlik, Edouard, E-mail: edouard.kierlik@upmc.fr; Tarjus, Gilles, E-mail: tarjus@lptl.jussieu.fr

    Hysteresis and discontinuities in the isotherms of a fluid adsorbed in a nanopore in general hamper the determination of equilibrium thermodynamic properties, even in computer simulations. A way around this has been to consider both a reservoir of small size and a pore of small extent in order to restrict the fluctuations of density and approach a classical van der Waals loop. We assess this suggestion by thoroughly studying, through Monte Carlo simulations and density functional theory, the influence of system size on the equilibrium configurations of the adsorbed fluid and on the resulting isotherms. We stress the importance of pore-symmetry-breaking states that even for modest pore sizes lead to discontinuous isotherms, and we discuss the physical relevance of these states and the methodological consequences for computing thermodynamic quantities.

  20. Computational model of polarized actin cables and cytokinetic actin ring formation in budding yeast

    PubMed Central

    Tang, Haosu; Bidone, Tamara C.

    2015-01-01

    The budding yeast actin cables and contractile ring are important for polarized growth and division, revealing basic aspects of cytoskeletal function. To study these formin-nucleated structures, we built a 3D computational model with actin filaments represented as beads connected by springs. Polymerization by formins at the bud tip and bud neck, crosslinking, severing, and myosin pulling, are included. Parameter values were estimated from prior experiments. The model generates actin cable structures and dynamics similar to those of wild type and formin deletion mutant cells. Simulations with increased polymerization rate result in long, wavy cables. Simulated pulling by type V myosin stretches actin cables. Increasing the affinity of actin filaments for the bud neck together with reduced myosin V pulling promotes the formation of a bundle of antiparallel filaments at the bud neck, which we suggest as a model for the assembly of actin filaments to the contractile ring. PMID:26538307

  1. Video games and surgical ability: a literature review.

    PubMed

    Lynch, Jeremy; Aughwane, Paul; Hammond, Toby M

    2010-01-01

    Surgical training is rapidly evolving because of reduced training hours and the reduction of training opportunities due to patient safety concerns. There is a popular conception that video game usage might be linked to improved operating ability, especially for techniques involving endoscopic modalities. If true, this might suggest future directions for training. A search was made of the MEDLINE databases for the MeSH term "Video Games," combined with the terms "Surgical Procedures, Operative," "Endoscopy," "Robotics," "Education," "Learning," "Simulators," "Computer Simulation," "Psychomotor Performance," and "Surgery, Computer-Assisted," encompassing all journal articles before November 2009. References of articles were searched for further studies. Twelve relevant journal articles were discovered. Video game usage has been studied in relationship to laparoscopic, gastrointestinal endoscopic, endovascular, and robotic surgery. Video game users acquire endoscopic, but not robotic, techniques more quickly, and training on video games appears to improve performance. Copyright (c) 2010 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  2. Lipid Biomembrane in Ionic Liquids

    NASA Astrophysics Data System (ADS)

    Yoo, Brian; Jing, Benxin; Shah, Jindal; Maginn, Ed; Zhu, Y. Elaine; Department of Chemical and Biomolecular Engineering Team

    2014-03-01

    Ionic liquids (ILs) have recently been explored as new "green" chemicals in several chemical and biomedical processes. In our pursuit of understanding their toxicities towards aquatic and terrestrial organisms, we have examined the interaction of ILs with lipid bilayers as model cell membranes. Experimentally, by fluorescence microscopy, we have directly observed the disruption of lipid bilayers by added ILs. Depending on the concentration, alkyl chain length, and anion hydrophobicity of the ILs, the interaction of ILs with lipid bilayers leads to the formation of micelles, fibrils, and multi-lamellar vesicles of IL-lipid complexes. By MD computer simulations, we have confirmed the insertion of ILs into lipid bilayers to modify the spatial organization of lipids in the membrane. The combined experimental and simulation results correlate well with bioassay results of IL-induced suppression of bacterial growth, thereby suggesting a possible mechanism behind IL toxicity.

  3. Conformational Heterogeneity of Bax Helix 9 Dimer for Apoptotic Pore Formation

    NASA Astrophysics Data System (ADS)

    Liao, Chenyi; Zhang, Zhi; Kale, Justin; Andrews, David W.; Lin, Jialing; Li, Jianing

    2016-07-01

    Helix α9 of the Bax protein can dimerize in the mitochondrial outer membrane (MOM) and lead to apoptotic pores. However, it remains unclear how different conformations of the dimer contribute to pore formation on the molecular level. We have therefore investigated various conformational states of the α9 dimer in a MOM model, using computer simulations supplemented with site-specific mutagenesis and crosslinking of the α9 helices. Our data not only confirmed that the membrane environment is critical for α9 stability and dimerization, but also revealed the distinct lipid-binding preferences of the dimer in different conformational states. In our proposed pathway, a crucial iso-parallel dimer that mediates the conformational transition was discovered computationally and validated experimentally. The corroborating evidence from simulations and experiments suggests that helix α9 assists Bax activation via dimer heterogeneity and interactions with specific MOM lipids, which eventually facilitate proteolipidic pore formation in apoptosis regulation.

  4. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  5. Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop: August 4-5, 2015, Washington, D.C.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael A.; Boldyrev, Stanislav; Fischer, Paul

    This report details the impact that exascale computing will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the DOE applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  6. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  7. Analytic variance estimates of Swank and Fano factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Benjamin; Badano, Aldo; Samuelson, Frank, E-mail: frank.samuelson@fda.hhs.gov

    Purpose: Variance estimates for detector energy resolution metrics can be used as stopping criteria in Monte Carlo simulations for the purpose of ensuring a small uncertainty of those metrics and for the design of variance reduction techniques. Methods: The authors derive an estimate for the variance of two energy resolution metrics, the Swank factor and the Fano factor, in terms of statistical moments that can be accumulated without significant computational overhead. The authors examine the accuracy of these two estimators and demonstrate how the estimates of the coefficient of variation of the Swank and Fano factors behave with data from a Monte Carlo simulation of an indirect x-ray imaging detector. Results: The authors' analyses suggest that the accuracy of their variance estimators is appropriate for estimating the actual variances of the Swank and Fano factors for a variety of distributions of detector outputs. Conclusions: The variance estimators derived in this work provide a computationally convenient way to estimate the error or coefficient of variation of the Swank and Fano factors during Monte Carlo simulations of radiation imaging systems.
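
    By way of illustration, the two metrics themselves can be estimated from sampled detector outputs with the standard moment accumulations below; this is only the textbook computation of the factors, not the authors' variance estimators for them.

        import numpy as np

        def swank_fano(outputs):
            """Swank factor I = m1^2 / (m0 * m2) from raw moments of the
            pulse-height distribution (m0 = 1 for normalized samples) and
            Fano factor F = variance / mean."""
            x = np.asarray(outputs, dtype=float)
            m1 = x.mean()                 # first raw moment
            m2 = (x ** 2).mean()          # second raw moment
            swank = m1 ** 2 / m2
            fano = x.var(ddof=1) / m1
            return swank, fano

        # e.g. swank_fano(np.random.default_rng(0).poisson(50.0, 10_000))
        # gives a Fano factor close to 1, as expected for Poisson outputs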

  8. A Component-Based FPGA Design Framework for Neuronal Ion Channel Dynamics Simulations

    PubMed Central

    Mak, Terrence S. T.; Rachmuth, Guy; Lam, Kai-Pui; Poon, Chi-Sang

    2008-01-01

    Neuron-machine interfaces such as the dynamic clamp and brain-implantable neuroprosthetic devices require real-time simulations of neuronal ion channel dynamics. The field-programmable gate array (FPGA) has emerged as a high-speed digital platform ideal for such application-specific computations. We propose an efficient and flexible component-based FPGA design framework for neuronal ion channel dynamics simulations, which overcomes certain limitations of the recently proposed memory-based approach. A parallel processing strategy is used to minimize computational delay, and a hardware-efficient factoring approach for calculating the exponential and division functions in neuronal ion channel models is used to conserve resource consumption. The performance of the various FPGA design approaches is compared theoretically and experimentally in corresponding implementations of the AMPA and NMDA synaptic ion channel models. Our results suggest that the component-based design framework provides a more memory-economic solution as well as more efficient logic utilization for large word lengths, whereas the memory-based approach may be suitable for time-critical applications where a higher throughput rate is desired. PMID:17190033

  9. Water immersion and its computer simulation as analogs of weightlessness

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1982-01-01

    Experimental studies and computer simulations of water immersion are summarized and discussed with regard to their utility as analogs of weightlessness. Emphasis is placed on describing and interpreting the renal, endocrine, fluid, and circulatory changes that take place during immersion. A mathematical model, based on concepts of fluid volume regulation, is shown to be well suited to simulate the dynamic responses to water immersion. Further, it is shown that such a model provides a means to study specific mechanisms and pathways involved in the immersion response. A number of hypotheses are evaluated with the model related to the effects of dehydration, venous pressure disturbances, the control of ADH, and changes in plasma-interstitial volume. By inference, it is suggested that most of the model's responses to water immersion are plausible predictions of the acute changes expected, but not yet measured, during space flight. One important prediction of the model is that previous attempts to measure a diuresis during space flight failed because astronauts may have been dehydrated and urine samples were pooled over 24-hour periods.

  10. EnergyPlus Run Time Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Buhl, Fred; Haves, Philip

    2008-09-20

    EnergyPlus is a new generation building performance simulation program offering many new modeling capabilities and more accurate performance calculations that integrate building components in sub-hourly time steps. However, EnergyPlus runs much slower than the current generation of simulation programs, which has become a major barrier to its widespread adoption by the industry. This paper analyzed EnergyPlus run time from comprehensive perspectives to identify the key issues and challenges of speeding up EnergyPlus: studying the historical trends of EnergyPlus run time based on the advancement of computers and code improvements to EnergyPlus, comparing EnergyPlus with DOE-2 to understand and quantify the run time differences, identifying key simulation settings and model features that have significant impacts on run time, and performing code profiling to identify which EnergyPlus subroutines consume the most run time. This paper provides recommendations to improve EnergyPlus run time from the modeler's perspective and on adequate computing platforms. Suggestions for software code and architecture changes to improve EnergyPlus run time, based on the code profiling results, are also discussed.

  11. Imaging performance of a hybrid x-ray computed tomography-fluorescence molecular tomography system using priors.

    PubMed

    Ale, Angelique; Schulz, Ralf B; Sarantopoulos, Athanasios; Ntziachristos, Vasilis

    2010-05-01

    The performance of two newly introduced and previously suggested methods that incorporate priors into inversion schemes is studied, using data from a recently developed hybrid x-ray computed tomography and fluorescence molecular tomography system, the latter based on CCD camera photon detection. The unique data set studied attains accurately registered data of highly spatially sampled photon fields propagating through tissue along 360-degree projections. Structural prior information was included in the inverse problem by adding a penalty term to the minimization function utilized for image reconstruction. Results were compared, as to their performance with simulated and experimental data from a lung inflammation animal model, against the inversions achieved without priors. The importance of using priors over stand-alone inversions is also showcased with highly spatially sampled simulated and experimental data. The approach of optimal performance in resolving fluorescent biodistribution in small animals is also discussed. Inclusion of prior information from x-ray CT data in the reconstruction of the fluorescence biodistribution leads to improved agreement between the reconstruction and validation images for both simulated and experimental data.
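
    The penalty construction described above can be written, in its generic Tikhonov-style form, as the minimization below (in LaTeX). Here A is the fluorescence forward model, y the measured photon data, L a structural penalty operator derived from the registered x-ray CT image, and \lambda the prior weight; these symbols are generic conventions, and the paper's exact penalty term is not given in this abstract.

        \hat{x} \;=\; \arg\min_{x} \; \lVert A x - y \rVert_2^2 \;+\; \lambda\, \lVert L x \rVert_2^2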

  12. GPUs, a New Tool of Acceleration in CFD: Efficiency and Reliability on Smoothed Particle Hydrodynamics Methods

    PubMed Central

    Crespo, Alejandro C.; Dominguez, Jose M.; Barreiro, Anxo; Gómez-Gesteira, Moncho; Rogers, Benedict D.

    2011-01-01

    Smoothed Particle Hydrodynamics (SPH) is a numerical method commonly used in Computational Fluid Dynamics (CFD) to simulate complex free-surface flows. Simulations with this mesh-free particle method far exceed the capacity of a single processor. In this paper, as part of a dual-functioning code for either central processing units (CPUs) or Graphics Processor Units (GPUs), a parallelisation using GPUs is presented. The GPU parallelisation technique uses the Compute Unified Device Architecture (CUDA) of nVidia devices. Simulations with more than one million particles on a single GPU card exhibit speedups of up to two orders of magnitude over using a single-core CPU. It is demonstrated that the code achieves different speedups with different CUDA-enabled GPUs. The numerical behaviour of the SPH code is validated with a standard benchmark test case of dam break flow impacting on an obstacle where good agreement with the experimental results is observed. Both the achieved speed-ups and the quantitative agreement with experiments suggest that CUDA-based GPU programming can be used in SPH methods with efficiency and reliability. PMID:21695185

  13. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    ERIC Educational Resources Information Center

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  14. Development of an E-Prime Based Computer Simulation of an Interactive Human Rights Violation Negotiation Script (Developpement d’un Programme de Simulation par Ordinateur Fonde sur le Logiciel E Prime pour la Negociation Interactive en cas de Violation des Droits de la Personne)

    DTIC Science & Technology

    2010-12-01

    This report (DRDC Toronto No. CR 2010-055) describes the method of developing an E-Prime computer simulation of an interactive Human Rights Violation (HRV) negotiation. The work was conducted in connection with Canadian Forces Base (CFB) Kingston, and the computer simulation developed in this project is intended to be used for future research and as a possible training platform.

  15. Outcomes from the DOE Workshop on Turbulent Flow Simulation at the Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael; Boldyrev, Stanislav; Chang, Choong-Seock

    This paper summarizes the outcomes from the Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop, which was held 4-5 August 2015 and was sponsored by the U.S. Department of Energy Office of Advanced Scientific Computing Research. The workshop objective was to define and describe the challenges and opportunities that computing at the exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the U.S. Department of Energy applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  16. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) of simulation host computer concepts are presented. The investigation is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  17. Local-feature analysis for automated coarse-graining of bulk-polymer molecular dynamics simulations.

    PubMed

    Xue, Y; Ludovice, P J; Grover, M A

    2012-12-01

    A method for automated coarse-graining of bulk polymers is presented, using the data-mining tool of local feature analysis. Most existing methods for polymer coarse-graining define superatoms based on their covalent bonding topology along the polymer backbone, but here superatoms are defined based only on their correlated motions, as observed in molecular dynamics simulations. Correlated atomic motions are identified in the simulation data using local feature analysis, between atoms in the same or in different polymer chains. Groups of highly correlated atoms constitute the superatoms in the coarse-graining scheme, and the positions of their seed coordinates are then projected forward in time. Based on only the seed positions, local feature analysis enables the full reconstruction of all atomic positions. This reconstruction suggests an iterative scheme for reducing the computational cost of the simulations: use the reconstructed coordinates to initialize another short molecular dynamics simulation, identify new superatoms, and again project forward in time.
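
    The following sketch captures the correlated-motion grouping step in spirit: it clusters atoms by the correlation of their displacement histories over an MD trajectory. Plain hierarchical clustering stands in here for local feature analysis, and the array shapes and function names are assumptions, not the authors' code.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        def superatoms_from_trajectory(positions, n_super):
            """Group atoms into superatoms by correlated motion.
            positions: array of shape (n_frames, n_atoms, 3)."""
            disp = positions - positions.mean(axis=0)     # displacements about the mean structure
            feats = disp.transpose(1, 0, 2).reshape(positions.shape[1], -1)
            corr = np.corrcoef(feats)                     # atom-atom motion correlations
            d = 1.0 - np.abs(corr)                        # uncorrelated atoms are "far apart"
            condensed = d[np.triu_indices_from(d, k=1)]   # condensed distance matrix
            z = linkage(condensed, method="average")
            return fcluster(z, t=n_super, criterion="maxclust")  # superatom label per atom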

  18. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    NASA Astrophysics Data System (ADS)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher-resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations, including the Madden-Julian oscillation (MJO), merely as case studies. Thanks to the big leap in the computational performance of the K computer, we could greatly increase the number of MJO events covered by the numerical simulations, in addition to increasing the integration time and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with that of the relatively coarser operational models currently in use. The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.

  19. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.

  20. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    PubMed

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel Core (TM) 2 Quad Q6600 CPU and a GeForce 8800GT GPU, with software support from OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setup (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setup (a), 16.8 in setup (b), and 20.0 in setup (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
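
    A hedged sketch of the load-prediction dynamic scheduling idea in setup (c): each step's workload is split between CPU and GPU in proportion to the throughput measured on the previous step. The function names, the work model, and the sequential timing loop are illustrative assumptions (a real implementation runs both partitions concurrently):

    ```python
    import time

    def run_scheduled(total_cells, steps, run_cpu, run_gpu):
        """Split each step's cells in proportion to predicted throughput."""
        rate_cpu = rate_gpu = 1.0                    # cells/s, initial guess
        for _ in range(steps):
            frac_gpu = rate_gpu / (rate_cpu + rate_gpu)
            n_gpu = int(total_cells * frac_gpu)      # GPU share of this step
            n_cpu = total_cells - n_gpu
            t0 = time.perf_counter(); run_cpu(n_cpu); t_cpu = time.perf_counter() - t0
            t0 = time.perf_counter(); run_gpu(n_gpu); t_gpu = time.perf_counter() - t0
            # measured rates become the load prediction for the next step
            rate_cpu = n_cpu / max(t_cpu, 1e-9)
            rate_gpu = n_gpu / max(t_gpu, 1e-9)
    ```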

  1. Testing the role of reward and punishment sensitivity in avoidance behavior: a computational modeling approach

    PubMed Central

    Sheynin, Jony; Moustafa, Ahmed A.; Beck, Kevin D.; Servatius, Richard J.; Myers, Catherine E.

    2015-01-01

    Exaggerated avoidance behavior is a predominant symptom in all anxiety disorders and its degree often parallels the development and persistence of these conditions. Both human and non-human animal studies suggest that individual differences as well as various contextual cues may impact avoidance behavior. Specifically, we have recently shown that female sex and inhibited temperament, two anxiety vulnerability factors, are associated with greater duration and rate of the avoidance behavior, as demonstrated on a computer-based task closely related to common rodent avoidance paradigms. We have also demonstrated that avoidance is attenuated by the administration of explicit visual signals during “non-threat” periods (i.e., safety signals). Here, we use a reinforcement-learning network model to investigate the underlying mechanisms of these empirical findings, with a special focus on distinct reward and punishment sensitivities. Model simulations suggest that sex and inhibited temperament are associated with specific aspects of these sensitivities. Specifically, differences in relative sensitivity to reward and punishment might underlie the longer avoidance duration demonstrated by females, whereas higher sensitivity to punishment might underlie the higher avoidance rate demonstrated by inhibited individuals. Simulations also suggest that safety signals attenuate avoidance behavior by strengthening the competing approach response. Lastly, several predictions generated by the model suggest that extinction-based cognitive-behavioral therapies might benefit from the use of safety signals, especially if given to individuals with high reward sensitivity and during longer safe periods. Overall, this study is the first to suggest cognitive mechanisms underlying the greater avoidance behavior observed in healthy individuals with different anxiety vulnerabilities. PMID:25639540
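
    A hedged sketch of the kind of reinforcement-learning mechanism the abstract describes, with separate reward and punishment sensitivities scaling the outcome of an approach/avoid choice; all parameter names and values here are illustrative assumptions, not the authors' model:

    ```python
    import math, random

    def simulate_avoidance(trials=200, s_reward=1.0, s_punish=2.0,
                           alpha=0.1, beta=3.0):
        """Return the fraction of trials on which the agent avoided."""
        q = {"approach": 0.0, "avoid": 0.0}
        n_avoid = 0
        for _ in range(trials):
            # softmax (logistic) choice between the two actions
            p_avoid = 1.0 / (1.0 + math.exp(-beta * (q["avoid"] - q["approach"])))
            action = "avoid" if random.random() < p_avoid else "approach"
            if action == "approach":
                # approaching earns a reward but risks punishment half the time
                outcome = s_reward - (s_punish if random.random() < 0.5 else 0.0)
            else:
                outcome = 0.0                        # avoiding forgoes both
            q[action] += alpha * (outcome - q[action])   # delta-rule update
            n_avoid += action == "avoid"
        return n_avoid / trials
    ```

    Raising s_punish relative to s_reward drives the avoidance rate up, mirroring the abstract's account of more avoidant, punishment-sensitive individuals.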

  2. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.
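
    As a concrete taste of the Eulerian grid approach the survey covers, here is a hedged sketch of the semi-Lagrangian advection step popularized by Stam's "stable fluids" method; the grid layout, clamped boundaries, and bilinear sampling are simplifying assumptions:

    ```python
    import numpy as np

    def advect(field, u, v, dt):
        """Trace each cell back along (u, v) and sample the field there."""
        ny, nx = field.shape
        y, x = np.mgrid[0:ny, 0:nx].astype(float)
        xs = np.clip(x - dt * u, 0, nx - 1)          # backtraced departure points
        ys = np.clip(y - dt * v, 0, ny - 1)
        x0, y0 = np.floor(xs).astype(int), np.floor(ys).astype(int)
        x1, y1 = np.minimum(x0 + 1, nx - 1), np.minimum(y0 + 1, ny - 1)
        tx, ty = xs - x0, ys - y0
        # bilinear interpolation keeps the scheme stable for large time steps
        top = (1 - tx) * field[y0, x0] + tx * field[y0, x1]
        bot = (1 - tx) * field[y1, x0] + tx * field[y1, x1]
        return (1 - ty) * top + ty * bot
    ```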

  3. Computer Simulation in Undergraduate Instruction: A Symposium.

    ERIC Educational Resources Information Center

    Street, Warren R.; And Others

    These symposium papers discuss the instructional use of computers in psychology, with emphasis on computer-produced simulations. The first, by Rich Edwards, briefly outlines LABSIM, a general purpose system of FORTRAN programs which simulate data collection in more than a dozen experimental models in psychology and are designed to train students…

  4. Overview of Computer Simulation Modeling Approaches and Methods

    Treesearch

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  5. New Pedagogies on Teaching Science with Computer Simulations

    ERIC Educational Resources Information Center

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

  6. Democratic population decisions result in robust policy-gradient learning: a parametric study with GPU simulations.

    PubMed

    Richmond, Paul; Buesing, Lars; Giugliano, Michele; Vasilaki, Eleni

    2011-05-04

    High performance computing on the Graphics Processing Unit (GPU) is an emerging field driven by the promise of high computational power at a low cost. However, GPU programming is a non-trivial task and, moreover, architectural limitations raise the question of whether investing effort in this direction may be worthwhile. In this work, we use GPU programming to simulate a two-layer network of Integrate-and-Fire neurons with varying degrees of recurrent connectivity and investigate its ability to learn a simplified navigation task using a policy-gradient learning rule stemming from Reinforcement Learning. The purpose of this paper is twofold. First, we want to support the use of GPUs in the field of Computational Neuroscience. Second, using GPU computing power, we investigate the conditions under which the said architecture and learning rule demonstrate best performance. Our work indicates that networks featuring strong Mexican-Hat-shaped recurrent connections in the top layer, where decision making is governed by the formation of a stable activity bump in the neural population (a "non-democratic" mechanism), achieve mediocre learning results at best. In absence of recurrent connections, where all neurons "vote" independently ("democratic") for a decision via population vector readout, the task is generally learned better and more robustly. Our study would have been extremely difficult on a desktop computer without the use of GPU programming. We present the routines developed for this purpose and show that a speed improvement of 5x to 42x is provided versus optimised Python code. The higher speed is achieved when we exploit the parallelism of the GPU in the search of learning parameters. This suggests that efficient GPU programming can significantly reduce the time needed for simulating networks of spiking neurons, particularly when multiple parameter configurations are investigated.
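
    A hedged, NumPy-based sketch of the "democratic" readout the study favours: an unconnected leaky integrate-and-fire population votes through a population vector. The constants are illustrative, and vectorized NumPy stands in for the GPU kernels:

    ```python
    import numpy as np

    def lif_step(v, i_syn, dt=1e-3, tau=0.02, v_th=1.0, v_reset=0.0):
        """One Euler step of a leaky integrate-and-fire population."""
        v = v + dt * (-v / tau + i_syn)
        spikes = v >= v_th
        v[spikes] = v_reset
        return v, spikes

    rng = np.random.default_rng(0)
    n = 1000
    prefs = np.linspace(0, 2 * np.pi, n, endpoint=False)  # preferred directions
    v, counts = np.zeros(n), np.zeros(n)
    for _ in range(500):                                  # 0.5 s of activity
        drive = 50.0 * (1.0 + np.cos(prefs - np.pi / 4))  # input tuned to 45 deg
        v, s = lif_step(v, drive + 5.0 * rng.standard_normal(n))
        counts += s
    # population vector: each neuron "votes" with its spike count
    decision = np.arctan2((counts * np.sin(prefs)).sum(),
                          (counts * np.cos(prefs)).sum())
    ```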

  7. Computer modeling and simulation of human movement. Applications in sport and rehabilitation.

    PubMed

    Neptune, R R

    2000-05-01

    Computer modeling and simulation of human movement plays an increasingly important role in sport and rehabilitation, with applications ranging from sport equipment design to understanding pathologic gait. The complex dynamic interactions within the musculoskeletal and neuromuscular systems make analyzing human movement with existing experimental techniques difficult but computer modeling and simulation allows for the identification of these complex interactions and causal relationships between input and output variables. This article provides an overview of computer modeling and simulation and presents an example application in the field of rehabilitation.

  8. Fiber Composite Sandwich Thermostructural Behavior: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Aiello, R. A.; Murthy, P. L. N.

    1986-01-01

    Several computational levels of progressive sophistication/simplification are described to computationally simulate composite sandwich hygral, thermal, and structural behavior. The computational levels of sophistication include: (1) three-dimensional detailed finite element modeling of the honeycomb, the adhesive and the composite faces; (2) three-dimensional finite element modeling of the honeycomb assumed to be an equivalent continuous, homogeneous medium, the adhesive and the composite faces; (3) laminate theory simulation where the honeycomb (metal or composite) is assumed to consist of plies with equivalent properties; and (4) derivations of approximate, simplified equations for thermal and mechanical properties by simulating the honeycomb as an equivalent homogeneous medium. The approximate equations are combined with composite hygrothermomechanical and laminate theories to provide a simple and effective computational procedure for simulating the thermomechanical/thermostructural behavior of fiber composite sandwich structures.

  9. Generalized dynamic engine simulation techniques for the digital computer

    NASA Technical Reports Server (NTRS)

    Sellers, J.; Teren, F.

    1974-01-01

    Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design-point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar all-digital programs on future engine simulation philosophy is also discussed.

  10. Generalized dynamic engine simulation techniques for the digital computer

    NASA Technical Reports Server (NTRS)

    Sellers, J.; Teren, F.

    1974-01-01

    Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design-point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar all-digital programs on future engine simulation philosophy is also discussed.

  11. Generalized dynamic engine simulation techniques for the digital computers

    NASA Technical Reports Server (NTRS)

    Sellers, J.; Teren, F.

    1975-01-01

    Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar digital programs on future engine simulation philosophy is also discussed.

  12. Fast Learning for Immersive Engagement in Energy Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian W; Bugbee, Bruce; Gruchalla, Kenny M

    The fast computation that is critical for immersive engagement with and learning from energy simulations would be furthered by a general method for creating rapidly computed, simplified versions of NREL's computation-intensive energy simulations. Created using machine learning techniques, these 'reduced form' simulations can provide statistically sound estimates of the results of the full simulations at a fraction of the computational cost, with response times (typically less than one minute of wall-clock time) suitable for real-time human-in-the-loop design and analysis. Additionally, uncertainty quantification techniques can document the accuracy of the approximate models and their domain of validity. Approximation methods are applicable to a wide range of computational models, including supply-chain models, electric power grid simulations, and building models. These reduced-form representations cannot replace or re-implement existing simulations, but instead supplement them by enabling rapid scenario design and quality assurance for large sets of simulations. We present an overview of the framework and methods we have implemented for developing these reduced-form representations.
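
    A hedged sketch of the reduced-form idea: fit a fast statistical surrogate to a batch of expensive simulation runs, then answer scenario queries from the surrogate. The toy simulator, sample sizes, and the random-forest choice are illustrative assumptions, not NREL's pipeline:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def expensive_simulation(x):
        """Stand-in for a computation-intensive energy model."""
        return np.sin(3 * x[0]) * x[1] + 0.1 * x[2]

    rng = np.random.default_rng(1)
    X = rng.uniform(0, 1, size=(500, 3))          # scenario design points
    y = np.array([expensive_simulation(x) for x in X])

    surrogate = RandomForestRegressor(n_estimators=200).fit(X, y)

    # held-out error documents the surrogate's accuracy / domain of validity
    X_test = rng.uniform(0, 1, size=(100, 3))
    y_test = np.array([expensive_simulation(x) for x in X_test])
    print("mean abs error:", np.abs(surrogate.predict(X_test) - y_test).mean())
    ```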

  13. Universal quantum computing using (Z_d)^3 symmetry-protected topologically ordered states

    NASA Astrophysics Data System (ADS)

    Chen, Yanzhu; Prakash, Abhishodh; Wei, Tzu-Chieh

    2018-02-01

    Measurement-based quantum computation describes a scheme where entanglement of resource states is utilized to simulate arbitrary quantum gates via local measurements. Recent works suggest that symmetry-protected topologically nontrivial, short-ranged entangled states are promising candidates for such a resource. Miller and Miyake [npj Quantum Inf. 2, 16036 (2016), 10.1038/npjqi.2016.36] recently constructed a particular Z2×Z2×Z2 symmetry-protected topological state on the Union Jack lattice and established its quantum-computational universality. However, they suggested that the same construction on the triangular lattice might not lead to a universal resource. Instead of qubits, we generalize the construction to qudits and show that the resulting (d-1)-qudit nontrivial Zd×Zd×Zd symmetry-protected topological states are universal on the triangular lattice, for d being a prime number greater than 2. The same construction also holds for other 3-colorable lattices, including the Union Jack lattice.

  14. Gauge Conditions for Moving Black Holes Without Excision

    NASA Technical Reports Server (NTRS)

    van Meter, James; Baker, John G.; Koppitz, Michael; Dae-IL, Choi

    2006-01-01

    Recent demonstrations of unexcised, puncture black holes traversing freely across computational grids represent a significant advance in numerical relativity. Stable and accurate simulations of multiple orbits, and their radiated waves, result. This capability is critically undergirded by a careful choice of gauge. Here we present analytic considerations which suggest certain gauge choices, and numerically demonstrate their efficacy in evolving a single moving puncture.

  15. Sensitivity analysis of a pulse nutrient addition technique for estimating nutrient uptake in large streams

    Treesearch

    Laurence Lin; J.R. Webster

    2012-01-01

    The constant nutrient addition technique has been used extensively to measure nutrient uptake in streams. However, this technique is impractical for large streams, and the pulse nutrient addition (PNA) has been suggested as an alternative. We developed a computer model to simulate Monod kinetics nutrient uptake in large rivers and used this model to evaluate the...
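
    A hedged sketch of the Monod uptake relation at the heart of such a model, U = U_max C / (K_s + C), applied to a decaying nutrient pulse; all parameter values are illustrative assumptions:

    ```python
    def monod_uptake(c, u_max=5.0, k_s=50.0):
        """Uptake rate that saturates at u_max for large concentration c."""
        return u_max * c / (k_s + c)

    # follow a nutrient pulse as uptake depletes it minute by minute
    c = 100.0                            # concentration at release, e.g. ug/L
    for minute in range(60):
        c = max(c - monod_uptake(c), 0.0)
    print("concentration after 1 h:", c)
    ```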

  16. Computer simulation for optimizing windbreak placement to save energy for heating and cooling buildings

    Treesearch

    Gordon M. Heisler

    1991-01-01

    Saving energy has recently acquired new importance because of increased concern for dwindling fossil fuel supplies and for the problem of carbon dioxide contributions to global climate change. Many studies have indicated that windbreaks have the ability to save energy for heating buildings. Suggested savings have ranged up to 40 percent, though more commonly savings of...

  17. Choosing a Transformation in Analyses of Insect Counts from Contagious Distributions with Low Means

    Treesearch

    W.D. Pepper; S.J. Zarnoch; G.L. DeBarr; P. de Groot; C.D. Tangren

    1997-01-01

    Guidelines based on computer simulation are suggested for choosing a transformation of insect counts from negative binomial distributions with low mean counts and high levels of contagion. Typical values and ranges of negative binomial model parameters were determined by fitting the model to data from 19 entomological field studies. Random sampling of negative binomial...

  18. Coordinated crew performance in commercial aircraft operations

    NASA Technical Reports Server (NTRS)

    Murphy, M. R.

    1977-01-01

    A specific methodology is proposed for an improved system of coding and analyzing crew member interaction. The complexity and lack of precision of many crew and task variables suggest the usefulness of fuzzy linguistic techniques for modeling and computer simulation of the crew performance process. Other research methodologies and concepts that have promise for increasing the effectiveness of research on crew performance are identified.

  19. A Primer on Simulation and Gaming.

    ERIC Educational Resources Information Center

    Barton, Richard F.

    In a primer intended for the administrative professions, for the behavioral sciences, and for education, simulation and its various aspects are defined, illustrated, and explained. Man-model simulation, man-computer simulation, all-computer simulation, and analysis are discussed as techniques for studying object systems (parts of the "real…

  20. A diffusion based study of population dynamics: Prehistoric migrations into South Asia

    PubMed Central

    Vahia, Mayank N.; Yadav, Nisha; Ladiwala, Uma; Mathur, Deepak

    2017-01-01

    A diffusion equation has been used to study migration of early humans into the South Asian subcontinent. The diffusion equation is tempered by a set of parameters that account for geographical features like proximity to water resources, altitude, and flatness of land. The ensuing diffusion of populations is followed in time-dependent computer simulations carried out over a period of 10,000 YBP. The geographical parameters are determined from readily-available satellite data. The results of our computer simulations are compared to recent genetic data so as to better correlate the migratory patterns of various populations; they suggest that the initial populations started to coalesce around 4,000 YBP before the commencement of a period of relative geographical isolation of each population group. The period during which coalescence of populations occurred appears consistent with the established timeline associated with the Harappan civilization and also, with genetic admixing that recent genetic mapping data reveal. Our results may contribute to providing a timeline for the movement of prehistoric people. Most significantly, our results appear to suggest that the Ancestral Austro-Asiatic population entered the subcontinent through an easterly direction, potentially resolving a hitherto-contentious issue. PMID:28493906
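
    A hedged sketch of a geography-tempered diffusion step of the sort described: the diffusivity is damped by altitude and boosted near water. The grid, the terrain model, and the periodic boundaries are illustrative assumptions, not the study's parameterization:

    ```python
    import numpy as np

    def diffusion_step(pop, D, dt=0.1, dx=1.0):
        """Explicit finite-difference update of the tempered diffusion equation."""
        lap = (np.roll(pop, 1, 0) + np.roll(pop, -1, 0) +
               np.roll(pop, 1, 1) + np.roll(pop, -1, 1) - 4 * pop) / dx**2
        return pop + dt * D * lap        # note: np.roll gives periodic boundaries

    ny = nx = 64
    rng = np.random.default_rng(2)
    altitude = rng.uniform(0, 3000, (ny, nx))             # metres
    water = np.zeros((ny, nx)); water[:, 0] = 1.0         # a river along x = 0
    D = np.exp(-altitude / 2000.0) * (1.0 + 0.5 * water)  # tempered diffusivity
    pop = np.zeros((ny, nx)); pop[32, 32] = 1000.0        # founding population
    for _ in range(1000):
        pop = diffusion_step(pop, D)
    ```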

  1. A computational microscopy study of nanostructural evolution in irradiated pressure vessel steels

    NASA Astrophysics Data System (ADS)

    Odette, G. R.; Wirth, B. D.

    1997-11-01

    Nanostructural features that form in reactor pressure vessel steels under neutron irradiation at around 300°C lead to significant hardening and embrittlement. Continuum thermodynamic-kinetic based rate theories have been very successful in modeling the general characteristics of the copper and manganese nickel rich precipitate evolution, often the dominant source of embrittlement. However, a more detailed atomic scale understanding of these features is needed to interpret experimental measurements and better underpin predictive embrittlement models. Further, other embrittling features, believed to be subnanometer defect (vacancy)-solute complexes and small regions of modest enrichment of solutes are not well understood. A general approach to modeling embrittlement nanostructures, based on the concept of a computational microscope, is described. The objective of the computational microscope is to self-consistently integrate atomic scale simulations with other sources of information, including a wide range of experiments. In this work, lattice Monte Carlo (LMC) simulations are used to resolve the chemically and structurally complex nature of CuMnNiSi precipitates. The LMC simulations unify various nanoscale analytical characterization methods and basic thermodynamics. The LMC simulations also reveal that significant coupled vacancy and solute clustering takes place during cascade aging. The cascade clustering produces the metastable vacancy-cluster solute complexes that mediate flux effects. Cascade solute clustering may also play a role in the formation of dilute atmospheres of solute enrichment and enhance the nucleation of manganese-nickel rich precipitates at low Cu levels. Further, the simulations suggest that complex, highly correlated processes (e.g. cluster diffusion, formation of favored vacancy diffusion paths and solute scavenging vacancy cluster complexes) may lead to anomalous fast thermal aging kinetics at temperatures below about 450°C. The potential technical significance of these phenomena is described.
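
    A hedged sketch of the lattice Monte Carlo machinery behind such studies: Metropolis-accepted swaps on a binary Cu/Fe lattice with a pairwise attraction that drives solute clustering. The 2D lattice, the single bond energy, and the temperature are toy assumptions far simpler than the paper's CuMnNiSi model:

    ```python
    import math, random
    import numpy as np

    K_B = 8.617e-5                                   # eV/K

    def site_energy(lat, i, j, eps=-0.1):
        """Bond energy of site (i, j): like Cu-Cu neighbours attract."""
        n, m = lat.shape
        s = lat[i, j]
        nbrs = [lat[(i + 1) % n, j], lat[(i - 1) % n, j],
                lat[i, (j + 1) % m], lat[i, (j - 1) % m]]
        return eps * sum(1 for t in nbrs if s == 1 and t == 1)

    def mc_step(lat, T=573.0):
        """Swap two random sites; accept with the Metropolis rule."""
        n, m = lat.shape
        a = (random.randrange(n), random.randrange(m))
        b = (random.randrange(n), random.randrange(m))
        e0 = site_energy(lat, *a) + site_energy(lat, *b)
        lat[a], lat[b] = lat[b], lat[a]
        dE = site_energy(lat, *a) + site_energy(lat, *b) - e0
        if dE > 0 and random.random() >= math.exp(-dE / (K_B * T)):
            lat[a], lat[b] = lat[b], lat[a]          # reject: undo the swap

    lat = (np.random.default_rng(4).random((32, 32)) < 0.01).astype(int)  # 1% Cu
    for _ in range(100_000):
        mc_step(lat)                                  # Cu atoms slowly cluster
    ```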

  2. Computer simulations of alkali-acetate solutions: Accuracy of the forcefields in different concentrations

    NASA Astrophysics Data System (ADS)

    Ahlstrand, Emma; Zukerman Schpector, Julio; Friedman, Ran

    2017-11-01

    When proteins are solvated in electrolyte solutions that contain alkali ions, the ions interact mostly with carboxylates on the protein surface. Correctly accounting for alkali-carboxylate interactions is thus important for realistic simulations of proteins. Acetates are the simplest carboxylates that are amphipathic, and experimental data for alkali acetate solutions are available and can be compared with observables obtained from simulations. We carried out molecular dynamics simulations of alkali acetate solutions using polarizable and non-polarizable forcefields and examined the ion-acetate interactions. In particular, activity coefficients and association constants were studied in a range of concentrations (0.03, 0.1, and 1M). In addition, quantum-mechanics (QM) based energy decomposition analysis was performed in order to estimate the contribution of polarization, electrostatics, dispersion, and QM (non-classical) effects on the cation-acetate and cation-water interactions. Simulations of Li-acetate solutions in general overestimated the binding of Li+ and acetates. In lower concentrations, the activity coefficients of alkali-acetate solutions were too high, which is suggested to be due to the simulation protocol and not the forcefields. Energy decomposition analysis suggested that improvement of the forcefield parameters to enable accurate simulations of Li-acetate solutions can be achieved but may require the use of a polarizable forcefield. Importantly, simulations with some ion parameters could not reproduce the correct ion-oxygen distances, which calls for caution in the choice of ion parameters when protein simulations are performed in electrolyte solutions.

  3. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    NASA Astrophysics Data System (ADS)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.
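
    A hedged sketch of the master/worker pattern the abstract describes, with Python's multiprocessing standing in for the cloud cluster and its message passing interface; the per-history "physics" is a placeholder:

    ```python
    import multiprocessing as mp
    import random

    def worker(args):
        """Run an independent batch of Monte Carlo histories on one node."""
        n_histories, seed = args
        rng = random.Random(seed)                # independent stream per node
        tally = 0.0
        for _ in range(n_histories):
            tally += rng.expovariate(1.0)        # placeholder particle history
        return tally

    if __name__ == "__main__":
        n_nodes, total = 8, 1_000_000
        jobs = [(total // n_nodes, seed) for seed in range(n_nodes)]
        with mp.Pool(n_nodes) as pool:
            partials = pool.map(worker, jobs)    # scatter batches, gather tallies
        print("aggregate tally:", sum(partials))
    ```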

  4. Numerical simulation of a combined oxidation ditch flow using 3D k-epsilon turbulence model.

    PubMed

    Luo, Lin; Li, Wei-min; Deng, Yong-sen; Wang, Tao

    2005-01-01

    The standard three-dimensional (3D) k-epsilon turbulence model was applied to simulate the flow field of a small-scale combined oxidation ditch. The moving mesh approach was used to model the rotor of the ditch. Comparison of the computed and measured data shows acceptable agreement. A vertical reverse flow zone in the ditch was found, and it played a very important role in the ditch flow behavior. The flow pattern in the ditch is discussed in detail, and approaches are suggested to improve the hydrodynamic performance of the ditch.

  5. Electromagnetic Modelling of MMIC CPWs for High Frequency Applications

    NASA Astrophysics Data System (ADS)

    Sinulingga, E. P.; Kyabaggu, P. B. K.; Rezazadeh, A. A.

    2018-02-01

    Realising the theoretical electrical characteristics of components through modelling can be carried out using computer-aided design (CAD) simulation tools. If the simulation model provides the expected characteristics, the fabrication process of Monolithic Microwave Integrated Circuit (MMIC) can be performed for experimental verification purposes. Therefore improvements can be suggested before mass fabrication takes place. This research concentrates on development of MMIC technology by providing accurate predictions of the characteristics of MMIC components using an improved Electromagnetic (EM) modelling technique. The knowledge acquired from the modelling and characterisation process in this work can be adopted by circuit designers for various high frequency applications.

  6. A scalable parallel black oil simulator on distributed memory parallel computers

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Liu, Hui; Chen, Zhangxin

    2015-11-01

    This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
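
    A hedged sketch of the inexact Newton idea the simulator relies on: the Newton correction is obtained from only a few iterations of an iterative linear solver rather than an exact solve. The finite-difference Jacobian and the GMRES iteration cap are illustrative stand-ins for the preconditioned parallel solvers described:

    ```python
    import numpy as np
    from scipy.sparse.linalg import gmres

    def jacobian_fd(F, x, eps=1e-6):
        """Dense finite-difference Jacobian (a stand-in for analytic/sparse J)."""
        f0, n = F(x), x.size
        J = np.empty((n, n))
        for j in range(n):
            xp = x.copy(); xp[j] += eps
            J[:, j] = (F(xp) - f0) / eps
        return J

    def inexact_newton(F, x0, tol=1e-8, max_iter=50):
        x = np.asarray(x0, dtype=float).copy()
        for _ in range(max_iter):
            r = F(x)
            if np.linalg.norm(r) < tol:
                break
            J = jacobian_fd(F, x)
            dx, _ = gmres(J, -r, maxiter=5)   # few inner iterations: "inexact"
            x = x + dx
        return x

    # e.g. solve x0**2 + x1 = 3, x0 - x1 = 1
    print(inexact_newton(lambda x: np.array([x[0]**2 + x[1] - 3, x[0] - x[1] - 1]),
                         np.array([1.0, 0.0])))
    ```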

  7. Effect of computer game playing on baseline laparoscopic simulator skills.

    PubMed

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance have yielded conflicting results, and no relationship between computer game playing and baseline performance on laparoscopic simulators has been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. The setting was a local high school in Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and for prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.

  8. Problems with sampling desert tortoises: A simulation analysis based on field data

    USGS Publications Warehouse

    Freilich, J.E.; Camp, R.J.; Duda, J.J.; Karl, A.E.

    2005-01-01

    The desert tortoise (Gopherus agassizii) was listed as a U.S. threatened species in 1990 based largely on population declines inferred from mark-recapture surveys of 2.59-km2 (1-mi2) plots. Since then, several census methods have been proposed and tested, but all methods still pose logistical or statistical difficulties. We conducted computer simulations using actual tortoise location data from two 1-mi2 plot surveys in southern California, USA, to identify strengths and weaknesses of current sampling strategies. We considered tortoise population estimates based on these plots as "truth" and then tested various sampling methods based on sampling smaller plots or transect lines passing through the mile squares. Data were analyzed using Schnabel's mark-recapture estimate and program CAPTURE. Experimental subsampling with replacement of the 1-mi2 data using 1-km2 and 0.25-km2 plot boundaries produced data sets of smaller plot sizes, which we compared to estimates from the 1-mi2 plots. We also tested distance sampling by saturating a 1-mi2 site with computer-simulated transect lines, once again evaluating bias in density estimates. Subsampling estimates from 1-km2 plots did not differ significantly from the estimates derived at 1-mi2. The 0.25-km2 subsamples significantly overestimated population sizes, chiefly because too few recaptures were made. Distance sampling simulations were biased 80% of the time and had high coefficient of variation to density ratios. Furthermore, a prospective power analysis suggested limited ability to detect population declines as high as 50%. We concluded that the poor performance and bias of both sampling procedures were driven by insufficient sample size, suggesting that all efforts must be directed to increasing the numbers found in order to produce reliable results. Our results suggest that present methods may not be capable of accurately estimating desert tortoise populations.
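
    For reference, a hedged sketch of the Schnabel mark-recapture estimator used to score the simulated designs, N ≈ Σ(C_t M_t) / Σ R_t; the three-pass example data are made up:

    ```python
    def schnabel(catches, recaptures):
        """catches[t]: animals caught on pass t; recaptures[t]: of those, already marked."""
        marked, num, den = 0, 0.0, 0.0
        for c, r in zip(catches, recaptures):
            num += c * marked            # C_t * M_t, with M_t marked before pass t
            den += r
            marked += c - r              # newly marked animals join the pool
        return num / den if den else float("inf")

    print(schnabel(catches=[15, 18, 12], recaptures=[0, 4, 6]))  # about 62
    ```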

  9. Eigenvector method for umbrella sampling enables error analysis

    PubMed Central

    Thiede, Erik H.; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R.

    2016-01-01

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence. PMID:27586912
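
    A hedged sketch of the eigenvector step: window-to-window averages of normalized bias weights form a row-stochastic matrix F, whose stationary left eigenvector gives the window weights. The data layout (one array of bias weights per window) is an assumption for illustration:

    ```python
    import numpy as np

    def window_weights(psi):
        """psi[i]: (n_samples_i, L) bias weights of all L windows, evaluated
        at the samples drawn from window i."""
        L = len(psi)
        F = np.empty((L, L))
        for i, p in enumerate(psi):
            F[i] = (p / p.sum(axis=1, keepdims=True)).mean(axis=0)
        vals, vecs = np.linalg.eig(F.T)          # left eigenvectors of F
        z = np.real(vecs[:, np.argmax(np.real(vals))])
        return z / z.sum()                       # stationary weights, z F = z
    ```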

  10. Substellar fragmentation in self-gravitating fluids with a major phase transition

    NASA Astrophysics Data System (ADS)

    Füglistaler, A.; Pfenniger, D.

    2015-06-01

    Context. The observation of various ices in cold molecular clouds, the existence of ubiquitous substellar, cold H2 globules in planetary nebulae and supernova remnants, or the mere existence of comets suggest that the physics of very cold interstellar gas might be much richer than usually envisioned. At the extreme of low temperatures (≲10 K), H2 itself is subject to a phase transition crossing the entire cosmic gas density scale. Aims: This well-known, laboratory-based fact motivates us to study the ideal case of a cold neutral gaseous medium in interstellar conditions for which the bulk of the mass, instead of trace elements, is subject to a gas-liquid or gas-solid phase transition. Methods: On the one hand, the equilibrium of general non-ideal fluids is studied using the virial theorem and linear stability analysis. On the other hand, the non-linear dynamics is studied using computer simulations to characterize the expected formation of solid bodies analogous to comets. The simulations are run with a state-of-the-art molecular dynamics code (LAMMPS) using the Lennard-Jones inter-molecular potential. The long-range gravitational forces can be taken into account together with short-range molecular forces with limited computational resources, using super-molecules, provided the right scaling is followed. Results: The concept of super-molecule, where the phase transition conditions are preserved by the proper choice of the particle parameters, is tested with computer simulations, allowing us to correctly satisfy the Jeans instability criterion for one-phase fluids. The simulations show that fluids presenting a phase transition are gravitationally unstable as well, independent of the strength of the gravitational potential, producing two distinct kinds of substellar bodies, those dominated by gravity (planetoids) and those dominated by molecular attractive force (comets). Conclusions: Observations, formal analysis, and computer simulations suggest the possibility of the formation of substellar H2 clumps in cold molecular clouds due to the combination of phase transition and gravity. Fluids presenting a phase transition are gravitationally unstable, independent of the strength of the gravitational potential. Arbitrarily small H2 clumps may form even at relatively high temperatures up to 400-600 K, according to virial analysis. The combination of phase transition and gravity may be relevant for a wider range of astrophysical situations, such as proto-planetary disks.

  11. A Computer Simulation of Bacterial Growth During Food-Processing

    DTIC Science & Technology

    1974-11-01

    Technical report by Edward W. Ross, Jr., Army Natick Laboratories, Natick, Massachusetts, November 1974: a computer simulation of bacterial growth during food processing. Approved for public release.

  12. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  13. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  14. Filters for Improvement of Multiscale Data from Atomistic Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, David J.; Reynolds, Daniel R.

    Multiscale computational models strive to produce accurate and efficient numerical simulations of systems involving interactions across multiple spatial and temporal scales that typically differ by several orders of magnitude. Some such models utilize a hybrid continuum-atomistic approach combining continuum approximations with first-principles-based atomistic models to capture multiscale behavior. By following the heterogeneous multiscale method framework for developing multiscale computational models, unknown continuum scale data can be computed from an atomistic model. Concurrently coupling the two models requires performing numerous atomistic simulations which can dominate the computational cost of the method. Furthermore, when the resulting continuum data is noisy due to sampling error, stochasticity in the model, or randomness in the initial conditions, filtering can result in significant accuracy gains in the computed multiscale data without increasing the size or duration of the atomistic simulations. In this work, we demonstrate the effectiveness of spectral filtering for increasing the accuracy of noisy multiscale data obtained from atomistic simulations. Moreover, we present a robust and automatic method for closely approximating the optimum level of filtering in the case of additive white noise. Improving the accuracy of the filtered simulation data in this way leads to dramatic computational savings by allowing shorter and smaller atomistic simulations to achieve the same desired multiscale simulation precision.

  15. Filters for Improvement of Multiscale Data from Atomistic Simulations

    DOE PAGES

    Gardner, David J.; Reynolds, Daniel R.

    2017-01-05

    Multiscale computational models strive to produce accurate and efficient numerical simulations of systems involving interactions across multiple spatial and temporal scales that typically differ by several orders of magnitude. Some such models utilize a hybrid continuum-atomistic approach combining continuum approximations with first-principles-based atomistic models to capture multiscale behavior. By following the heterogeneous multiscale method framework for developing multiscale computational models, unknown continuum scale data can be computed from an atomistic model. Concurrently coupling the two models requires performing numerous atomistic simulations which can dominate the computational cost of the method. Furthermore, when the resulting continuum data is noisy due to sampling error, stochasticity in the model, or randomness in the initial conditions, filtering can result in significant accuracy gains in the computed multiscale data without increasing the size or duration of the atomistic simulations. In this work, we demonstrate the effectiveness of spectral filtering for increasing the accuracy of noisy multiscale data obtained from atomistic simulations. Moreover, we present a robust and automatic method for closely approximating the optimum level of filtering in the case of additive white noise. Improving the accuracy of the filtered simulation data in this way leads to dramatic computational savings by allowing shorter and smaller atomistic simulations to achieve the same desired multiscale simulation precision.
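
    A hedged sketch of spectral low-pass filtering for additive white noise: Fourier modes whose power sits near the flat noise floor are zeroed. The automatic cutoff rule below (noise floor estimated from the top half of the spectrum) is a simple assumption, not the paper's optimal-filter method:

    ```python
    import numpy as np

    def spectral_filter(signal, noise_margin=10.0):
        """Zero Fourier modes indistinguishable from the white-noise floor."""
        coeffs = np.fft.rfft(signal)
        power = np.abs(coeffs) ** 2
        floor = power[len(power) // 2:].mean()   # white noise: flat spectrum
        mask = power > noise_margin * floor
        return np.fft.irfft(coeffs * mask, n=len(signal))

    t = np.linspace(0.0, 1.0, 512)
    clean = np.sin(2 * np.pi * 5 * t)
    noisy = clean + 0.3 * np.random.default_rng(3).standard_normal(t.size)
    print("rms error:", np.sqrt(np.mean((spectral_filter(noisy) - clean) ** 2)))
    ```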

  16. An Investigation of Computer-based Simulations for School Crises Management.

    ERIC Educational Resources Information Center

    Degnan, Edward; Bozeman, William

    2001-01-01

    Describes development of a computer-based simulation program for training school personnel in crisis management. Addresses the data collection and analysis involved in developing a simulated event, the systems requirements for simulation, and a case study of application and use of the completed simulation. (Contains 21 references.) (Authors/PKP)

  17. Problem-Solving in the Pre-Clinical Curriculum: The Uses of Computer Simulations.

    ERIC Educational Resources Information Center

    Michael, Joel A.; Rovick, Allen A.

    1986-01-01

    Promotes the use of computer-based simulations in the pre-clinical medical curriculum as a means of providing students with opportunities for problem solving. Describes simple simulations of skeletal muscle loads, complex simulations of major organ systems and comprehensive simulation models of the entire human body. (TW)

  18. Computer-Assisted Face Processing Instruction Improves Emotion Recognition, Mentalizing, and Social Skills in Students with ASD.

    PubMed

    Rice, Linda Marie; Wall, Carla Anne; Fogel, Adam; Shic, Frederick

    2015-07-01

    This study examined the extent to which a computer-based social skills intervention called FaceSay was associated with improvements in affect recognition, mentalizing, and social skills of school-aged children with Autism Spectrum Disorder (ASD). FaceSay offers students simulated practice with eye gaze, joint attention, and facial recognition skills. This randomized controlled trial included school-aged children meeting educational criteria for autism (N = 31). Results demonstrated that participants who received the intervention improved their affect recognition and mentalizing skills, as well as their social skills. These findings suggest that, by targeting face-processing skills, computer-based interventions may produce changes in broader cognitive and social-skills domains in a cost- and time-efficient manner.

  19. Computer considerations for real time simulation of a generalized rotor model

    NASA Technical Reports Server (NTRS)

    Howe, R. M.; Fogarty, L. E.

    1977-01-01

    Scaled equations were developed to meet requirements for real time computer simulation of the rotor system research aircraft. These equations form the basis for consideration of both digital and hybrid mechanization for real time simulation. For all-digital simulation, estimates of the required speed in terms of equivalent operations per second are developed based on the complexity of the equations and the required integration frame rates. For both conventional hybrid simulation and hybrid simulation using time-shared analog elements, the amount of required equipment is estimated, along with a consideration of the dynamic errors. Conventional hybrid mechanization using analog simulation of those rotor equations which involve rotor-spin frequencies (these constitute the bulk of the equations) requires too much analog equipment. Hybrid simulation using time-sharing techniques for the analog elements appears possible with a reasonable amount of analog equipment. All-digital simulation with affordable general-purpose computers is not possible because of speed limitations, but specially configured digital computers do have the required speed and constitute the recommended approach.

  20. Computational Modeling of Inflammation and Wound Healing

    PubMed Central

    Ziraldo, Cordelia; Mi, Qi; An, Gary; Vodovotz, Yoram

    2013-01-01

    Objective Inflammation is both central to proper wound healing and a key driver of chronic tissue injury via a positive-feedback loop incited by incidental cell damage. We seek to derive actionable insights into the role of inflammation in wound healing in order to improve outcomes for individual patients. Approach To date, dynamic computational models have been used to study the time evolution of inflammation in wound healing. Emerging clinical data on histo-pathological and macroscopic images of evolving wounds, as well as noninvasive measures of blood flow, suggested the need for tissue-realistic, agent-based, and hybrid mechanistic computational simulations of inflammation and wound healing. Innovation We developed a computational modeling system, Simple Platform for Agent-based Representation of Knowledge, to facilitate the construction of tissue-realistic models. Results A hybrid equation–agent-based model (ABM) of pressure ulcer formation in both spinal cord-injured and -uninjured patients was used to identify control points that reduce stress caused by tissue ischemia/reperfusion. An ABM of arterial restenosis revealed new dynamics of cell migration during neointimal hyperplasia that match histological features, but contradict the currently prevailing mechanistic hypothesis. ABMs of vocal fold inflammation were used to predict inflammatory trajectories in individuals, possibly allowing for personalized treatment. Conclusions The intertwined inflammatory and wound healing responses can be modeled computationally to make predictions in individuals, simulate therapies, and gain mechanistic insights. PMID:24527362

  1. Detached-Eddy Simulations of Separated Flow Around Wings With Ice Accretions: Year One Report

    NASA Technical Reports Server (NTRS)

    Choo, Yung K. (Technical Monitor); Thompson, David; Mogili, Prasad

    2004-01-01

    A computational investigation was performed to assess the effectiveness of Detached-Eddy Simulation (DES) as a tool for predicting icing effects. The AVUS code, a public domain flow solver, was employed to compute solutions for an iced wing configuration using DES and steady Reynolds Averaged Navier-Stokes (RANS) equation methodologies. The configuration was an extruded GLC305/944-ice shape section with a rectangular planform. The model was mounted between two walls so no tip effects were considered. The numerical results were validated by comparison with experimental data for the same configuration. The time-averaged DES computations showed some improvement in lift and drag results near stall when compared to steady RANS results. However, comparisons of the flow field details did not show the level of agreement suggested by the integrated quantities. Based on our results, we believe that DES may prove useful in a limited sense to provide analysis of iced wing configurations when there is significant flow separation, e.g., near stall, where steady RANS computations are demonstrably ineffective. However, more validation is needed to determine what role DES can play as part of an overall icing effects prediction strategy. We conclude the report with an assessment of existing computational tools for application to the iced wing problem and a discussion of issues that merit further study.

  2. Could the heat sink effect of blood flow inside large vessels protect the vessel wall from thermal damage during RF-assisted surgical resection?

    PubMed

    González-Suárez, Ana; Trujillo, Macarena; Burdío, Fernando; Andaluz, Anna; Berjano, Enrique

    2014-08-01

    To assess by means of computer simulations whether the heat sink effect inside a large vessel (portal vein) could protect the vessel wall from thermal damage close to an internally cooled electrode during radiofrequency (RF)-assisted resection. First, in vivo experiments were conducted to validate the computational model by comparing the experimental and computational thermal lesion shapes created around the vessels. Computer simulations were then carried out to study the effect of different factors such as device-tissue contact, vessel position, and vessel-device distance on temperature distributions and thermal lesion shapes near a large vessel, specifically the portal vein. The geometries of thermal lesions around the vessels in the in vivo experiments were in agreement with the computer results. The thermal lesion shape created around the portal vein was significantly modified by the heat sink effect in all the cases considered. Thermal damage to the portal vein wall was inversely related to the vessel-device distance. It was also more pronounced when the device-tissue contact surface was reduced or when the vessel was parallel to the device or perpendicular to its distal end (blade zone), the vessel wall being damaged at distances less than 4.25 mm. The computational findings suggest that the heat sink effect could protect the portal vein wall for distances equal to or greater than 5 mm, regardless of its position and distance with respect to the RF-based device.

  3. The impact of home computer use on children's activities and development.

    PubMed

    Subrahmanyam, K; Kraut, R E; Greenfield, P M; Gross, E F

    2000-01-01

    The increasing amount of time children are spending on computers at home and school has raised questions about how the use of computer technology may make a difference in their lives--from helping with homework to causing depression to encouraging violent behavior. This article provides an overview of the limited research on the effects of home computer use on children's physical, cognitive, and social development. Initial research suggests, for example, that access to computers increases the total amount of time children spend in front of a television or computer screen at the expense of other activities, thereby putting them at risk for obesity. At the same time, cognitive research suggests that playing computer games can be an important building block to computer literacy because it enhances children's ability to read and visualize images in three-dimensional space and track multiple images simultaneously. The limited evidence available also indicates that home computer use is linked to slightly better academic performance. The research findings are more mixed, however, regarding the effects on children's social development. Although little evidence indicates that the moderate use of computers to play games has a negative impact on children's friendships and family relationships, recent survey data show that increased use of the Internet may be linked to increases in loneliness and depression. Of most concern are the findings that playing violent computer games may increase aggressiveness and desensitize a child to suffering, and that the use of computers may blur a child's ability to distinguish real life from simulation. The authors conclude that more systematic research is needed in these areas to help parents and policymakers maximize the positive effects and to minimize the negative effects of home computers in children's lives.

  4. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Kerner, H.; Weatherbee, J. E.; Taylor, D. S.; Hodges, B.

    1973-01-01

    A deterministic simulator is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Its use as a tool to study and determine the minimum computer system configuration necessary to satisfy the on-board computational requirements of a typical mission is presented. The paper describes how the computer system configuration is determined in order to satisfy the data processing demand of the various shuttle booster subsystems. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources.

  5. Computer Folding of RNA Tetraloops: Identification of Key Force Field Deficiencies.

    PubMed

    Kührová, Petra; Best, Robert B; Bottaro, Sandro; Bussi, Giovanni; Šponer, Jiří; Otyepka, Michal; Banáš, Pavel

    2016-09-13

    The computer-aided folding of biomolecules, particularly RNAs, is one of the most difficult challenges in computational structural biology. RNA tetraloops (TLs) are fundamental RNA motifs playing key roles in RNA folding and RNA-RNA and RNA-protein interactions. Although state-of-the-art Molecular Dynamics (MD) force fields correctly describe the native state of these tetraloops as a stable free-energy basin on the microsecond time scale, enhanced sampling techniques reveal that the native state is not the global free energy minimum, suggesting yet unidentified significant imbalances in the force fields. Here, we tested our ability to fold the RNA tetraloops in various force fields and simulation settings. We employed three different enhanced sampling techniques, namely, temperature replica exchange MD (T-REMD), replica exchange with solute tempering (REST2), and well-tempered metadynamics (WT-MetaD). We aimed to separate problems caused by limited sampling from those due to force-field inaccuracies. We found that none of the contemporary force fields is able to correctly describe folding of the 5'-GAGA-3' tetraloop over a range of simulation conditions. We thus aimed to identify which terms of the force field are responsible for this poor description of TL folding. We showed that at least two different imbalances contribute to this behavior, namely, overstabilization of base-phosphate and/or sugar-phosphate interactions and underestimated stability of the hydrogen bonding interaction in base pairing. The first artifact stabilizes the unfolded ensemble, while the second one destabilizes the folded state. The former problem might be partially alleviated by reparametrization of the van der Waals parameters of the phosphate oxygens suggested by Case et al., while in order to overcome the latter effect we suggest local potentials to better capture hydrogen bonding interactions.
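
    The replica-exchange step underlying T-REMD (and, with modified effective energies, REST2) accepts a swap between two replicas with a simple Metropolis criterion. A minimal sketch, assuming the instantaneous potential energies of both replicas are available (the numerical values below are hypothetical, in kcal/mol):

        import math
        import random

        def swap_probability(beta_i, beta_j, E_i, E_j):
            """Metropolis acceptance probability for exchanging the
            configurations of replicas i and j (beta = 1/kT)."""
            d = (beta_i - beta_j) * (E_i - E_j)
            return 1.0 if d >= 0.0 else math.exp(d)

        # cold replica (beta_i) holding a lower-energy state than the hot one
        p = swap_probability(beta_i=1.0, beta_j=0.8, E_i=-120.0, E_j=-112.0)
        accept = random.random() < p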

  6. Simulation Applications in Educational Leadership.

    ERIC Educational Resources Information Center

    Bozeman, William; Wright, Robert H.

    1995-01-01

    Explores the use of computer-based simulations using multimedia materials for a graduate course in school administration. Highlights include simulation applications in military and in business; educational simulations; the use of computers and other technology; production requirements and costs; and time required. (LRW)

  7. On the "Exchangeability" of Hands-On and Computer-Simulated Science Performance Assessments. CSE Technical Report.

    ERIC Educational Resources Information Center

    Rosenquist, Anders; Shavelson, Richard J.; Ruiz-Primo, Maria Araceli

    Inconsistencies in scores from computer-simulated and "hands-on" science performance assessments have led to questions about the exchangeability of these two methods in spite of the highly touted potential of computer-simulated performance assessment. This investigation considered possible explanations for students' inconsistent performances: (1)…

  8. Using PC Software To Enhance the Student's Ability To Learn the Exporting Process.

    ERIC Educational Resources Information Center

    Buckles, Tom A.; Lange, Irene

    This paper describes the advantages of using computer simulations in the classroom or managerial environment and the major premise and principal components of Export to Win!, a computer simulation used in international marketing seminars. A rationale for using computer simulations argues that they improve the quality of teaching by building…

  9. Unpacking Students' Conceptualizations through Haptic Feedback

    ERIC Educational Resources Information Center

    Magana, A. J.; Balachandran, S.

    2017-01-01

    While it is clear that the use of computer simulations has a beneficial effect on learning when compared to instruction without computer simulations, there is still room for improvement to fully realize their benefits for learning. Haptic technologies can fulfill the educational potential of computer simulations by adding the sense of touch.…

  10. Study of Near-Surface Models in Large-Eddy Simulations of a Neutrally Stratified Atmospheric Boundary Layer

    NASA Technical Reports Server (NTRS)

    Senocak, I.; Ackerman, A. S.; Kirkpatrick, M. P.; Stevens, D. E.; Mansour, N. N.

    2004-01-01

    Large-eddy simulation (LES) is a widely used technique in atmospheric modeling research. In LES, large, unsteady, three-dimensional structures are resolved, and small structures that are not resolved on the computational grid are modeled. A filtering operation is applied to distinguish between resolved and unresolved scales. We present two near-surface models that have found use in atmospheric modeling. We also suggest a simpler eddy viscosity model that adopts Prandtl's mixing length model (Prandtl 1925) in the vicinity of the surface and blends with the dynamic Smagorinsky model (Germano et al., 1991) away from the surface. We evaluate the performance of these surface models by simulating a neutrally stratified atmospheric boundary layer.
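
    The blended model described above can be summarized compactly; as a hedged sketch of the standard ingredients (the record does not give the authors' exact blending function):

        \nu_t = f(z)\, (\kappa z)^2 |S| + \big(1 - f(z)\big)\, C_d \Delta^2 |S|

    where (\kappa z)^2 |S| is the Prandtl mixing-length eddy viscosity near the surface (\kappa \approx 0.4), C_d \Delta^2 |S| is the dynamic Smagorinsky model with coefficient C_d computed from the Germano identity, |S| is the resolved strain-rate magnitude, and f(z) falls from 1 at the surface to 0 aloft.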

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, Benjamin S.; Hamilton, Steven P.; Jarrett, Michael G.

    This report describes the performance improvements made to the VERA Core Simulator (VERA-CS) during FY2016. The development of the VERA Core Simulator has focused on the capability needed to deplete physical reactors and help solve various problems; this capability required the accurate simulation of many operating cycles of a nuclear power plant. The first section of this report introduces two test problems used to assess the run-time performance of VERA-CS using a source dated February 2016. The next section provides a brief overview of the major modifications made to decrease the computational cost. Following the descriptions of the major improvements, the run-time for each improvement is shown. Conclusions on the work are presented, and further follow-on performance improvements are suggested.

  12. Telehealth innovations in health education and training.

    PubMed

    Conde, José G; De, Suvranu; Hall, Richard W; Johansen, Edward; Meglan, Dwight; Peng, Grace C Y

    2010-01-01

    Telehealth applications are increasingly important in many areas of health education and training. In addition, they will play a vital role in biomedical research and research training by facilitating remote collaborations and providing access to expensive/remote instrumentation. In order to fulfill their true potential to leverage education, training, and research activities, innovations in telehealth applications should be fostered across a range of technology fronts, including online, on-demand computational models for simulation; simplified interfaces for software and hardware; software frameworks for simulations; portable telepresence systems; artificial intelligence applications to be applied when simulated human patients are not options; and the development of more simulator applications. This article presents the results of discussion on potential areas of future development, barriers to overcome, and suggestions for translating the promise of telehealth applications into a transformed environment of training, education, and research in the health sciences.

  13. Simulation of existing gas-fuelled conventional steam power plant using Cycle Tempo

    NASA Astrophysics Data System (ADS)

    Jamel, M. S.; Abd Rahman, A.; Shamsuddin, A. H.

    2013-06-01

    Simulation of a 200 MW gas-fuelled conventional steam power plant located in Basra, Iraq was carried out. The thermodynamic performance of the considered power plant is estimated by a system simulation. A flow-sheet computer program, "Cycle-Tempo", is used for the study. The plant components and piping systems were considered and described in detail. The simulation results were verified against data gathered from the log sheet obtained from the station during its operation hours, and good results were obtained. Operational factors like the stack exhaust temperature and excess air percentage were studied and discussed, as were environmental factors such as ambient air temperature and water inlet temperature. In addition, detailed exergy losses were illustrated, and the temperature profiles for the main plant components were described. The results prompted many suggestions for improvement of the plant performance.

  14. Global two-fluid simulations of geodesic acoustic modes in strongly shaped tight aspect ratio tokamak plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, J. R.; Hnat, B.; Thyagaraja, A.

    2013-05-15

    Following recent observations suggesting the presence of the geodesic acoustic mode (GAM) in ohmically heated discharges in the Mega Amp Spherical Tokamak (MAST) [J. R. Robinson et al., Plasma Phys. Controlled Fusion 54, 105007 (2012)], the behaviour of the GAM is studied numerically using the two fluid, global code CENTORI [P. J. Knight et al., Comput. Phys. Commun. 183, 2346 (2012)]. We examine mode localisation and effects of magnetic geometry, given by aspect ratio, elongation, and safety factor, on the observed frequency of the mode. An excellent agreement between simulations and experimental data is found for simulation plasma parameters matched to those of MAST. Increasing aspect ratio yields good agreement between the GAM frequency found in the simulations and an analytical result obtained for elongated large aspect ratio plasmas.

  15. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    NASA Astrophysics Data System (ADS)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
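
    The fourth approach, embedding the problem specification in a host language, can be made concrete with a toy example. The class and parameter names below are hypothetical illustrations of the pattern, not the actual Python interface to OOMMF described in the paper:

        class Mesh:
            def __init__(self, p1, p2, cell):
                self.p1, self.p2, self.cell = p1, p2, cell  # corners and cell size (m)

        class Exchange:
            def __init__(self, A):
                self.A = A                  # exchange constant (J/m)

        class Zeeman:
            def __init__(self, H):
                self.H = H                  # applied field (A/m)

        class Simulation:
            def __init__(self, name, mesh, Ms, energies):
                self.name, self.mesh = name, mesh
                self.Ms, self.energies = Ms, energies  # Ms: saturation magnetisation (A/m)

            def run_until(self, t):
                # A real backend would translate this specification into an
                # OOMMF input file and launch the solver; here we only report it.
                print(f"{self.name}: integrating to t = {t:g} s with "
                      f"{len(self.energies)} energy terms")

        sim = Simulation(
            name="demo",
            mesh=Mesh(p1=(0, 0, 0), p2=(500e-9, 125e-9, 3e-9), cell=(5e-9, 5e-9, 3e-9)),
            Ms=8.0e5,
            energies=[Exchange(A=1.3e-11), Zeeman(H=(-1.9e4, 3.4e3, 0))],
        )
        sim.run_until(1e-9)

    Because the specification is ordinary Python, simulations can be parameterised, looped over, and version-controlled like any other code, which is the reproducibility argument the paper makes for embedded domain specific languages.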

  16. QCE: A Simulator for Quantum Computer Hardware

    NASA Astrophysics Data System (ADS)

    Michielsen, Kristel; de Raedt, Hans

    2003-09-01

    The Quantum Computer Emulator (QCE) described in this paper consists of a simulator of a generic, general purpose quantum computer and a graphical user interface. The latter is used to control the simulator, to define the hardware of the quantum computer and to debug and execute quantum algorithms. QCE runs in a Windows 98/NT/2000/ME/XP environment. It can be used to validate designs of physically realizable quantum processors and as an interactive educational tool to learn about quantum computers and quantum algorithms. A detailed exposition is given of the implementation of the CNOT and the Toffoli gate, the quantum Fourier transform, Grover's database search algorithm, an order finding algorithm, Shor's algorithm, a three-input adder and a number partitioning algorithm. We also review the results of simulations of an NMR-like quantum computer.
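
    At its core, a simulator of this kind represents the quantum state as a complex vector of length 2^n and applies gates as unitary matrices. A minimal numpy sketch (illustrative only, not QCE source code) that implements the CNOT mentioned above and prepares a Bell state:

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
        I = np.eye(2)
        CNOT = np.array([[1, 0, 0, 0],                 # control: qubit 0,
                         [0, 1, 0, 0],                 # target: qubit 1
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]])

        psi = np.zeros(4); psi[0] = 1.0                # start in |00>
        psi = np.kron(H, I) @ psi                      # Hadamard on qubit 0
        psi = CNOT @ psi                               # entangle the qubits
        print(psi)                                     # (|00> + |11>)/sqrt(2)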

  17. Impact of pharmacy automation on patient waiting time: an application of computer simulation.

    PubMed

    Tan, Woan Shin; Chua, Siang Li; Yong, Keng Woh; Wu, Tuck Seng

    2009-06-01

    This paper aims to illustrate the use of computer simulation in evaluating the impact of a prototype automated dispensing system on waiting time in an outpatient pharmacy and its potential as a routine tool in pharmacy management. A discrete event simulation model was developed to investigate the impact of a prototype automated dispensing system on operational efficiency and service standards in an outpatient pharmacy. The simulation results suggest that automating the prescription-filling function using a prototype that picks and packs at 20 seconds per item will not assist the pharmacy in achieving the waiting time target of 30 minutes for all patients. Regardless of the state of automation, to meet the waiting time target, 2 additional pharmacists are needed to overcome the process bottleneck at the point of medication dispensing. However, if automated dispensing is the preferred option, the speed of the system needs to be twice as fast as the current configuration to facilitate the reduction of the 95th percentile patient waiting time to below 30 minutes. The faster processing speed will concomitantly allow the pharmacy to reduce the number of pharmacy technicians from 11 to 8. Simulation was found to be a useful and low-cost method that allows an otherwise expensive and resource-intensive evaluation of new work processes and technology to be completed within a short time.
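
    Discrete event simulation advances the clock from event to event instead of in fixed time steps. A minimal sketch of a multi-server dispensing counter with Poisson arrivals and exponential service (the rates are hypothetical, not the study's calibrated model):

        import heapq
        import random

        def mean_wait(n_patients=20000, servers=2, lam=1.0, mu=0.6, seed=1):
            """Average waiting time at an M/M/c counter via event scheduling."""
            rng = random.Random(seed)
            events = [(rng.expovariate(lam), "arr")]    # (time, kind) heap
            queue, waits, free, served = [], [], servers, 0
            while served < n_patients:
                t, kind = heapq.heappop(events)
                if kind == "arr":
                    heapq.heappush(events, (t + rng.expovariate(lam), "arr"))
                    if free:                             # start service at once
                        free -= 1
                        waits.append(0.0)
                        heapq.heappush(events, (t + rng.expovariate(mu), "dep"))
                    else:
                        queue.append(t)                  # join the waiting line
                else:                                    # a departure
                    served += 1
                    if queue:
                        waits.append(t - queue.pop(0))   # waited since arrival
                        heapq.heappush(events, (t + rng.expovariate(mu), "dep"))
                    else:
                        free += 1
            return sum(waits) / len(waits)

        print(mean_wait())

    Doubling the service rate or adding a server in such a model is a one-line change, which is what makes simulation attractive for the what-if questions the paper evaluates.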

  18. Uterus models for use in virtual reality hysteroscopy simulators.

    PubMed

    Niederer, Peter; Weiss, Stephan; Caduff, Rosmarie; Bajka, Michael; Szekély, Gabor; Harders, Matthias

    2009-05-01

    Virtual reality models of human organs are needed in surgery simulators which are developed for educational and training purposes. A simulation can only be useful, however, if the mechanical performance of the system in terms of force-feedback for the user as well as the visual representation is realistic. We therefore aim at developing a mechanical computer model of the organ in question which yields realistic force-deformation behavior under virtual instrument-tissue interactions and which, in particular, runs in real time. The modeling of the human uterus is described as it is to be implemented in a simulator for minimally invasive gynecological procedures. To this end, anatomical information which was obtained from specially designed computed tomography and magnetic resonance imaging procedures as well as constitutive tissue properties recorded from mechanical testing were used. In order to achieve real-time performance, the combination of mechanically realistic numerical uterus models of various levels of complexity with a statistical deformation approach is suggested. In view of mechanical accuracy of such models, anatomical characteristics including the fiber architecture along with the mechanical deformation properties are outlined. In addition, an approach to make this numerical representation potentially usable in an interactive simulation is discussed. The numerical simulation of hydrometra is shown in this communication. The results were validated experimentally. In order to meet the real-time requirements and to accommodate the large biological variability associated with the uterus, a statistical modeling approach is demonstrated to be useful.

  19. [The research on bidirectional reflectance computer simulation of forest canopy at pixel scale].

    PubMed

    Song, Jin-Ling; Wang, Jin-Di; Shuai, Yan-Min; Xiao, Zhi-Qiang

    2009-08-01

    Computer simulation uses computer graphics to generate a realistic 3D structural scene of vegetation and simulates the canopy radiation regime using the radiosity method. In the present paper, the authors extend the computer simulation model to forest canopy bidirectional reflectance at pixel scale. Trees, however, are complex structures, tall and many-branched, so hundreds of thousands or even millions of facets are typically needed to build a realistic structural scene for a forest, and it is difficult for the radiosity method to compute so many facets. To enable the radiosity method to simulate a forest scene at pixel scale, the authors propose simplifying the structure of forest crowns by abstracting them as ellipsoids; based on the optical characteristics of the tree components and of internal photon transport in a real crown, optical properties are assigned to the ellipsoid surface facets. In the forest simulation, following the idea of geometric-optics models, a gap model is incorporated to obtain the forest canopy bidirectional reflectance at pixel scale. The simulation results agree with both the GOMS model and Multi-angle Imaging SpectroRadiometer (MISR) multi-angle remote sensing data (BRF), although some problems remain to be solved. The authors conclude that the study has important value for the application of multi-angle remote sensing and the inversion of vegetation canopy structure parameters.

  20. Comparison of Building Loads Analysis and System Thermodynamics (BLAST) Computer Program Simulations and Measured Energy Use for Army Buildings.

    DTIC Science & Technology

    1980-05-01

    ...Building Loads Analysis and System Thermodynamics (BLAST) computer program. A dental clinic and a battalion headquarters and classroom building were... Building and HVAC System Data; Computer Simulation; Comparison of Actual and Simulated Results; Analysis and Findings.

  1. The Ghost of Computers Past, Present, and Future: Computer Use for Preservice/Inservice Reading Programs.

    ERIC Educational Resources Information Center

    Prince, Amber T.

    Computer assisted instruction, and especially computer simulations, can help to ensure that preservice and inservice teachers learn from the right experiences. In the past, colleges of education used large mainframe computer systems to store student registration, provide simulation lessons on diagnosing reading difficulties, construct informal…

  2. Parallel computing method for simulating hydrological processesof large rivers under climate change

    NASA Astrophysics Data System (ADS)

    Wang, H.; Chen, Y.

    2016-12-01

    Climate change is one of the most widely recognized global environmental problems. It has altered the spatial and temporal distribution of watershed hydrological processes, especially in the world's large rivers. Watershed hydrological process simulation based on physically based distributed hydrological models can give better results than lumped models. However, such simulation involves a very large amount of calculation, especially for large rivers, and thus needs huge computing resources that may not be steadily available to researchers, or only at high expense; this has seriously restricted research and application. Current parallel methods mostly parallelize in the space and time dimensions, calculating the natural features of the distributed hydrological model in order, by grid (unit, sub-basin) from upstream to downstream. This article proposes a high-performance computing method for hydrological process simulation with a high speedup ratio and parallel efficiency. It combines the temporal and spatial runoff characteristics of the distributed hydrological model with distributed data storage, in-memory databases, distributed computing, and parallel computing based on computing power units. The method is highly adaptable and extensible: it makes full use of the available computing and storage resources under the condition of limited computing resources, and its computing efficiency improves linearly as computing resources increase. The method can satisfy the parallel computing requirements of hydrological process simulation in small, medium, and large rivers.
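
    One common realisation of this idea is to sweep the river network level by level, so that all sub-basins within a level depend only on already-computed upstream results and can run concurrently. A minimal sketch with a process pool (the routing kernel and basin graph below are hypothetical placeholders):

        from multiprocessing import Pool

        def route_subbasin(args):
            basin_id, inflow = args
            # stand-in for the real rainfall-runoff and channel-routing kernel
            return basin_id, inflow + 1.0               # hypothetical outflow (m^3/s)

        def simulate(levels, upstream, processes=4):
            """levels: lists of basin ids ordered upstream to downstream;
            upstream: basin id -> list of basins draining into it."""
            outflow = {}
            with Pool(processes) as pool:
                for level in levels:                    # levels run in sequence...
                    tasks = [(b, sum(outflow[u] for u in upstream[b]))
                             for b in level]
                    for b, q in pool.map(route_subbasin, tasks):  # ...basins in parallel
                        outflow[b] = q
            return outflow

        if __name__ == "__main__":                      # required for multiprocessing
            levels = [["A", "B"], ["C"]]                # A and B drain into C
            upstream = {"A": [], "B": [], "C": ["A", "B"]}
            print(simulate(levels, upstream))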

  3. Computer simulation of space charge

    NASA Astrophysics Data System (ADS)

    Yu, K. W.; Chung, W. K.; Mak, S. S.

    1991-05-01

    Using the particle-mesh (PM) method, a one-dimensional simulation of the well-known Langmuir-Child law is performed on an INTEL 80386-based personal computer system. The program is coded in Turbo Basic (trademark of Borland International, Inc.). The numerical results obtained were in excellent agreement with theoretical predictions, and the computational time required was quite modest. This simulation exercise demonstrates that some simple computer simulations using particles may be implemented successfully on the PCs available today, and hopefully this will provide the necessary incentive for newcomers to the field who wish to acquire a flavor of the elementary aspects of the practice.
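
    The analytic benchmark for such a run is the space-charge-limited current density, which the particle-mesh result should approach; a few lines of Python reproduce it (standard physical constants; the 1 kV, 1 cm gap is an arbitrary example, not taken from the paper):

        import math

        EPS0 = 8.854e-12      # vacuum permittivity (F/m)
        Q_E  = 1.602e-19      # elementary charge (C)
        M_E  = 9.109e-31      # electron mass (kg)

        def child_langmuir_j(V, d):
            """Space-charge-limited current density (A/m^2) across a planar
            diode with gap voltage V (volts) and spacing d (metres)."""
            return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * Q_E / M_E) * V**1.5 / d**2

        print(child_langmuir_j(1000.0, 0.01))   # roughly 7.4e2 A/m^2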

  4. Computer simulation results of attitude estimation of earth orbiting satellites

    NASA Technical Reports Server (NTRS)

    Kou, S. R.

    1976-01-01

    Computer simulation results of attitude estimation of Earth-orbiting satellites (including the Space Telescope) subjected to environmental disturbances and noises are presented. A decomposed linear recursive filter and a Kalman filter were used as estimation tools. Six programs were developed for this simulation; all were written in BASIC and run on HP 9830A and HP 9866A computers. Simulation results show that the decomposed linear recursive filter is accurate in estimation and fast in response time. Furthermore, for higher order systems, this filter has computational advantages (i.e., fewer integration errors and roundoff errors) over a Kalman filter.
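
    The recursive predict/update structure shared by both filters is easiest to see in one dimension. A minimal sketch (illustrative constants, not the satellite attitude model of the report):

        def kalman_step(x, P, z, F=1.0, Q=1e-4, H=1.0, R=1e-2):
            """One predict/update cycle for a scalar state.
            x, P: previous estimate and variance; z: new measurement."""
            x_pred = F * x                          # predict state
            P_pred = F * P * F + Q                  # predict variance
            K = P_pred * H / (H * P_pred * H + R)   # Kalman gain
            x_new = x_pred + K * (z - H * x_pred)   # correct with measurement
            P_new = (1.0 - K * H) * P_pred
            return x_new, P_new

        x, P = 0.0, 1.0
        for z in (0.9, 1.1, 1.05, 0.98):            # noisy attitude readings
            x, P = kalman_step(x, P, z)
        print(x, P)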

  5. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can easily be extended to provide a general capability for air flow analyses in any modern computer room.

  6. Computer Simulation of the Circulation Subsystem of a Library

    ERIC Educational Resources Information Center

    Shaw, W. M., Jr.

    1975-01-01

    When circulation data are used as input parameters for a computer simulation of a library's circulation subsystem, the results of the simulation provide information on book availability and delays. The model may be used to simulate alternative loan policies. (Author/LS)

  7. Progress in Unsteady Turbopump Flow Simulations

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin C.; Chan, William; Kwak, Dochan; Williams, Robert

    2002-01-01

    This viewgraph presentation discusses unsteady flow simulations for a turbopump intended for a reusable launch vehicle (RLV). The simulation process makes use of computational grids and parallel processing. The architecture of the parallel computers used is discussed, as is the scripting of turbopump simulations.

  8. Exploration of factors that affect the comparative effectiveness of physical and virtual manipulatives in an undergraduate laboratory

    NASA Astrophysics Data System (ADS)

    Chini, Jacquelyn J.; Madsen, Adrian; Gire, Elizabeth; Rebello, N. Sanjay; Puntambekar, Sadhana

    2012-06-01

    Recent research results have failed to support the conventionally held belief that students learn physics best from hands-on experiences with physical equipment. Rather, studies have found that students who perform similar experiments with computer simulations perform as well or better on measures of conceptual understanding than their peers who used physical equipment. In this study, we explored how university-level nonscience majors’ understanding of the physics concepts related to pulleys was supported by experimentation with real pulleys and a computer simulation of pulleys. We report that when students use one type of manipulative (physical or virtual), the comparison is influenced both by the concept studied and the timing of the post-test. Students performed similarly on questions related to force and mechanical advantage regardless of the type of equipment used. On the other hand, students who used the computer simulation performed better on questions related to work immediately after completing the activities; however, the two groups performed similarly on the work questions on a test given one week later. Additionally, both sequences of experimentation (physical-virtual and virtual-physical) equally supported students’ understanding of all of the concepts. These results suggest that both the concept learned and the stability of learning gains should continue to be explored to improve educators’ ability to select the best learning experience for a given topic.

  9. Older People’s Perceptions of Pedestrian Friendliness and Traffic Safety: An Experiment Using Computer-Simulated Walking Environments

    PubMed Central

    Kahlert, Daniela; Schlicht, Wolfgang

    2015-01-01

    Traffic safety and pedestrian friendliness are considered to be important conditions for older people’s motivation to walk through their environment. This study uses an experimental design with computer-simulated living environments to investigate the effect of micro-scale environmental factors (parking spaces and green verges with trees) on older people’s perceptions of both motivational antecedents (dependent variables). Seventy-four consecutively recruited older people were randomly assigned to watch one of two scenarios (independent variable) on a computer screen. The scenarios simulated a stroll on a sidewalk, as is ‘typical’ for a German city. In version ‘A,’ the subjects take a fictive walk on a sidewalk where a number of cars are parked partially on it. In version ‘B,’ cars are in parking spaces separated from the sidewalk by grass verges and trees. Subjects assessed their impressions of both dependent variables. A multivariate analysis of covariance showed that subjects’ ratings of perceived traffic safety and pedestrian friendliness were higher for version ‘B’ than for version ‘A.’ Cohen’s d indicates medium (d = 0.73) and large (d = 1.23) effect sizes for traffic safety and pedestrian friendliness, respectively. The study suggests that elements of the built environment might affect motivational antecedents of older people’s walking behavior. PMID:26308026

  10. Experimental investigations, modeling, and analyses of high-temperature devices for space applications: Part 1. Final report, June 1996--December 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tournier, J.; El-Genk, M.S.; Huang, L.

    1999-01-01

    The Institute of Space and Nuclear Power Studies at the University of New Mexico has developed a computer simulation of cylindrical geometry alkali metal thermal-to-electric converter cells using a standard Fortran 77 computer code. The objective and use of this code was to compare the experimental measurements with computer simulations, upgrade the model as appropriate, and conduct investigations of various methods to improve the design and performance of the devices for improved efficiency, durability, and longer operational lifetime. The Institute of Space and Nuclear Power Studies participated in vacuum testing of PX series alkali metal thermal-to-electric converter cells and developed the alkali metal thermal-to-electric converter Performance Evaluation and Analysis Model. This computer model consisted of a sodium pressure loss model, a cell electrochemical and electric model, and a radiation/conduction heat transfer model. The code closely predicted the operation and performance of a wide variety of PX series cells, which led to suggestions for improvements to both lifetime and performance. The code provides valuable insight into the operation of the cell, predicts parameters of components within the cell, and is a useful tool for predicting both the transient and steady state performance of systems of cells.

  11. Experimental investigations, modeling, and analyses of high-temperature devices for space applications: Part 2. Final report, June 1996--December 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tournier, J.; El-Genk, M.S.; Huang, L.

    1999-01-01

    The Institute of Space and Nuclear Power Studies at the University of New Mexico has developed a computer simulation of cylindrical geometry alkali metal thermal-to-electric converter cells using a standard Fortran 77 computer code. The objective and use of this code was to compare the experimental measurements with computer simulations, upgrade the model as appropriate, and conduct investigations of various methods to improve the design and performance of the devices for improved efficiency, durability, and longer operational lifetime. The Institute of Space and Nuclear Power Studies participated in vacuum testing of PX series alkali metal thermal-to-electric converter cells and developed the alkali metal thermal-to-electric converter Performance Evaluation and Analysis Model. This computer model consisted of a sodium pressure loss model, a cell electrochemical and electric model, and a radiation/conduction heat transfer model. The code closely predicted the operation and performance of a wide variety of PX series cells, which led to suggestions for improvements to both lifetime and performance. The code provides valuable insight into the operation of the cell, predicts parameters of components within the cell, and is a useful tool for predicting both the transient and steady state performance of systems of cells.

  12. Computational Knee Ligament Modeling Using Experimentally Determined Zero-Load Lengths

    PubMed Central

    Bloemker, Katherine H; Guess, Trent M; Maletsky, Lorin; Dodd, Kevin

    2012-01-01

    This study presents a subject-specific method of determining the zero-load lengths of the cruciate and collateral ligaments in computational knee modeling. Three cadaver knees were tested in a dynamic knee simulator. The cadaver knees also underwent manual envelope-of-motion testing to find their passive range of motion in order to determine the zero-load lengths for each ligament bundle. Computational multibody knee models were created for each knee, and model kinematics were compared to experimental kinematics for a simulated walk cycle. One-dimensional non-linear spring-damper elements were used to represent cruciate and collateral ligament bundles in the knee models. This study found that knee kinematics were highly sensitive to changes in the zero-load length. The results also suggest optimal methods for defining each of the ligament bundle zero-load lengths, regardless of the subject. These results verify the importance of the zero-load length when modeling the knee joint and verify that manual envelope-of-motion measurements can be used to determine the passive range of motion of the knee joint. It is also believed that the method described here for determining zero-load length can be used for in vitro or in vivo subject-specific computational models. PMID:22523522
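
    The zero-load length enters such elements through the bundle strain. One widely used piecewise force law for one-dimensional ligament bundles has a quadratic "toe" region blending into a linear region (a generic formulation with hypothetical stiffness values, not necessarily the exact element used in this study):

        def ligament_tension(L, L0, k=1000.0, eps_l=0.03):
            """Tension (N) in a one-dimensional ligament bundle.
            L: current length (m); L0: zero-load length (m);
            k: linear stiffness (N); eps_l: toe-region strain parameter."""
            eps = (L - L0) / L0                    # engineering strain
            if eps <= 0.0:
                return 0.0                         # no resistance to slackening
            if eps <= 2.0 * eps_l:
                return 0.25 * k * eps**2 / eps_l   # quadratic toe region
            return k * (eps - eps_l)               # linear region

        print(ligament_tension(1.08, 1.0))         # 8% strain -> linear region

    The reported sensitivity follows directly from this form: a small change in L0 shifts the strain at every joint pose, and hence the force contributed by every bundle.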

  13. One-dimensional collision carts computer model and its design ideas for productive experiential learning

    NASA Astrophysics Data System (ADS)

    Wee, Loo Kang

    2012-05-01

    We develop an Easy Java Simulation (EJS) model for students to experience the physics of idealized one-dimensional collision carts. The physics model is described and simulated by both continuous dynamics and a discrete transition during collision. In designing the simulations, we briefly discuss three pedagogical considerations, namely (1) a consistent simulation world view with a pen-and-paper representation, (2) a data table, scientific graphs and symbolic mathematical representations for ease of data collection and multiple representational visualizations and (3) a game for simple concept testing that can further support learning. We also suggest using a physical world setup augmented by simulation, highlighting three advantages of real collision cart equipment: a tacit 3D experience, random errors in measurement, and the conceptual significance of conservation of momentum applied just before and after collision. General feedback from the students has been relatively positive, and we hope teachers will find the simulation useful in their own classes.
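
    The discrete transition at contact reduces to the standard momentum/restitution relations; a compact sketch (e is the coefficient of restitution, 1 for elastic and 0 for perfectly inelastic collisions):

        def collide(m1, v1, m2, v2, e=1.0):
            """Post-collision velocities of two carts on a 1D track."""
            p = m1 * v1 + m2 * v2                       # total momentum
            v1p = (p + m2 * e * (v2 - v1)) / (m1 + m2)
            v2p = (p + m1 * e * (v1 - v2)) / (m1 + m2)
            return v1p, v2p

        print(collide(1.0, 2.0, 1.0, 0.0))          # elastic, equal masses: (0.0, 2.0)
        print(collide(1.0, 2.0, 1.0, 0.0, e=0.0))   # perfectly inelastic: (1.0, 1.0)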

  14. Physics Computing '92: Proceedings of the 4th International Conference

    NASA Astrophysics Data System (ADS)

    de Groot, Robert A.; Nadrchal, Jaroslav

    1993-04-01

    The Table of Contents for the book is as follows: * Preface * INVITED PAPERS * Ab Initio Theoretical Approaches to the Structural, Electronic and Vibrational Properties of Small Clusters and Fullerenes: The State of the Art * Neural Multigrid Methods for Gauge Theories and Other Disordered Systems * Multicanonical Monte Carlo Simulations * On the Use of the Symbolic Language Maple in Physics and Chemistry: Several Examples * Nonequilibrium Phase Transitions in Catalysis and Population Models * Computer Algebra, Symmetry Analysis and Integrability of Nonlinear Evolution Equations * The Path-Integral Quantum Simulation of Hydrogen in Metals * Digital Optical Computing: A New Approach of Systolic Arrays Based on Coherence Modulation of Light and Integrated Optics Technology * Molecular Dynamics Simulations of Granular Materials * Numerical Implementation of a K.A.M. Algorithm * Quasi-Monte Carlo, Quasi-Random Numbers and Quasi-Error Estimates * What Can We Learn from QMC Simulations * Physics of Fluctuating Membranes * Plato, Apollonius, and Klein: Playing with Spheres * Steady States in Nonequilibrium Lattice Systems * CONVODE: A REDUCE Package for Differential Equations * Chaos in Coupled Rotators * Symplectic Numerical Methods for Hamiltonian Problems * Computer Simulations of Surfactant Self Assembly * High-dimensional and Very Large Cellular Automata for Immunological Shape Space * A Review of the Lattice Boltzmann Method * Electronic Structure of Solids in the Self-interaction Corrected Local-spin-density Approximation * Dedicated Computers for Lattice Gauge Theory Simulations * Physics Education: A Survey of Problems and Possible Solutions * Parallel Computing and Electronic-Structure Theory * High Precision Simulation Techniques for Lattice Field Theory * CONTRIBUTED PAPERS * Case Study of Microscale Hydrodynamics Using Molecular Dynamics and Lattice Gas Methods * Computer Modelling of the Structural and Electronic Properties of the Supported Metal Catalysis * Ordered Particle Simulations for Serial and MIMD Parallel Computers * "NOLP" -- Program Package for Laser Plasma Nonlinear Optics * Algorithms to Solve Nonlinear Least Square Problems * Distribution of Hydrogen Atoms in Pd-H Computed by Molecular Dynamics * A Ray Tracing of Optical System for Protein Crystallography Beamline at Storage Ring-SIBERIA-2 * Vibrational Properties of a Pseudobinary Linear Chain with Correlated Substitutional Disorder * Application of the Software Package Mathematica in Generalized Master Equation Method * Linelist: An Interactive Program for Analysing Beam-foil Spectra * GROMACS: A Parallel Computer for Molecular Dynamics Simulations * GROMACS Method of Virial Calculation Using a Single Sum * The Interactive Program for the Solution of the Laplace Equation with the Elimination of Singularities for Boundary Functions * Random-Number Generators: Testing Procedures and Comparison of RNG Algorithms * Micro-TOPIC: A Tokamak Plasma Impurities Code * Rotational Molecular Scattering Calculations * Orthonormal Polynomial Method for Calibrating of Cryogenic Temperature Sensors * Frame-based System Representing Basis of Physics * The Role of Massively Data-parallel Computers in Large Scale Molecular Dynamics Simulations * Short-range Molecular Dynamics on a Network of Processors and Workstations * An Algorithm for Higher-order Perturbation Theory in Radiative Transfer Computations * Hydrostochastics: The Master Equation Formulation of Fluid Dynamics * HPP Lattice Gas on Transputers and Networked Workstations 
* Study on the Hysteresis Cycle Simulation Using Modeling with Different Functions on Intervals * Refined Pruning Techniques for Feed-forward Neural Networks * Random Walk Simulation of the Motion of Transient Charges in Photoconductors * The Optical Hysteresis in Hydrogenated Amorphous Silicon * Diffusion Monte Carlo Analysis of Modern Interatomic Potentials for He * A Parallel Strategy for Molecular Dynamics Simulations of Polar Liquids on Transputer Arrays * Distribution of Ions Reflected on Rough Surfaces * The Study of Step Density Distribution During Molecular Beam Epitaxy Growth: Monte Carlo Computer Simulation * Towards a Formal Approach to the Construction of Large-scale Scientific Applications Software * Correlated Random Walk and Discrete Modelling of Propagation through Inhomogeneous Media * Teaching Plasma Physics Simulation * A Theoretical Determination of the Au-Ni Phase Diagram * Boson and Fermion Kinetics in One-dimensional Lattices * Computational Physics Course on the Technical University * Symbolic Computations in Simulation Code Development and Femtosecond-pulse Laser-plasma Interaction Studies * Computer Algebra and Integrated Computing Systems in Education of Physical Sciences * Coordinated System of Programs for Undergraduate Physics Instruction * Program Package MIRIAM and Atomic Physics of Extreme Systems * High Energy Physics Simulation on the T_Node * The Chapman-Kolmogorov Equation as Representation of Huygens' Principle and the Monolithic Self-consistent Numerical Modelling of Lasers * Authoring System for Simulation Developments * Molecular Dynamics Study of Ion Charge Effects in the Structure of Ionic Crystals * A Computational Physics Introductory Course * Computer Calculation of Substrate Temperature Field in MBE System * Multimagnetical Simulation of the Ising Model in Two and Three Dimensions * Failure of the CTRW Treatment of the Quasicoherent Excitation Transfer * Implementation of a Parallel Conjugate Gradient Method for Simulation of Elastic Light Scattering * Algorithms for Study of Thin Film Growth * Algorithms and Programs for Physics Teaching in Romanian Technical Universities * Multicanonical Simulation of 1st order Transitions: Interface Tension of the 2D 7-State Potts Model * Two Numerical Methods for the Calculation of Periodic Orbits in Hamiltonian Systems * Chaotic Behavior in a Probabilistic Cellular Automata? 
* Wave Optics Computing by a Networked-based Vector Wave Automaton * Tensor Manipulation Package in REDUCE * Propagation of Electromagnetic Pulses in Stratified Media * The Simple Molecular Dynamics Model for the Study of Thermalization of the Hot Nucleon Gas * Electron Spin Polarization in PdCo Alloys Calculated by KKR-CPA-LSD Method * Simulation Studies of Microscopic Droplet Spreading * A Vectorizable Algorithm for the Multicolor Successive Overrelaxation Method * Tetragonality of the CuAu I Lattice and Its Relation to Electronic Specific Heat and Spin Susceptibility * Computer Simulation of the Formation of Metallic Aggregates Produced by Chemical Reactions in Aqueous Solution * Scaling in Growth Models with Diffusion: A Monte Carlo Study * The Nucleus as the Mesoscopic System * Neural Network Computation as Dynamic System Simulation * First-principles Theory of Surface Segregation in Binary Alloys * Data Smooth Approximation Algorithm for Estimating the Temperature Dependence of the Ice Nucleation Rate * Genetic Algorithms in Optical Design * Application of 2D-FFT in the Study of Molecular Exchange Processes by NMR * Advanced Mobility Model for Electron Transport in P-Si Inversion Layers * Computer Simulation for Film Surfaces and its Fractal Dimension * Parallel Computation Techniques and the Structure of Catalyst Surfaces * Educational SW to Teach Digital Electronics and the Corresponding Text Book * Primitive Trinomials (Mod 2) Whose Degree is a Mersenne Exponent * Stochastic Modelisation and Parallel Computing * Remarks on the Hybrid Monte Carlo Algorithm for the φ4 Model * An Experimental Computer Assisted Workbench for Physics Teaching * A Fully Implicit Code to Model Tokamak Plasma Edge Transport * EXPFIT: An Interactive Program for Automatic Beam-foil Decay Curve Analysis * Mapping Technique for Solving General, 1-D Hamiltonian Systems * Freeway Traffic, Cellular Automata, and Some (Self-Organizing) Criticality * Photonuclear Yield Analysis by Dynamic Programming * Incremental Representation of the Simply Connected Planar Curves * Self-convergence in Monte Carlo Methods * Adaptive Mesh Technique for Shock Wave Propagation * Simulation of Supersonic Coronal Streams and Their Interaction with the Solar Wind * The Nature of Chaos in Two Systems of Ordinary Nonlinear Differential Equations * Considerations of a Window-shopper * Interpretation of Data Obtained by RTP 4-Channel Pulsed Radar Reflectometer Using a Multi Layer Perceptron * Statistics of Lattice Bosons for Finite Systems * Fractal Based Image Compression with Affine Transformations * Algorithmic Studies on Simulation Codes for Heavy-ion Reactions * An Energy-Wise Computer Simulation of DNA-Ion-Water Interactions Explains the Abnormal Structure of Poly[d(A)]:Poly[d(T)] * Computer Simulation Study of Kosterlitz-Thouless-Like Transitions * Problem-oriented Software Package GUN-EBT for Computer Simulation of Beam Formation and Transport in Technological Electron-Optical Systems * Parallelization of a Boundary Value Solver and its Application in Nonlinear Dynamics * The Symbolic Classification of Real Four-dimensional Lie Algebras * Short, Singular Pulses Generation by a Dye Laser at Two Wavelengths Simultaneously * Quantum Monte Carlo Simulations of the Apex-Oxygen-Model * Approximation Procedures for the Axial Symmetric Static Einstein-Maxwell-Higgs Theory * Crystallization on a Sphere: Parallel Simulation on a Transputer Network * FAMULUS: A Software Product (also) for Physics Education * MathCAD vs.
FAMULUS -- A Brief Comparison * First-principles Dynamics Used to Study Dissociative Chemisorption * A Computer Controlled System for Crystal Growth from Melt * A Time Resolved Spectroscopic Method for Short Pulsed Particle Emission * Green's Function Computation in Radiative Transfer Theory * Random Search Optimization Technique for One-criteria and Multi-criteria Problems * Hartley Transform Applications to Thermal Drift Elimination in Scanning Tunneling Microscopy * Algorithms of Measuring, Processing and Interpretation of Experimental Data Obtained with Scanning Tunneling Microscope * Time-dependent Atom-surface Interactions * Local and Global Minima on Molecular Potential Energy Surfaces: An Example of N3 Radical * Computation of Bifurcation Surfaces * Symbolic Computations in Quantum Mechanics: Energies in Next-to-solvable Systems * A Tool for RTP Reactor and Lamp Field Design * Modelling of Particle Spectra for the Analysis of Solid State Surface * List of Participants

  15. Reconsidering Simulations in Science Education at a Distance: Features of Effective Use

    ERIC Educational Resources Information Center

    Blake, C.; Scanlon, E.

    2007-01-01

    This paper proposes a reconsideration of the use of computer simulations in science education. We discuss three studies of the use of science simulations for undergraduate distance learning students. The first, "The Driven Pendulum," is a computer-based experiment on the behaviour of a pendulum. The second simulation, "Evolve," is…

  16. Evaluation of Rankine cycle air conditioning system hardware by computer simulation

    NASA Technical Reports Server (NTRS)

    Healey, H. M.; Clark, D.

    1978-01-01

    A computer program for simulating the performance of a variety of solar-powered Rankine cycle air conditioning system (RCACS) components has been developed. The computer program models actual equipment by developing performance maps from manufacturers' data and is capable of simulating off-design operation of the RCACS components. The program, designed to be a subroutine of the Marshall Space Flight Center (MSFC) Solar Energy System Analysis Computer Program 'SOLRAD', is a complete package suitable for use by an occasional computer user in developing performance maps of heating, ventilation and air conditioning components.
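
    A performance map of this kind is, in essence, a table of manufacturer data interpolated at off-design operating points; a sketch with scipy (the COP values below are made up for illustration, not SOLRAD data):

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        t_evap = np.array([5.0, 10.0, 15.0])     # evaporator temperature (C)
        t_cond = np.array([30.0, 40.0, 50.0])    # condenser temperature (C)
        cop = np.array([[0.70, 0.55, 0.40],      # COP indexed [t_evap, t_cond]
                        [0.78, 0.62, 0.47],
                        [0.85, 0.70, 0.55]])

        cop_map = RegularGridInterpolator((t_evap, t_cond), cop)
        print(cop_map([[12.0, 37.0]]))           # COP at an off-design point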

  17. Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution

    NASA Astrophysics Data System (ADS)

    Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi

    2015-05-01

    In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for simulation of neutron/gamma pulse height distribution is reported. Since liquid organic scintillators like NE-213 are well suited and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector was selected for simulation in the present study. The proposed algorithm for simulation includes four main steps. The first step is the modeling of the neutron/gamma particle transport and their interactions with the materials in the environment and detector volume. In the second step, the number of scintillation photons due to charged particles such as electrons, alphas, protons, and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and lightguide is simulated. Finally, the resolution corresponding to the experiment is considered in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed computer code is applicable to both neutron and gamma sources. Hence, the discrimination of neutron and gamma in mixed fields may be performed using the MCNPX-ESUT computer code. The main feature of the MCNPX-ESUT computer code is that the neutron/gamma pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to a monoenergetic neutron/gamma source in an NE-213 detector are simulated using the MCNPX-ESUT computer code. The simulated neutron pulse height distributions are validated through comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 664 (2012) 304-309) and the results obtained from similar computer codes like SCINFUL, NRESP7 and Geant4. The simulated gamma pulse height distribution for a 137Cs source is also compared with experimental data.
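
    The resolution step in the final stage is commonly implemented by smearing each ideal pulse height with a Gaussian whose width grows with light output. A sketch with a generic three-parameter resolution function (illustrative coefficients, not those of MCNPX-ESUT):

        import numpy as np

        def broaden(light, counts, a=0.1, b=0.1, c=0.002, n=512):
            """Smear an ideal pulse-height spectrum; a common parametrisation is
            FWHM(L) = sqrt(a^2*L^2 + b^2*L + c^2) with L in MeVee."""
            grid = np.linspace(light.min(), light.max(), n)
            out = np.zeros(n)
            for L, w in zip(light, counts):
                sigma = np.sqrt(a**2 * L**2 + b**2 * L + c**2) / 2.355
                out += (w * np.exp(-0.5 * ((grid - L) / sigma) ** 2)
                        / (sigma * np.sqrt(2.0 * np.pi)))
            return grid, out

        light = np.linspace(0.01, 2.0, 200)      # light output (MeVee)
        ideal = np.where(light < 1.0, 1.0, 0.0)  # idealised recoil edge
        grid, smeared = broaden(light, ideal)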

  18. Experimental and Computational Study of Sonic and Supersonic Jet Plumes

    NASA Technical Reports Server (NTRS)

    Venkatapathy, E.; Naughton, J. W.; Fletcher, D. G.; Edwards, Thomas A. (Technical Monitor)

    1994-01-01

    Studies of sonic and supersonic jet plumes are relevant to understanding phenomena such as jet noise, plume signatures, and rocket base-heating and radiation. Jet plumes are simple to simulate and yet have complex flow structures such as Mach disks, triple points, shear layers, barrel shocks, and shock/shear-layer interactions. Experimental and computational simulations of sonic and supersonic jet plumes have been performed for under- and over-expanded, axisymmetric plume conditions. The computational simulations compare very well with the experimental schlieren observations. Experimental data such as temperature measurements with hot-wire probes have yet to be acquired and will be compared with computed values. Extensive analysis of the computational simulations presents a clear picture of how the complex flow structure develops and the conditions under which self-similar flow structures evolve. From the computations, the plume structure can be further classified into many sub-groups. In the proposed paper, detailed results from the experimental and computational simulations for single, axisymmetric, under- and over-expanded, sonic and supersonic plumes will be compared, and the fluid dynamic aspects of the flow structures will be discussed.

  19. Simulations of the effects of proppant placement on the conductivity and mechanical stability of hydraulic fractures

    DOE PAGES

    Bolintineanu, Dan S.; Rao, Rekha R.; Lechman, Jeremy B.; ...

    2017-11-05

    Here, we generate a wide range of models of proppant-packed fractures using discrete element simulations, and measure fracture conductivity using finite element flow simulations. This allows for a controlled computational study of proppant structure and its relationship to fracture conductivity and stress in the proppant pack. For homogeneous multi-layered packings, we observe the expected increase in fracture conductivity with increasing fracture aperture, while the stress on the proppant pack remains nearly constant. This is consistent with the expected behavior in conventional proppant-packed fractures, but the present work offers a novel quantitative analysis with an explicit geometric representation of the proppant particles. In single-layered packings (i.e. proppant monolayers), there is a drastic increase in fracture conductivity as the proppant volume fraction decreases and open flow channels form. However, this also corresponds to a sharp increase in the mechanical stress on the proppant pack, as measured by the maximum normal stress relative to the side crushing strength of typical proppant particles. We also generate a variety of computational geometries that resemble highly heterogeneous proppant packings hypothesized to form during channel fracturing. In some cases, these heterogeneous packings show drastic improvements in conductivity with only moderate increase in the stress on the proppant particles, suggesting that in certain applications these structures are indeed optimal. We also compare our computer-generated structures to micro computed tomography imaging of a manually fractured laboratory-scale shale specimen, and find reasonable agreement in the geometric characteristics.

  20. Separating figure from ground with a parallel network.

    PubMed

    Kienker, P K; Sejnowski, T J; Hinton, G E; Schumacher, L E

    1986-01-01

    The differentiation of figure from ground plays an important role in the perceptual organization of visual stimuli. The rapidity with which we can discriminate the inside from the outside of a figure suggests that at least this step in the process may be performed in visual cortex by a large number of neurons in several different areas working together in parallel. We have attempted to simulate this collective computation by designing a network of simple processing units that receives two types of information: bottom-up input from the image containing the outlines of a figure, which may be incomplete, and a top-down attentional input that biases one part of the image to be the inside of the figure. No presegmentation of the image was assumed. Two methods for performing the computation were explored: gradient descent, which seeks locally optimal states, and simulated annealing, which attempts to find globally optimal states by introducing noise into the computation. For complete outlines, gradient descent was faster, but the range of input parameters leading to successful performance was very narrow. In contrast, simulated annealing was more robust: it worked over a wider range of attention parameters and a wider range of outlines, including incomplete ones. Our network model is too simplified to serve as a model of human performance, but it does demonstrate that one global property of outlines can be computed through local interactions in a parallel network. Some features of the model, such as the role of noise in escaping from nonglobal optima, may generalize to more realistic models.
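
    The difference between the two schemes compared here is whether uphill moves are ever accepted; a generic annealing kernel (a toy objective function, not the figure-ground network itself) makes the role of noise explicit:

        import math
        import random

        def anneal(energy, state, neighbor, T0=1.0, cooling=0.995, steps=5000, seed=0):
            rng = random.Random(seed)
            E, T = energy(state), T0
            for _ in range(steps):
                cand = neighbor(state, rng)
                dE = energy(cand) - E
                # unlike gradient descent, uphill moves (dE > 0) are
                # accepted with probability exp(-dE/T)
                if dE <= 0.0 or rng.random() < math.exp(-dE / T):
                    state, E = cand, E + dE
                T *= cooling                          # annealing schedule
            return state, E

        f = lambda x: (x * x - 1.0) ** 2 + 0.3 * x    # local min near +1, global near -1
        step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
        print(anneal(f, 3.0, step))                   # typically settles near x = -1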

  2. An Improved Computing Method for 3D Mechanical Connectivity Rates Based on a Polyhedral Simulation Model of Discrete Fracture Network in Rock Masses

    NASA Astrophysics Data System (ADS)

    Li, Mingchao; Han, Shuai; Zhou, Sibao; Zhang, Ye

    2018-06-01

    Based on a 3D model of a discrete fracture network (DFN) in a rock mass, an improved projective method for computing the 3D mechanical connectivity rate was proposed. The Monte Carlo simulation method, a 2D Poisson process and a 3D geological modeling technique were integrated into a polyhedral DFN modeling approach, and the simulation results were verified by numerical tests and graphical inspection. Next, the traditional projective approach for calculating the rock mass connectivity rate was improved using the 3D DFN models by (1) using the polyhedral model to replace the Baecher disk model; (2) taking the real cross section of the rock mass, rather than a part of the cross section, as the test plane; and (3) dynamically searching the joint connectivity rates using different dip directions and dip angles at different elevations to calculate the maximum, minimum and average values of the joint connectivity at each elevation. In a case study, the improved method and the traditional method were used to compute the mechanical connectivity rate of the slope of a dam abutment. The results of the two methods were further used to compute the cohesive force of the rock masses. A comparison showed that the cohesive force derived from the traditional method had a higher error, whereas the cohesive force derived from the improved method was consistent with the suggested values. This comparison indirectly verified the effectiveness and validity of the improved method.
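    A drastically simplified, hypothetical version of the Monte Carlo step conveys the idea of a connectivity rate: generate joints by a Poisson-type process, intersect them with a test plane (here collapsed to a 1D scanline), and take the jointed fraction. The polyhedral joints, real cross sections, and dip-angle search of the paper are all omitted.

        import numpy as np

        rng = np.random.default_rng(1)

        def connectivity_rate(length=100.0, n_joints=200, mean_trace=2.0):
            # joint centers: uniform (1D Poisson) process along the test line
            centers = rng.uniform(0.0, length, n_joints)
            half = rng.exponential(mean_trace / 2.0, n_joints)
            x = np.linspace(0.0, length, 10000)     # discretized test plane
            covered = np.zeros(x.size, dtype=bool)
            for c, h in zip(centers, half):
                covered |= (x > c - h) & (x < c + h)
            return covered.mean()                   # jointed fraction

        print(connectivity_rate())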

  3. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focused on upgrading the simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  5. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    The following reports are presented on this project: A first-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; A second-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.

  6. Preferential Concentration Of Solid Particles In Turbulent Horizontal Circular Pipe Flow

    NASA Astrophysics Data System (ADS)

    Kim, Jaehee; Yang, Kyung-Soo

    2017-11-01

    In particle-laden turbulent pipe flow, turbophoresis can lead to a preferential concentration of particles near the wall. To investigate this phenomenon, one-way coupled Direct Numerical Simulation (DNS) has been performed. Fully developed turbulent pipe flow of the carrier fluid (air) is at Reτ = 200 based on the pipe radius and the mean friction velocity, whereas the Stokes numbers of the particles (solid) are St+ = 0.1, 1, 10 based on the mean friction velocity and the kinematic viscosity of the fluid. The computational domain for the particle simulation is extended along the axial direction by duplicating the domain of the fluid simulation. By doing so, particle statistics can be obtained in the spatially developing region as well as in the fully developed region. Accumulation of particles is observed at St+ = 1 and 10, mostly in the viscous sublayer, and is more intense in the latter case. Compared with other authors' previous results, our results suggest that the drag force on the particles should be computed by using an empirical correlation and a higher-order interpolation scheme even in a low-Re regime in order to improve the accuracy of the particle simulation. This work was supported by the National Research Foundation of Korea (NRF) Grant funded by the Korea government (MSIP) (No. 2015R1A2A2A01002981).
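    The abstract's recommendation, an empirical drag correlation plus higher-order interpolation, can be made concrete with the Schiller-Naumann correction to Stokes drag, one common choice; the paper does not state which correlation was used, so the function below is illustrative only.

        import numpy as np

        def schiller_naumann(re_p):
            """Empirical finite-Re correction to Stokes drag (Re_p < ~800)."""
            return 1.0 + 0.15 * re_p ** 0.687

        def advance_particle(v_p, u_f, dt, tau_p, d_p, nu):
            """One explicit Euler step for a heavy point particle; u_f would
            come from (higher-order) interpolation of the DNS velocity field."""
            slip = u_f - v_p
            re_p = np.linalg.norm(slip) * d_p / nu
            return v_p + dt * schiller_naumann(re_p) * slip / tau_p

        v = advance_particle(np.zeros(3), np.array([1.0, 0.0, 0.0]),
                             dt=1e-3, tau_p=5e-3, d_p=1e-4, nu=1.5e-5)
        print(v)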

  7. Data-Driven Correlation Analysis Between Observed 3D Fatigue-Crack Path and Computed Fields from High-Fidelity, Crystal-Plasticity, Finite-Element Simulations

    NASA Astrophysics Data System (ADS)

    Pierson, Kyle D.; Hochhalter, Jacob D.; Spear, Ashley D.

    2018-05-01

    Systematic correlation analysis was performed between simulated micromechanical fields in an uncracked polycrystal and the known path of an eventual fatigue-crack surface based on experimental observation. Concurrent multiscale finite-element simulation of cyclic loading was performed using a high-fidelity representation of grain structure obtained from near-field high-energy x-ray diffraction microscopy measurements. An algorithm was developed to parameterize and systematically correlate the three-dimensional (3D) micromechanical fields from simulation with the 3D fatigue-failure surface from experiment. For comparison, correlation coefficients were also computed between the micromechanical fields and hypothetical, alternative surfaces. The correlation of the fields with hypothetical surfaces was found to be consistently weaker than that with the known crack surface, suggesting that the micromechanical fields of the cyclically loaded, uncracked microstructure might provide some degree of predictiveness for microstructurally small fatigue-crack paths, although the extent of such predictiveness remains to be tested. In general, gradients of the field variables exhibit stronger correlations with crack path than the field variables themselves. Results from the data-driven approach implemented here can be leveraged in future model development for prediction of fatigue-failure surfaces (for example, to facilitate univariate feature selection required by convolution-based models).
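    The correlation machinery itself is simple to sketch. Below, a random array stands in for a simulated micromechanical field, and planar voxel masks stand in for the observed and hypothetical crack surfaces; the score is the Pearson (point-biserial) correlation between field and surface indicator. Field, surfaces, and offsets are all hypothetical.

        import numpy as np

        rng = np.random.default_rng(2)
        field = rng.normal(size=(64, 64, 64))       # stand-in field variable
        grad = np.linalg.norm(np.gradient(field), axis=0)  # gradient magnitude

        def surface_mask(dz):
            """Voxel-thick planar 'crack surface' at height 32 + dz."""
            m = np.zeros(field.shape, dtype=bool)
            m[:, :, 32 + dz] = True
            return m

        def score(f, mask):
            """Pearson correlation between field values and surface indicator."""
            return np.corrcoef(f.ravel(), mask.ravel().astype(float))[0, 1]

        for dz in (0, 2, 4):   # dz = 0: "observed" surface; others: alternatives
            print(dz, score(grad, surface_mask(dz)))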

  8. Mesoscale Effective Property Simulations Incorporating Conductive Binder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trembacki, Bradley L.; Noble, David R.; Brunini, Victor E.

    Lithium-ion battery electrodes are composed of active material particles, binder, and conductive additives that form an electrolyte-filled porous particle composite. The mesoscale (particle-scale) interplay of electrochemistry, mechanical deformation, and transport through this tortuous multi-component network dictates the performance of a battery at the cell level. Effective electrode properties connect mesoscale phenomena with computationally feasible battery-scale simulations. We utilize published tomography data to reconstruct a large subsection (1000+ particles) of an NMC333 cathode into a computational mesh and extract electrode-scale effective properties from finite element continuum-scale simulations. We present a novel method to preferentially place a composite binder phase throughout the mesostructure, a necessary approach due to the difficulty of distinguishing between non-active phases in tomographic data. We compare stress generation and effective thermal, electrical, and ionic conductivities across several binder placement approaches. Isotropic lithiation-dependent mechanical swelling of the NMC particles and the consideration of strain-dependent composite binder conductivity significantly impact the resulting effective property trends and stresses generated. Lastly, our results suggest that composite binder location significantly affects mesoscale behavior, indicating that a binder coating on active particles is not sufficient and that more accurate approaches should be used when calculating effective properties that will inform battery-scale models in this inherently multi-scale battery simulation challenge.

  9. Object-oriented Technology for Compressor Simulation

    NASA Technical Reports Server (NTRS)

    Drummond, C. K.; Follen, G. J.; Cannon, M. R.

    1994-01-01

    An object-oriented basis for interdisciplinary compressor simulation can, in principle, overcome several barriers associated with the traditional structured (procedural) development approach. This paper presents the results of a research effort with the objective of exploring the repercussions on design, analysis, and implementation of a compressor model in an object-oriented (OO) language, and of examining the ability of the OO system design to accommodate computational fluid dynamics (CFD) code for compressor performance prediction. Three fundamental results are that: (1) the selection of the object-oriented language is not the central issue; enhanced (interdisciplinary) analysis capability derives from a broader focus on object-oriented technology; (2) object-oriented designs will produce more effective and reusable computer programs when the technology is applied to issues involving complex system inter-relationships (more so than when addressing the complex physics of an isolated discipline); and (3) the concept of disposable prototypes is effective for exploratory research programs, but this requires organizations to have a commensurate long-term perspective. This work also suggests that interdisciplinary simulation can be effectively accomplished (over several levels of fidelity) with a mixed-language treatment (i.e., FORTRAN-C++), reinforcing the notion that implementing OO technology in simulations is a 'journey' in which the syntax can, by design, continuously evolve.

  10. Molecular dynamics simulations on the inhibition of cyclin-dependent kinases 2 and 5 in the presence of activators.

    PubMed

    Zhang, Bing; Tan, Vincent B C; Lim, Kian Meng; Tay, Tong Earn

    2006-06-01

    Interest in CDK2 and CDK5 has stemmed mainly from their association with cancer and with neuronal migration or differentiation related diseases, and from the need to design selective inhibitors for these kinases. Molecular dynamics (MD) simulations have not only become a viable approach to drug design because of advances in computer technology but are increasingly an integral part of drug discovery processes. It is common in MD simulations of inhibitor/CDK complexes to exclude the activator of the CDKs in the structural models to keep computational time tractable. In this paper, we present simulation results of CDK2 and CDK5 with roscovitine using models with and without their activators (cyclinA and p25). While p25 was found to induce slight changes in CDK5, the calculations indicate that cyclinA leads to significant conformational changes near the active site of CDK2. This suggests that detailed and structure-based inhibitor design targeted at these CDKs should employ activator-included models of the kinases. Comparisons between P/CDK2/cyclinA/roscovitine and CDK5/p25/roscovitine complexes reveal differences in the conformations of the glutamine around the active sites, which may be exploited to find highly selective inhibitors with respect to CDK2 and CDK5.

  11. Cone-beam computed tomography-based diagnosis and treatment simulation for a patient with a protrusive profile and a gummy smile

    PubMed Central

    Imamura, Toshihiro; Kokai, Satoshi; Ono, Takashi

    2018-01-01

    For patients with bimaxillary protrusion, significant retraction and intrusion of the anterior teeth are sometimes essential to improve the facial profile. However, severe root resorption of the maxillary incisors occasionally occurs after treatment because of various factors. For instance, it has been reported that approximation or invasion of the incisive canal by the anterior tooth roots during retraction may cause apical root damage. Thus, determination of the position of the maxillary incisors is key for orthodontic diagnosis and treatment planning in such cases. Cone-beam computed tomography (CBCT) may be useful for simulating the post-treatment position of the maxillary incisors and surrounding structures in order to ensure safe teeth movement. Here, we present a case of Class II malocclusion with bimaxillary protrusion, wherein apical root damage due to treatment was minimized by pretreatment evaluation of the anatomical structures and simulation of the maxillary central incisor movement using CBCT. Considerable retraction and intrusion of the maxillary incisors, which resulted in a significant improvement in the facial profile and smile, were achieved without severe root resorption. Our findings suggest that CBCT-based diagnosis and treatment simulation may facilitate safe and dynamic orthodontic tooth movement, particularly in patients requiring maximum anterior tooth retraction. PMID:29732305

  13. Richardson-Lucy/maximum likelihood image restoration algorithm for fluorescence microscopy: further testing.

    PubMed

    Holmes, T J; Liu, Y H

    1989-11-15

    A maximum likelihood based iterative algorithm adapted from nuclear medicine imaging for noncoherent optical imaging was presented in a previous publication with some initial computer-simulation testing. This algorithm is identical in form to that previously derived in a different way by W. H. Richardson, "Bayesian-Based Iterative Method of Image Restoration," J. Opt. Soc. Am. 62, 55-59 (1972) and L. B. Lucy, "An Iterative Technique for the Rectification of Observed Distributions," Astron. J. 79, 745-765 (1974). Foreseen applications include superresolution and 3-D fluorescence microscopy. This paper presents further simulation testing of this algorithm and a preliminary experiment with a defocused camera. The simulations show quantified resolution improvement as a function of iteration number, and they show qualitatively the trend in limitations on restored resolution when noise is present in the data. Also shown are results of a simulation in restoring missing-cone information for 3-D imaging. Conclusions are in support of the feasibility of using these methods with real systems, while computational cost and timing estimates indicate that it should be realistic to implement these methods. It is suggested in the Appendix that future extensions to the maximum likelihood based derivation of this algorithm will address some of the limitations that are experienced with the nonextended form of the algorithm presented here.
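    The update at the heart of the algorithm is compact enough to state directly. A minimal 2-D sketch, assuming `data` is a nonnegative float image and `psf` is normalized to unit sum (the names and the scipy-based convolution are choices of this sketch, not of the paper):

        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(data, psf, n_iter=50, eps=1e-12):
            """Richardson-Lucy / maximum-likelihood deconvolution under a
            Poisson noise model:
                estimate <- estimate * [psf_flipped (*) (data / (psf (*) estimate))]
            where (*) denotes convolution."""
            estimate = np.full_like(data, data.mean())
            psf_flip = psf[::-1, ::-1]
            for _ in range(n_iter):
                blurred = fftconvolve(estimate, psf, mode="same")
                ratio = data / np.maximum(blurred, eps)   # avoid divide-by-zero
                estimate *= fftconvolve(ratio, psf_flip, mode="same")
            return estimate

    Nonnegativity is preserved automatically, and when the PSF sums to one each iteration approximately conserves total intensity, two properties that make the scheme attractive for photon-limited fluorescence data.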

  14. Patterned corneal collagen crosslinking for astigmatism: Computational modeling study

    PubMed Central

    Seven, Ibrahim; Roy, Abhijit Sinha; Dupps, William J.

    2014-01-01

    PURPOSE To test the hypothesis that spatially selective corneal stromal stiffening can alter corneal astigmatism and assess the effects of treatment orientation, pattern, and material model complexity in computational models using patient-specific geometries. SETTING Cornea and Refractive Surgery Service, Academic Eye Institute, Cleveland, Ohio, USA. DESIGN Computational modeling study. METHODS Three-dimensional corneal geometries from 10 patients with corneal astigmatism were exported from a clinical tomography system (Pentacam). Corneoscleral finite element models of each eye were generated. Four candidate treatment patterns were simulated, and the effects of treatment orientation and magnitude of stiffening on anterior curvature and aberrations were studied. The effect of material model complexity on simulated outcomes was also assessed. RESULTS Pretreatment anterior corneal astigmatism ranged from 1.22 to 3.92 diopters (D) in a series that included regular and irregular astigmatic patterns. All simulated treatment patterns oriented on the flat axis resulted in mean reductions in corneal astigmatism and depended on the pattern geometry. The linear bow-tie pattern produced a greater mean reduction in astigmatism (1.08 D ± 0.13 [SD]; range 0.74 to 1.23 D) than other patterns tested under an assumed 2-times increase in corneal stiffness, and it had a nonlinear relationship to the degree of stiffening. The mean astigmatic effect did not change significantly with a fiber- or depth-dependent model, but it did affect the coupling ratio. CONCLUSIONS In silico simulations based on patient-specific geometries suggest that clinically significant reductions in astigmatism are possible with patterned collagen crosslinking. Effect magnitude was dependent on patient-specific geometry, effective stiffening pattern, and treatment orientation. PMID:24767795

  15. Generalized binomial τ-leap method for biochemical kinetics incorporating both delay and intrinsic noise

    NASA Astrophysics Data System (ADS)

    Leier, André; Marquez-Lago, Tatiana T.; Burrage, Kevin

    2008-05-01

    The delay stochastic simulation algorithm (DSSA) by Barrio et al. [PLoS Comput. Biol. 2, 117(E) (2006)] was developed to simulate delayed processes in cell biology in the presence of intrinsic noise, that is, when there are small-to-moderate numbers of certain key molecules present in a chemical reaction system. These delayed processes can faithfully represent complex interactions and mechanisms that imply a number of spatiotemporal processes often not explicitly modeled, such as transcription and translation, basic in the modeling of cell signaling pathways. However, for systems with widely varying reaction rate constants or large numbers of molecules, the simulation time steps of both the stochastic simulation algorithm (SSA) and the DSSA can become very small, causing considerable computational overheads. In order to overcome the limit of small step sizes, various τ-leap strategies have been suggested for improving the computational performance of the SSA. In this paper, we present a binomial τ-DSSA method that extends the τ-leap idea to the delay setting and avoids drawing insufficient numbers of reactions, a common shortcoming of existing binomial τ-leap methods that becomes evident when dealing with complex chemical interactions. The resulting inaccuracies are most evident in the delayed case, even when considering reaction products as potential reactants within the same time step in which they are produced. Moreover, we extend the framework to account for multicellular systems with different degrees of intercellular communication. We apply these ideas to two important genetic regulatory models, namely, the hes1 gene, implicated as a molecular clock, and a Her1/Her7 model for coupled oscillating cells.
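    The flavor of a binomial leap with delays can be conveyed by a one-species toy model (not the authors' hes1 or Her1/Her7 systems): production completes only after a fixed delay, and the degradation channel draws a binomial count so the population can never go negative within a leap. All rates and the delay are hypothetical.

        import heapq
        import numpy as np

        rng = np.random.default_rng(3)

        def binomial_tau_leap_delay(x0=20, k_prod=2.0, k_deg=0.1,
                                    delay=10.0, tau=0.5, t_end=200.0):
            x, t, pending = x0, 0.0, []    # pending completion times (min-heap)
            while t < t_end:
                while pending and pending[0] <= t:
                    heapq.heappop(pending)  # a delayed production completes
                    x += 1
                if x > 0:
                    # binomial draw caps degradations at the current copy number
                    p = min(1.0, k_deg * tau)  # per-molecule leap probability
                    x -= rng.binomial(x, p)
                # zeroth-order production: initiations finish at t + delay
                for _ in range(rng.poisson(k_prod * tau)):
                    heapq.heappush(pending, t + delay)
                t += tau
            return x

        print(binomial_tau_leap_delay())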

  16. Development of a PC-based diabetes simulator in collaboration with teenagers with type 1 diabetes.

    PubMed

    Nordfeldt, S; Hanberger, L; Malm, F; Ludvigsson, J

    2007-02-01

    The main aim of this study was to develop and test in a pilot study a PC-based interactive diabetes simulator prototype as a part of future Internet-based support systems for young teenagers and their families. A second aim was to gain experience in user-centered design (UCD) methods applied to such subjects. Using UCD methods, a computer scientist participated in iterative user group sessions involving teenagers with Type 1 diabetes 13-17 years old and parents. Input was transformed into a requirements specification by the computer scientist and advisors. This was followed by gradual prototype development based on a previously developed mathematical core. Individual test sessions were followed by a pilot study with five subjects testing a prototype. The process was evaluated by registration of flow and content of input and opinions from expert advisors. It was initially difficult to motivate teenagers to participate. User group discussion topics ranged from concrete to more academic matters. The issue of a simulator created active discussions among parents and teenagers. A large amount of input was generated from discussions among the teenagers. Individual test runs generated useful input. A pilot study suggested that the gradually elaborated software was functional. A PC-based diabetes simulator may create substantial interest among teenagers and parents, and the prototype seems worthy of further development and studies. UCD methods may generate significant input for computer support system design work and contribute to a functional design. Teenager involvement in design work may require time, patience, and flexibility.

  17. TADSim: Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    DOE PAGES

    Mniszewski, Susan M.; Junghans, Christoph; Voter, Arthur F.; ...

    2015-04-16

    Next-generation high-performance computing will require more scalable and flexible performance prediction tools to evaluate software-hardware co-design choices relevant to scientific applications and hardware architectures. Here, we present a new class of tools called application simulators: parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation. Parameterized choices for the algorithmic method and hardware options provide a rich space for design exploration and allow us to quickly find well-performing software-hardware combinations. We demonstrate our approach with a TADSim simulator that models the temperature-accelerated dynamics (TAD) method, an algorithmically complex and parameter-rich member of the accelerated molecular dynamics (AMD) family of molecular dynamics methods. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We accomplish this by identifying the time-intensive elements, quantifying algorithm steps in terms of those elements, abstracting them out, and replacing them by the passage of time. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We extend TADSim to model algorithm extensions, such as speculative spawning of the compute-bound stages, and predict performance improvements without having to implement such a method. Validation against the actual TAD code shows close agreement for the evolution of an example physical system, a silver surface. Finally, focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights and suggested extensions.
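    The core idea, replacing the expensive stages by the passage of sampled time, fits in a few lines. In this hypothetical sketch each TAD-like cycle consists of a long stochastic stage and a shorter fixed-cost check; the speculative variant overlaps the two, predicting the speedup without implementing the extension. Stage names and timings are invented for illustration.

        import random

        random.seed(4)

        def application_simulator(n_cycles=1000, t_basin=1.0, t_check=0.3,
                                  speculative=False):
            """Advance a simulated clock by sampled stage durations instead
            of running the physics."""
            clock = 0.0
            for _ in range(n_cycles):
                t_stage = random.expovariate(1.0 / t_basin)  # long MD-like stage
                if speculative:
                    clock += max(t_stage, t_check)   # check runs concurrently
                else:
                    clock += t_stage + t_check       # check runs serially
            return clock

        print(application_simulator(), application_simulator(speculative=True))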

  18. Human agency beliefs influence behaviour during virtual social interactions.

    PubMed

    Caruana, Nathan; Spirou, Dean; Brock, Jon

    2017-01-01

    In recent years, with the emergence of relatively inexpensive and accessible virtual reality technologies, it is now possible to deliver compelling and realistic simulations of human-to-human interaction. Neuroimaging studies have shown that, when participants believe they are interacting via a virtual interface with another human agent, they show different patterns of brain activity compared to when they know that their virtual partner is computer-controlled. The suggestion is that users adopt an "intentional stance" by attributing mental states to their virtual partner. However, it remains unclear how beliefs in the agency of a virtual partner influence participants' behaviour and subjective experience of the interaction. We investigated this issue in the context of a cooperative "joint attention" game in which participants interacted via an eye tracker with a virtual onscreen partner, directing each other's eye gaze to different screen locations. Half of the participants were correctly informed that their partner was controlled by a computer algorithm ("Computer" condition). The other half were misled into believing that the virtual character was controlled by a second participant in another room ("Human" condition). Those in the "Human" condition were slower to make eye contact with their partner and more likely to try and guide their partner before they had established mutual eye contact than participants in the "Computer" condition. They also responded more rapidly when their partner was guiding them, although the same effect was also found for a control condition in which they responded to an arrow cue. Results confirm the influence of human agency beliefs on behaviour in this virtual social interaction context. They further suggest that researchers and developers attempting to simulate social interactions should consider the impact of agency beliefs on user experience in other social contexts, and their effect on the achievement of the application's goals.

  19. Computer simulation: A modern day crystal ball?

    NASA Technical Reports Server (NTRS)

    Sham, Michael; Siprelle, Andrew

    1994-01-01

    It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an icon-driven, Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend-based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.

  20. Automating NEURON Simulation Deployment in Cloud Resources.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-01-01

    Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Cloud Computing, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.

  2. Verifying a computational method for predicting extreme ground motion

    USGS Publications Warehouse

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.

  3. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.

    PubMed

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.

  5. Learning Relative Motion Concepts in Immersive and Non-immersive Virtual Environments

    NASA Astrophysics Data System (ADS)

    Kozhevnikov, Michael; Gurlitt, Johannes; Kozhevnikov, Maria

    2013-12-01

    The focus of the current study is to understand which unique features of an immersive virtual reality environment have the potential to improve learning relative motion concepts. Thirty-seven undergraduate students learned relative motion concepts using computer simulation either in immersive virtual environment (IVE) or non-immersive desktop virtual environment (DVE) conditions. Our results show that after the simulation activities, both IVE and DVE groups exhibited a significant shift toward a scientific understanding in their conceptual models and epistemological beliefs about the nature of relative motion, and also a significant improvement on relative motion problem-solving tests. In addition, we analyzed students' performance on one-dimensional and two-dimensional questions in the relative motion problem-solving test separately and found that after training in the simulation, the IVE group performed significantly better than the DVE group on solving two-dimensional relative motion problems. We suggest that egocentric encoding of the scene in IVE (where the learner constitutes a part of a scene they are immersed in), as compared to allocentric encoding on a computer screen in DVE (where the learner is looking at the scene from "outside"), is more beneficial than DVE for studying more complex (two-dimensional) relative motion problems. Overall, our findings suggest that such aspects of virtual realities as immersivity, first-hand experience, and the possibility of changing different frames of reference can facilitate understanding abstract scientific phenomena and help in displacing intuitive misconceptions with more accurate mental models.

  6. Tools for 3D scientific visualization in computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    The purpose is to describe the tools and techniques in use at the NASA Ames Research Center for performing visualization of computational aerodynamics, for example, visualization of flow fields from computer simulations of fluid dynamics about vehicles such as the Space Shuttle. The hardware used for visualization is a high-performance graphics workstation connected to a supercomputer with a high-speed channel. At present, the workstation is a Silicon Graphics IRIS 3130, the supercomputer is a CRAY2, and the high-speed channel is a hyperchannel. The three techniques used for visualization are post-processing, tracking, and steering. Post-processing analysis is done after the simulation. Tracking analysis is done during a simulation but is not interactive, whereas steering analysis involves modifying the simulation interactively during the simulation. Using post-processing methods, a flow simulation is executed on a supercomputer and, after the simulation is complete, the results of the simulation are processed for viewing. The software in use and under development at NASA Ames Research Center for performing these types of tasks in computational aerodynamics is described. Workstation performance issues, benchmarking, and high-performance networks for this purpose are also discussed, as well as descriptions of other hardware for digital video and film recording.

  7. MOLNs: A Cloud Platform for Interactive, Reproducible, and Scalable Spatial Stochastic Computational Experiments in Systems Biology Using PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948

  8. Computational Analysis of the Flow and Acoustic Effects of Jet-Pylon Interaction

    NASA Technical Reports Server (NTRS)

    Hunter, Craig A.; Thomas, Russell H.; Abdol-Hamid, K. S.; Pao, S. Paul; Elmiligui, Alaa A.; Massey, Steven J.

    2005-01-01

    Computational simulation and prediction tools were used to understand the jet-pylon interaction effect in a set of bypass-ratio-five core/fan nozzles. Results suggest that the pylon acts as a large-scale mixing vane that perturbs the jet flow and jump-starts the jet mixing process. The enhanced mixing and associated secondary flows from the pylon result in a net increase of noise in the first 10 diameters of the jet's development, but there is a sustained reduction in noise from that point downstream. This is likely the reason the pylon nozzle is quieter overall than the baseline round nozzle in this case. The present work suggests that focused pylon design could lead to advanced pylon shapes and nozzle configurations that take advantage of propulsion-airframe integration to provide additional noise reduction capabilities.

  9. The Use of Computer Simulation Gaming in Teaching Broadcast Economics.

    ERIC Educational Resources Information Center

    Mancuso, Louis C.

    The purpose of this study was to develop a broadcast economic computer simulation and to ascertain how a lecture-computer simulation game compared as a teaching method with a more traditional lecture and case study instructional methods. In each of three sections of a broadcast economics course, a different teaching methodology was employed: (1)…

  10. The Effects of Inquiry-Based Computer Simulation with Cooperative Learning on Scientific Thinking and Conceptual Understanding of Gas Laws

    ERIC Educational Resources Information Center

    Abdullah, Sopiah; Shariff, Adilah

    2008-01-01

    The purpose of the study was to investigate the effects of inquiry-based computer simulation with heterogeneous-ability cooperative learning (HACL) and inquiry-based computer simulation with friendship cooperative learning (FCL) on (a) scientific reasoning (SR) and (b) conceptual understanding (CU) among Form Four students in Malaysian Smart…

  11. Computer Simulations to Support Science Instruction and Learning: A Critical Review of the Literature

    ERIC Educational Resources Information Center

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-01-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is…

  12. The Effect of Teacher Involvement on Student Performance in a Computer-Based Science Simulation.

    ERIC Educational Resources Information Center

    Waugh, Michael L.

    Designed to investigate whether or not science teachers can positively influence student achievement in, and attitude toward, science, this study focused on a specific teaching strategy and utilization of a computer-based simulation. The software package used in the study was the simulation, Volcanoes, by Earthware Computer Services. The sample…

  13. Introducing Molecular Life Science Students to Model Building Using Computer Simulations

    ERIC Educational Resources Information Center

    Aegerter-Wilmsen, Tinri; Kettenis, Dik; Sessink, Olivier; Hartog, Rob; Bisseling, Ton; Janssen, Fred

    2006-01-01

    Computer simulations can facilitate the building of models of natural phenomena in research, such as in the molecular life sciences. In order to introduce molecular life science students to the use of computer simulations for model building, a digital case was developed in which students build a model of a pattern formation process in…

  14. The Impact of Learner's Prior Knowledge on Their Use of Chemistry Computer Simulations: A Case Study

    ERIC Educational Resources Information Center

    Liu, Han-Chin; Andre, Thomas; Greenbowe, Thomas

    2008-01-01

    It is complicated to design a computer simulation that adapts to students with different characteristics. This study documented cases that show how college students' prior chemistry knowledge level affected their interaction with peers and their approach to solving problems with the use of computer simulations that were designed to learn…

  15. An Investigation of the Effectiveness of Computer Simulation Programs as Tutorial Tools for Teaching Population Ecology at University.

    ERIC Educational Resources Information Center

    Korfiatis, K.; Papatheodorou, E.; Paraskevopoulous, S.; Stamou, G. P.

    1999-01-01

    Describes a study of the effectiveness of computer-simulation programs in enhancing biology students' familiarity with ecological modeling and concepts. Finds that computer simulations improved student comprehension of ecological processes expressed in mathematical form, but did not allow a full understanding of ecological concepts. Contains 28…

  16. Exploring the Perceptions of College Instructors towards Computer Simulation Software Programs: A Quantitative Study

    ERIC Educational Resources Information Center

    Punch, Raymond J.

    2012-01-01

    The purpose of the quantitative regression study was to explore and to identify relationships between attitudes toward use and perceptions of value of computer-based simulation programs, of college instructors, toward computer based simulation programs. A relationship has been reported between attitudes toward use and perceptions of the value of…

  17. Estimating long-term evolution of fine sediment budget in the Iffezheim reservoir using a simplified method based on classification of boundary conditions

    NASA Astrophysics Data System (ADS)

    Zhang, Qing; Hillebrand, Gudrun; Hoffmann, Thomas; Hinkelmann, Reinhard

    2017-04-01

    The Iffezheim reservoir is the last of a series of reservoirs on the Upper Rhine in Germany. Since its construction in 1977, approximately 115,000 m3 of fine sediment accumulates annually in the weir channel (WSA Freiburg, 2011). In order to obtain detailed information about the space-time development of the topography, the riverbed evolution was measured using echo sounding by the German Federal Waterways and Shipping Administration (WSV). Thirty-seven sets of sounding data, obtained between July 2000 and February 2011, were used in this research. In a previous work, the morphodynamic processes in the Iffezheim reservoir were investigated using a high-resolution 3D model. The 3D computational fluid dynamics software SSIIM II (Olsen, 2014) was used for this purpose (Zhang et al., 2015). The model was calibrated using field measurements. A computational time of 14.5 hours, using 24 cores of a 2.4 GHz reference computer, was needed for simulating a period of three months on a grid of 238,013 cells. Thus, long-term (e.g. 30-year) simulation of the morphodynamics of the fine sediment budget in the Iffezheim reservoir with this model is not feasible. A low-complexity approach of "classification of the boundary conditions of discharge and suspended sediment concentration" was applied in this research for a long-term numerical simulation. The basic idea of the approach is to replace unsteady or quasi-steady simulations of deposition by a limited series of stationary ones. For these, daily volume changes were calculated considering representative discharge and concentration. Representative boundary conditions were determined by subdividing the time series of discharge and concentration into classes and using the central value per class. The amount of deposition in the reservoir for a certain period can then be obtained by adding up the calculated daily depositions. This approach was applied to 10 short-term periods, each between two successive echo sounding measurements, and to 2 longer ones, each comprising several short-term periods. Short-term periods ranged from 1 to 3 months, whereas the long-term periods covered 2 and 5 years. The simulation results showed acceptable agreement with the measurements. It was also found that the long-term periods deviated less from the measurements than the short ones. This simplified method exhibited clear savings in computational time compared to the unsteady simulations; in this case, only 3 hours of computational time were needed for a 5-year simulation period using the reference computer mentioned above. Further research is needed with respect to the limits of this linear approach, i.e. with respect to the frequency with which the set of steady simulations has to be updated due to significant changes in morphology and, in turn, in hydraulics. Yet, the preliminary results are promising, suggesting that the developed approach is well suited to long-term simulation of riverbed evolution.
    REFERENCES
    Olsen, N.R.B. 2014. A three-dimensional numerical model for simulation of sediment movements in water intakes with multiblock option. Versions 1 and 2. User's manual. Department of Hydraulic and Environmental Engineering, The Norwegian University of Science and Technology, Trondheim, Norway.
    Wasser- und Schifffahrtsamt (WSA) Freiburg. 2011. Sachstandsbericht oberer Wehrkanal Staustufe Iffezheim [Status report on the upper weir channel of the Iffezheim hydropower reservoir]. Technical report.
    Zhang, Q., Hillebrand, G., Moser, H. & Hinkelmann, R. 2015. Simulation of non-uniform sediment transport in a German reservoir with the SSIIM model and sensitivity analysis. Proceedings of the 36th IAHR World Congress. The Hague, The Netherlands.
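    The accounting behind the approach is easy to sketch. In the hypothetical example below, synthetic daily discharge and concentration series are binned into 5 x 5 classes, one stand-in "stationary run" result is tabulated per class at the central values, and the long-term deposition is the sum of the daily table lookups; the rating function replaces what in the study is a steady SSIIM II simulation.

        import numpy as np

        rng = np.random.default_rng(7)
        days = 5 * 365
        Q = rng.gamma(4.0, 300.0, days)      # synthetic discharge, m3/s
        C = rng.gamma(2.0, 15.0, days)       # synthetic concentration, mg/l

        # subdivide each series into classes; use the central value per class
        q_edges = np.quantile(Q, np.linspace(0, 1, 6))
        c_edges = np.quantile(C, np.linspace(0, 1, 6))
        q_mid = 0.5 * (q_edges[:-1] + q_edges[1:])
        c_mid = 0.5 * (c_edges[:-1] + c_edges[1:])

        def daily_deposition(q, c):
            """Stand-in for one stationary run at representative (Q, C)."""
            return 1e-4 * c * np.sqrt(q)     # hypothetical rating, m3/day

        table = daily_deposition(q_mid[:, None], c_mid[None, :])  # 5 x 5 runs
        iq = np.clip(np.searchsorted(q_edges, Q) - 1, 0, 4)
        ic = np.clip(np.searchsorted(c_edges, C) - 1, 0, 4)
        print("5-year deposition estimate:", table[iq, ic].sum(), "m3")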

  18. Using Computer Simulations in Drug Education Lessons.

    ERIC Educational Resources Information Center

    Bentz, Glenda D.

    1989-01-01

    Discussion of drug education for fifth grade students focuses on a computer simulation in which students role-play adolescents encountering various situations where there is drug or alcohol involvement. Activities in the simulation are explained, and discussion groups that occur following the simulation are described. (LRW)

  19. Multi-level emulation of a volcanic ash transport and dispersion model to quantify sensitivity to uncertain parameters

    NASA Astrophysics Data System (ADS)

    Harvey, Natalie J.; Huntley, Nathan; Dacre, Helen F.; Goldstein, Michael; Thomson, David; Webster, Helen

    2018-01-01

    Following the disruption to European airspace caused by the eruption of Eyjafjallajökull in 2010, there has been a move towards producing quantitative predictions of volcanic ash concentration using volcanic ash transport and dispersion simulators. However, there is no formal framework for determining the uncertainties of these predictions, and performing many simulations using these complex models is computationally expensive. In this paper a Bayesian linear emulation approach is applied to the Numerical Atmospheric-dispersion Modelling Environment (NAME) to better understand the influence of source and internal model parameters on the simulator output. Emulation is a statistical method for predicting the output of a computer simulator at new parameter choices without actually running the simulator. A multi-level emulation approach is applied using two configurations of NAME with different numbers of model particles. Information from many evaluations of the computationally faster configuration is combined with results from relatively few evaluations of the slower, more accurate, configuration. This approach is effective when it is not possible to run the accurate simulator many times and when there is also little prior knowledge about the influence of parameters. The approach is applied to the mean ash column loading in 75 geographical regions on 14 May 2010. Through this analysis it has been found that the parameters that contribute the most to the output uncertainty are initial plume rise height, mass eruption rate, free tropospheric turbulence levels and precipitation threshold for wet deposition. This information can be used to inform future model development, observational campaigns and routine monitoring. The analysis presented here suggests the need for further observational and theoretical research into parameterisation of atmospheric turbulence. Furthermore, it can be used to inform the most important parameter perturbations for a small operational ensemble of simulations. The use of an emulator also identifies the input and internal parameters that do not contribute significantly to simulator uncertainty. Finally, the analysis highlights that the faster, less accurate configuration of NAME can, on its own, provide useful information for the problem of predicting average column load over large areas.
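    The multi-level idea can be sketched with two toy functions standing in for the two NAME configurations and Gaussian process regression standing in for Bayes linear emulation (the statistical machinery of the paper differs): emulate the cheap configuration from many runs, emulate the cheap-to-accurate discrepancy from few runs, and add the two predictions.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(5)

        cheap    = lambda x: np.sin(3 * x) + 0.3 * x          # few particles
        accurate = lambda x: cheap(x) + 0.15 * np.cos(9 * x)  # many particles

        X_many = rng.uniform(0, 2, (40, 1))   # many cheap evaluations
        X_few  = rng.uniform(0, 2, (6, 1))    # few accurate evaluations

        gp_cheap = GaussianProcessRegressor(RBF(0.5), alpha=1e-8)
        gp_cheap.fit(X_many, cheap(X_many).ravel())
        resid = accurate(X_few).ravel() - gp_cheap.predict(X_few)
        gp_diff = GaussianProcessRegressor(RBF(0.5), alpha=1e-8).fit(X_few, resid)

        def emulate(x):
            """Multi-level prediction: cheap emulator plus discrepancy term."""
            return gp_cheap.predict(x) + gp_diff.predict(x)

        x_test = np.linspace(0, 2, 5).reshape(-1, 1)
        print(np.abs(emulate(x_test) - accurate(x_test).ravel()).max())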

  20. Evaluating data mining algorithms using molecular dynamics trajectories.

    PubMed

    Tatsis, Vasileios A; Tjortjis, Christos; Tzirakis, Panagiotis

    2013-01-01

    Molecular dynamics simulations provide a sample of a molecule's conformational space. Experiments on the μs time scale, resulting in large amounts of data, are nowadays routine. Data mining techniques such as classification provide a way to analyse such data. In this work, we evaluate and compare several classification algorithms using three data sets which resulted from computer simulations of a potential enzyme-mimetic biomolecule. We evaluated 65 classifiers available in the well-known data mining toolkit Weka, using classification errors to assess algorithmic performance. Results suggest that: (i) 'meta' classifiers perform better than the other groups when applied to molecular dynamics data sets; (ii) Random Forest and Rotation Forest are the best classifiers for all three data sets; and (iii) classification via clustering yields the highest classification error. Our findings are consistent with bibliographic evidence, suggesting a 'roadmap' for dealing with such data.
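    A minimal analogue of the evaluation, with scikit-learn's Random Forest in place of Weka and random features standing in for the molecular dynamics trajectories (the data here are synthetic, so only the workflow, not the reported errors, is reproduced):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(6)

        # rows: trajectory frames; columns: stand-in conformational features
        X = rng.normal(size=(2000, 12))
        y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # stand-in state labels

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        err = 1.0 - cross_val_score(clf, X, y, cv=10).mean()
        print(f"10-fold classification error: {err:.3f}")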
