Sample records for computer simulation helps

  1. Effect of Computer Simulations at the Particulate and Macroscopic Levels on Students' Understanding of the Particulate Nature of Matter

    ERIC Educational Resources Information Center

    Tang, Hui; Abraham, Michael R.

    2016-01-01

    Computer-based simulations can help students visualize chemical representations and understand chemistry concepts, but simulations at different levels of representation may vary in effectiveness on student learning. This study investigated the influence of computer activities that simulate chemical reactions at different levels of representation…

  2. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  3. The effectiveness of interactive computer simulations on college engineering student conceptual understanding and problem-solving ability related to circular motion

    NASA Astrophysics Data System (ADS)

    Chien, Cheng-Chih

    Over the past thirty years, individual studies have reported varying findings on the effectiveness of computer-assisted learning. Today, after dramatic technical improvements, computers are widespread in schools and used in a variety of ways. In this study, a design model involving educational technology, pedagogy, and content domain is proposed for effective use of computers in learning. Computer simulation, constructivist and Vygotskian perspectives, and circular motion are the three elements of the specific Chain Model for instructional design. The goal of the physics course is to help students discard ideas that are inconsistent with those of the physics community and rebuild new knowledge. To achieve this learning goal, the strategies of using conceptual conflicts and of using language to internalize specific tasks into mental functions were included. Computer simulations and accompanying worksheets were used to help students explore their own ideas and to generate questions for discussion. Using animated images to describe the dynamic processes involved in circular motion may reduce the complexity and possible miscommunication resulting from verbal explanations. The effectiveness of the instructional material on student learning was evaluated. The results of the problem-solving activities show that students using computer simulations had significantly higher scores than students not using them. For conceptual understanding, students in the non-simulation group had significantly higher scores on the pretest; no significant difference between the two groups was observed on the posttest. The relations of gender, prior physics experience, and frequency of computer use outside the course to student achievement were also studied. There were fewer female students than male students and fewer students using computer simulations than not, characteristics that reduce the statistical power for detecting differences. Future research might introduce more extensive simulation interventions to explore the potential of computer simulation for helping students learn. A test of conceptual understanding with more problems and an appropriate difficulty level may also be needed.

  4. Simulating Drosophila Genetics with the Computer.

    ERIC Educational Resources Information Center

    Small, James W., Jr.; Edwards, Kathryn L.

    1979-01-01

    Presents some techniques developed to help improve student understanding of Mendelian principles through the use of a computer simulation model of the genetic system of the fruit fly. Includes discussion and evaluation of this computer-assisted program. (MA)
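
    The ERIC record does not include the program itself; the following minimal Python sketch (not the authors' code) illustrates the kind of Mendelian simulation described, a monohybrid cross with an illustrative wing-morphology gene:

      import random
      from collections import Counter

      def cross(parent1, parent2, n_offspring=1000):
          """Simulate a Mendelian cross: each parent donates one random allele."""
          offspring = [random.choice(parent1) + random.choice(parent2)
                       for _ in range(n_offspring)]
          # Normalize genotype order so "Vv" and "vV" are counted together.
          return Counter("".join(sorted(g)) for g in offspring)

      # Vv x Vv cross: 'V' = wild-type allele, 'v' = recessive vestigial wings.
      print(cross("Vv", "Vv"))  # expect roughly 1 VV : 2 Vv : 1 vv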

  5. The Ghost of Computers Past, Present, and Future: Computer Use for Preservice/Inservice Reading Programs.

    ERIC Educational Resources Information Center

    Prince, Amber T.

    Computer-assisted instruction, and especially computer simulations, can help to ensure that preservice and inservice teachers learn from the right experiences. In the past, colleges of education used large mainframe computer systems to store student registration data, provide simulation lessons on diagnosing reading difficulties, construct informal…

  6. When Feedback Harms and Collaboration Helps in Computer Simulation Environments: An Expertise Reversal Effect

    ERIC Educational Resources Information Center

    Nihalani, Priya K.; Mayrath, Michael; Robinson, Daniel H.

    2011-01-01

    We investigated the effects of feedback and collaboration on undergraduates' transfer performance when using a computer networking training simulation. In Experiment 1, 65 computer science "novices" worked through an instructional protocol individually (control), individually with feedback, or collaboratively with feedback. Unexpectedly,…

  7. Modelling and Simulation as a Recognizing Method in Education

    ERIC Educational Resources Information Center

    Stoffa, Veronika

    2004-01-01

    Computer animation-simulation models of complex processes and events, used as a method of instruction, can be an effective didactic device. Gaining deeper knowledge about the objects modelled helps in planning simulation experiments oriented toward the processes and events under study. Animation experiments realized on multimedia computers can aid easier…

  8. Zero-gravity movement studies

    NASA Technical Reports Server (NTRS)

    Badler, N. I.; Fishwick, P.; Taft, N.; Agrawala, M.

    1985-01-01

    The use of computer graphics to simulate the movement of articulated animals and mechanisms has a number of uses ranging over many fields. Human motion simulation systems can be useful in education, medicine, anatomy, physiology, and dance. In biomechanics, computer displays help to understand and analyze performance. Simulations can be used to help understand the effect of external or internal forces. Similarly, zero-gravity simulation systems should provide a means of designing and exploring the capabilities of hypothetical zero-gravity situations before actually carrying out such actions. The advantage of using a simulation of the motion is that one can experiment with variations of a maneuver before attempting to teach it to an individual. The zero-gravity motion simulation problem can be divided into two broad areas: human movement and behavior in zero-gravity, and simulation of articulated mechanisms.

  9. BACLAB: A Computer Simulation of a Medical Bacteriology Laboratory--An Aid for Teaching Tertiary Level Microbiology.

    ERIC Educational Resources Information Center

    Lewington, J.; And Others

    1985-01-01

    Describes a computer simulation program which helps students learn the main biochemical tests and profiles for identifying medically important bacteria. Also discusses the advantages and applications of this type of approach. (ML)

  10. A Multiple-Sessions Interactive Computer-Based Learning Tool for Ability Cultivation in Circuit Simulation

    ERIC Educational Resources Information Center

    Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.

    2011-01-01

    An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…

  11. Taxis through Computer Simulation Programs.

    ERIC Educational Resources Information Center

    Park, David

    1983-01-01

    Describes a sequence of five computer programs (listings for Apple II available from author) on tactic responses (oriented movement of a cell, cell group, or whole organism in response to stimuli). The simulation programs are useful in helping students examine mechanisms at work in real organisms. (JN)
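
    The Apple II listings are not reproduced in the record; as a rough sketch of one tactic-response mechanism such programs model, a biased random walk toward a stimulus can be written in a few lines of Python (all parameters are illustrative):

      import random

      def biased_walk(steps=200, bias=0.7, source=50.0):
          """1-D walk: with probability `bias` the organism steps toward
          the stimulus source, otherwise away (illustrative toy model)."""
          x = 0.0
          for _ in range(steps):
              toward = 1.0 if x < source else -1.0
              x += toward if random.random() < bias else -toward
          return x

      # With bias > 0.5 the walker drifts toward the source at x = 50.
      print(sum(biased_walk() for _ in range(100)) / 100)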

  12. Quantum analogue computing.

    PubMed

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
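
    To make the scaling argument concrete, here is a short Python sketch (ours, not the authors'): storing the state of n qubits on a classical machine under the direct Hilbert-space mapping described above takes 2^n complex amplitudes, so each extra qubit doubles the memory, the digital mirror of "an extra bit of precision doubles the size of the computer":

      import numpy as np

      def state_vector_bytes(n_qubits):
          """Memory for a full 2**n complex state vector (complex128)."""
          return (2 ** n_qubits) * np.dtype(np.complex128).itemsize

      for n in (10, 20, 30, 40):
          print(f"{n} qubits: {state_vector_bytes(n) / 2**30:.3f} GiB")
      # Each additional qubit doubles the storage: the exponential cost
      # of a direct (analogue-style) encoding on digital hardware.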

  13. The Role of Computer Simulation in Nanoporous Metals—A Review

    PubMed Central

    Xia, Re; Wu, Run Ni; Liu, Yi Lun; Sun, Xiao Yu

    2015-01-01

    Nanoporous metals (NPMs) have proven to be all-round candidates for versatile and diverse applications. In this decade, interest has grown in the fabrication, characterization and applications of these intriguing materials. Most existing reviews focus on the experimental and theoretical work rather than on numerical simulation. In fact, alongside the numerous experiments and theoretical analyses, studies based on computer simulation, which can model complex microstructures in more realistic ways, play a key role in understanding and predicting the behaviors of NPMs. In this review, we present a comprehensive overview of the computer simulations of NPMs prepared through chemical dealloying. Firstly, we summarize the various simulation approaches to the preparation, processing, and basic physical and chemical properties of NPMs, with emphasis on work involving dealloying, coarsening and mechanical properties. Then, we conclude with the latest progress as well as the future challenges in simulation studies. We believe that highlighting the importance of simulations will help to better understand the properties of novel materials and support new scientific research on these materials. PMID:28793491

  14. Promoting Transfer of Mathematics Skills through the Use of a Computer-Based Instructional Simulation Game and Advisement.

    ERIC Educational Resources Information Center

    Van Eck, Richard

    This study looked at the effect of contextual advisement and competition on transfer of mathematics skills in a computer-based instructional simulation game and simulation in which game participants helped their "aunt and uncle" fix up a house. Competition referred to whether or not the participant was playing against a computer…

  15. Designing Online Scaffolds for Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Chen, Ching-Huei; Wu, I-Chia; Jen, Fen-Lan

    2013-01-01

    The purpose of this study was to examine the effectiveness of online scaffolds in computer simulation to facilitate students' science learning. We first introduced online scaffolds to assist and model students' science learning and to demonstrate how a system embedded with online scaffolds can be designed and implemented to help high school…

  16. Modeling Education on the Real World.

    ERIC Educational Resources Information Center

    Hunter, Beverly

    1983-01-01

    Discusses educational applications of computer simulation and model building for grades K to 8, with emphasis on the usefulness of the computer simulation language, micro-DYNAMO, for programing and understanding the models which help to explain social and natural phenomena. A new textbook for junior-senior high school students is noted. (EAO)

  17. Balancing Curricular and Pedagogical Needs in Computational Construction Kits: Lessons from the DeltaTick Project

    ERIC Educational Resources Information Center

    Wilkerson-Jerde, Michelle; Wagh, Aditi; Wilensky, Uri

    2015-01-01

    To successfully integrate simulation and computational methods into K-12 STEM education, learning environments should be designed to help educators maintain balance between (a) addressing curricular content and practices and (b) attending to student knowledge and interests. We describe DeltaTick, a graphical simulation construction interface for…

  18. An Experiment in the Use of Computer-Based Education to Teach Energy Considerations in Architectural Design.

    ERIC Educational Resources Information Center

    Arumi, Francisco N.

    Computer programs capable of describing the thermal behavior of buildings are used to help architectural students understand environmental systems. The Numerical Simulation Laboratory at the Architectural School of the University of Texas at Austin was developed to provide the necessary software capable of simulating the energy transactions…

  19. Teaching emergency medical services management skills using a computer simulation exercise.

    PubMed

    Hubble, Michael W; Richards, Michael E; Wilfong, Denise

    2011-02-01

    Simulation exercises have long been used to teach management skills in business schools. However, this pedagogical approach has not been reported in emergency medical services (EMS) management education. We sought to develop, deploy, and evaluate a computerized simulation exercise for teaching EMS management skills. Using historical data, a computer simulation model of a regional EMS system was developed. After validation, the simulation was used in an EMS management course. Using historical operational and financial data of the EMS system under study, students designed an EMS system and prepared a budget based on their design. The design of each group was entered into the model that simulated the performance of the EMS system. Students were evaluated on operational and financial performance of their system design and budget accuracy and then surveyed about their experiences with the exercise. The model accurately simulated the performance of the real-world EMS system on which it was based. The exercise helped students identify operational inefficiencies in their system designs and highlighted budget inaccuracies. Most students rated the exercise as moderately or very realistic in ambulance deployment scheduling, budgeting, personnel cost calculations, demand forecasting, system design, and revenue projections. All students indicated the exercise was helpful in gaining a top management perspective, and 89% stated the exercise was helpful in bridging the gap between theory and reality. Preliminary experience with a computer simulator to teach EMS management skills was well received by students in a baccalaureate paramedic program and seems to be a valuable teaching tool.
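
    The authors' regional model is built from historical data and is not reproduced here; as a toy illustration of the style of model, the Python sketch below simulates ambulance availability under Poisson call arrivals (fleet size, call rate, and service time are made-up numbers, not values from the study):

      import heapq
      import random

      def simulate_ems(n_ambulances=3, calls_per_hr=4.0, service_hr=1.0,
                       horizon_hr=1000.0):
          """Toy EMS queue: Poisson arrivals, exponential service times.
          Returns the mean wait (hours) for an ambulance to free up."""
          free_at = [0.0] * n_ambulances   # min-heap: time each unit frees up
          heapq.heapify(free_at)
          t, waits = 0.0, []
          while t < horizon_hr:
              t += random.expovariate(calls_per_hr)      # next call arrives
              unit_free = heapq.heappop(free_at)
              start = max(t, unit_free)                  # wait if all units busy
              waits.append(start - t)
              heapq.heappush(free_at, start + random.expovariate(1.0 / service_hr))
          return sum(waits) / len(waits)

      print(f"mean wait for a unit: {simulate_ems():.3f} h")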

  20. Qualitative Assessment of a 3D Simulation Program: Faculty, Students, and Bio-Organic Reaction Animations

    ERIC Educational Resources Information Center

    Günersel, Adalet B.; Fleming, Steven A.

    2013-01-01

    Research shows that computer-based simulations and animations are especially helpful in fields such as chemistry where concepts are abstract and cannot be directly observed. Bio-Organic Reaction Animations (BioORA) is a freely available 3D visualization software program developed to help students understand the chemistry of biomolecular events.…

  21. Large Eddy Simulation of a Wind Turbine Airfoil at High Freestream-Flow Angle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2015-04-13

    A simulation of the airflow over a section of a wind turbine blade, run on the supercomputer Mira at the Argonne Leadership Computing Facility. Simulations like these help identify ways to make turbine blades more efficient.

  22. Large Eddy Simulation of a Wind Turbine Airfoil at High Freestream-Flow Angle

    ScienceCinema

    None

    2018-02-07

    A simulation of the airflow over a section of a wind turbine blade, run on the supercomputer Mira at the Argonne Leadership Computing Facility. Simulations like these help identify ways to make turbine blades more efficient.

  23. Improving a Computer Networks Course Using the Partov Simulation Engine

    ERIC Educational Resources Information Center

    Momeni, B.; Kharrazi, M.

    2012-01-01

    Computer networks courses are hard to teach as there are many details in the protocols and techniques involved that are difficult to grasp. Employing programming assignments as part of the course helps students to obtain a better understanding and gain further insight into the theoretical lectures. In this paper, the Partov simulation engine and…

  24. Creating Science Simulations through Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Basawapatna, Ashok Ram

    2012-01-01

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…

  25. Developing iPad-Based Physics Simulations That Can Help People Learn Newtonian Physics Concepts

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    The aims of this study are: (1) to develop iPad-based computer simulations called iSimPhysics that can help people learn Newtonian physics concepts; and (2) to assess its educational benefits and pedagogical usefulness. To facilitate learning, iSimPhysics visualizes abstract physics concepts, and allows for conducting a series of computer…

  26. VERIFICATION OF THE HYDROLOGIC EVALUATION OF LANDFILL PERFORMANCE (HELP) MODEL USING FIELD DATA

    EPA Science Inventory

    The report describes a study conducted to verify the Hydrologic Evaluation of Landfill Performance (HELP) computer model using existing field data from a total of 20 landfill cells at 7 sites in the United States. Simulations using the HELP model were run to compare the predicted...

  27. Simulation with Python on transverse modes of the symmetric confocal resonator

    NASA Astrophysics Data System (ADS)

    Wang, Qing Hua; Qi, Jing; Ji, Yun Jing; Song, Yang; Li, Zhenhua

    2017-08-01

    Python is a popular open-source programming language that can be used to simulate various optical phenomena. We have developed a suite of programs to help teach a course on laser principles. The complicated transverse modes of the symmetric confocal resonator can be visualized on personal computers, which helps students understand the pattern distribution of the laser resonator.

  28. Viscous computations of cold air/air flow around scramjet nozzle afterbody

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay; Engelund, Walter C.

    1991-01-01

    The flow field in and around the nozzle afterbody section of a hypersonic vehicle was computationally simulated. The compressible, Reynolds-averaged Navier-Stokes equations were solved by an implicit, finite-volume, characteristic-based method. The computational grids were adapted to the flow as the solutions developed in order to improve the accuracy. The exhaust gases were assumed to be cold. The computational results were obtained for the two-dimensional longitudinal plane located at the half span of the internal portion of the nozzle for overexpanded and underexpanded conditions. Another set of results was obtained from three-dimensional simulations performed for a half-span nozzle. The surface pressures were successfully compared with the data obtained from the wind tunnel tests. The results help in understanding this complex flow field and, in turn, should help the design of the nozzle afterbody section.

  29. SIMPLAS: A Simulation of Bacterial Plasmid Maintenance.

    ERIC Educational Resources Information Center

    Dunn, A.; And Others

    1988-01-01

    This article describes a computer simulation of bacterial physiology during growth in a chemostat. The program was designed to help students to appreciate and understand the related effects of parameters which influence plasmid persistence in bacterial populations. (CW)

  30. [Computer simulation by passenger wound analysis of vehicle collision].

    PubMed

    Zou, Dong-Hua; Liu, Ning-Guo; Shen, Jie; Zhang, Xiao-Yun; Jin, Xian-Long; Chen, Yi-Jiu

    2006-08-15

    To reconstruct the course of a vehicle collision and thereby provide a reference for forensic identification and the handling of traffic accidents. By analyzing evidence left both on passengers and on vehicles, a momentum-impulse technique combined with multibody dynamics was applied to simulate the motion and injury of the passengers as well as the track of the vehicles. The computer simulation model faithfully reconstructed the phases of the traffic collision, which coincided with details found by the forensic investigation. Computer simulation is helpful and feasible for forensic identification in traffic accidents.
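
    The paper's multibody models are not reproduced here; as a toy illustration of the momentum-impulse principle it applies, the sketch below recovers a pre-impact speed from a perfectly plastic one-dimensional impact (the masses and post-impact speed are invented numbers):

      def pre_impact_speed(m_vehicle, m_pedestrian, v_common):
          """1-D momentum conservation for a perfectly plastic impact:
          m_v * v0 = (m_v + m_p) * v_common, solved for v0.
          (Toy model; real reconstructions use full multibody dynamics.)"""
          return (m_vehicle + m_pedestrian) * v_common / m_vehicle

      # Hypothetical numbers: 1500 kg car, 70 kg pedestrian carried along
      # at a common post-impact velocity of 10 m/s.
      v0 = pre_impact_speed(1500.0, 70.0, 10.0)
      print(f"estimated pre-impact speed: {v0:.1f} m/s")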

  31. The Energy-Environment Simulator as a Classroom Aid.

    ERIC Educational Resources Information Center

    Sell, Nancy J.; Van Koevering, Thomas E.

    1981-01-01

    Energy-Environment Simulators, provided by the U.S. Department of Energy, can be used to help individuals experience the effects of unbridled energy consumption for the next century on a national or worldwide scale. The simulator described is a specially designed analog computer which models the real-world energy situation. (MP)

  32. Building an intelligent tutoring system for procedural domains

    NASA Technical Reports Server (NTRS)

    Warinner, Andrew; Barbee, Diann; Brandt, Larry; Chen, Tom; Maguire, John

    1990-01-01

    Jobs that require complex skills that are too expensive or dangerous to develop often use simulators in training. The strength of a simulator is its ability to mimic the 'real world', allowing students to explore and experiment. A good simulation helps the student develop a 'mental model' of the real world. The closer the simulation is to 'real life', the fewer difficulties there are in transferring skills and mental models developed on the simulator to the real job. As graphics workstations increase in power and become more affordable, they become attractive candidates for developing computer-based simulations for use in training. Computer-based simulations can make training more interesting and accessible to the student.

  33. Impulse-Momentum Diagrams

    ERIC Educational Resources Information Center

    Rosengrant, David

    2011-01-01

    Multiple representations are a valuable tool to help students learn and understand physics concepts. Furthermore, representations help students learn how to think and act like real scientists. These representations include: pictures, free-body diagrams, energy bar charts, electrical circuits, and, more recently, computer simulations and…

  34. Computer Simulation of Mutagenesis.

    ERIC Educational Resources Information Center

    North, J. C.; Dent, M. T.

    1978-01-01

    A FORTRAN program is described which simulates point-substitution mutations in the DNA strands of typical organisms. Its objective is to help students to understand the significance and structure of the genetic code, and the mechanisms and effect of mutagenesis. (Author/BB)
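
    The original FORTRAN listing is not in the record; a minimal modern sketch of the same idea, random point substitutions in a DNA strand and their effect on the encoded peptide, might look like this in Python (the strand and the genetic-code subset are illustrative):

      import random

      BASES = "ACGT"
      # Tiny subset of the standard genetic code, enough for the demo strand.
      CODON_TABLE = {"ATG": "Met", "AAA": "Lys", "AAG": "Lys",
                     "TTT": "Phe", "TAA": "STOP", "TAG": "STOP"}

      def point_mutate(dna):
          """Substitute one random base with a different random base."""
          i = random.randrange(len(dna))
          new = random.choice([b for b in BASES if b != dna[i]])
          return dna[:i] + new + dna[i + 1:]

      def translate(dna):
          codons = [dna[i:i + 3] for i in range(0, len(dna) - 2, 3)]
          return [CODON_TABLE.get(c, "???") for c in codons]

      strand = "ATGAAATTT"                 # Met-Lys-Phe (illustrative)
      mutant = point_mutate(strand)
      print(strand, translate(strand))
      print(mutant, translate(mutant))     # silent, missense, or nonsense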

  35. Particle-In-Cell simulations of high pressure plasmas using graphics processing units

    NASA Astrophysics Data System (ADS)

    Gebhardt, Markus; Atteln, Frank; Brinkmann, Ralf Peter; Mussenbrock, Thomas; Mertmann, Philipp; Awakowicz, Peter

    2009-10-01

    Particle-In-Cell (PIC) simulations are widely used to understand the fundamental phenomena in low-temperature plasmas. Particularly plasmas at very low gas pressures are studied using PIC methods. The inherent drawback of these methods is that they are very time consuming, since certain stability conditions have to be satisfied. This holds even more for the PIC simulation of high-pressure plasmas due to the very high collision rates. The simulations take a very long time to run on standard computers and require the help of computer clusters or supercomputers. Recent advances in the field of graphics processing units (GPUs) provide every personal computer with a highly parallel multiprocessor architecture for very little money. This architecture is freely programmable and can be used to implement a wide class of problems. In this paper we present the concepts of a fully parallel PIC simulation of high-pressure plasmas using the benefits of GPU programming.
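
    The stability conditions the abstract alludes to can be made concrete with a few lines of Python (a sketch using the standard explicit-PIC estimates, not the authors' code; the 0.2 safety factor and the densities are illustrative): an explicit PIC step must resolve the plasma frequency, and the cell size must resolve the Debye length, so the admissible timestep shrinks as the density rises.

      import math

      EPS0, E, ME = 8.854e-12, 1.602e-19, 9.109e-31   # SI constants

      def pic_constraints(n_e, T_e_eV, safety=0.2):
          """Timestep and cell-size limits for explicit PIC:
          dt < safety / w_pe and dx < Debye length (standard estimates)."""
          w_pe = math.sqrt(n_e * E ** 2 / (EPS0 * ME))        # rad/s
          lambda_d = math.sqrt(EPS0 * T_e_eV / (n_e * E))     # m
          return safety / w_pe, lambda_d

      for n_e in (1e16, 1e18, 1e20):      # electron densities in m^-3
          dt, dx = pic_constraints(n_e, T_e_eV=3.0)
          print(f"n_e = {n_e:.0e}: dt < {dt:.2e} s, dx < {dx:.2e} m")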

  36. Making Water Pollution a Problem in the Classroom Through Computer Assisted Instruction.

    ERIC Educational Resources Information Center

    Flowers, John D.

    Alternative means for dealing with water pollution control are presented for students and teachers. One computer oriented program is described in terms of teaching wastewater treatment and pollution concepts to middle and secondary school students. Suggestions are given to help teachers use a computer simulation program in their classrooms.…

  37. [The characteristics of computer simulation of traffic accidents].

    PubMed

    Zou, Dong-Hua; Liu, Ning-Guo; Chen, Jian-Guo; Jin, Xian-Long; Zhang, Xiao-Yun; Zhang, Jian-Hua; Chen, Yi-Jiu

    2008-12-01

    To reconstruct the collision process of traffic accidents and the injury mechanisms of the victims by computer simulation technology in the forensic assessment of traffic accidents. Forty actual accidents were reconstructed with simulation software and a high-performance computer based on analysis of the trace evidence at the scene, the damage to the vehicles, and the injuries of the victims, with 2 cases discussed in detail. The reconstruction correlated with the above parameters very well in 28 cases, well in 9 cases, and suboptimally in 3 cases. Accurate reconstruction of an accident is helpful for assessing the injury mechanism of the victims. Reconstructing the collision process of a traffic accident and the injury mechanism of the victim by computer simulation is useful in traffic accident assessment.

  38. Computer-Based Simulation Systems and Role-Playing: An Effective Combination for Fostering Conditional Knowledge.

    ERIC Educational Resources Information Center

    Shlechter, Theodore M.; And Others

    1992-01-01

    Examines the effectiveness of SIMNET (Simulation Networking), a virtual reality training simulation system, combined with a program of role-playing activities for helping Army classes to master the conditional knowledge needed for successful field performance. The value of active forms of learning for promoting higher order cognitive thinking is…

  39. Models and Methods for Adaptive Management of Individual and Team-Based Training Using a Simulator

    NASA Astrophysics Data System (ADS)

    Lisitsyna, L. S.; Smetyuh, N. P.; Golikov, S. P.

    2017-05-01

    An analysis of research on adaptive individual and team-based training, both in Russia and abroad, shows that individual and team-based training and retraining of AASTM operators usually includes: production training; training in general computer and office equipment skills; and simulator training, including virtual simulators that use computers to model real-world manufacturing situations, with the evaluation of AASTM operators' knowledge, as a rule, determined by the completeness and adequacy of their actions under the simulated conditions. Such an approach to the training and retraining of AASTM operators provides only technical training and tests their knowledge by assessing their actions in a simulated environment.

  40. A computational model for simulating text comprehension.

    PubMed

    Lemaire, Benoît; Denhière, Guy; Bellissens, Cédrick; Jhean-Larose, Sandra

    2006-11-01

    In the present article, we outline the architecture of a computer program for simulating the process by which humans comprehend texts. The program is based on psycholinguistic theories about human memory and text comprehension processes, such as the construction-integration model (Kintsch, 1998), the latent semantic analysis theory of knowledge representation (Landauer & Dumais, 1997), and the predication algorithms (Kintsch, 2001; Lemaire & Bianco, 2003), and it is intended to help psycholinguists investigate the way humans comprehend texts.
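
    The program's trained semantic space is not available here, but one ingredient it builds on, comparing texts as vectors by cosine similarity as in latent semantic analysis, can be sketched in a few lines of Python (the "semantic" vectors below are made up for illustration):

      import numpy as np

      def cosine(u, v):
          """Cosine similarity, the standard LSA relatedness measure."""
          return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

      # Hypothetical 4-dimensional semantic vectors for three words.
      vectors = {"doctor": np.array([0.9, 0.1, 0.3, 0.0]),
                 "nurse":  np.array([0.8, 0.2, 0.4, 0.1]),
                 "banana": np.array([0.0, 0.9, 0.0, 0.7])}

      print(cosine(vectors["doctor"], vectors["nurse"]))   # high: related
      print(cosine(vectors["doctor"], vectors["banana"]))  # low: unrelated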

  41. Computer Simulation For Design Of TWT's

    NASA Technical Reports Server (NTRS)

    Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard

    1992-01-01

    A three-dimensional finite-element analytical technique facilitates design and fabrication of traveling-wave-tube (TWT) slow-wave structures. Used to perform thermal and mechanical analyses of TWT designed with variety of configurations, geometries, and materials. Using three-dimensional computer analysis, designer able to simulate building and testing of TWT, with consequent substantial saving of time and money. Technique enables detailed look into operation of traveling-wave tubes to help improve performance for future communications systems.

  42. Continuum Gyrokinetic Simulations of Turbulence in a Helical Model SOL with NSTX-type parameters

    NASA Astrophysics Data System (ADS)

    Hammett, G. W.; Shi, E. L.; Hakim, A.; Stoltzfus-Dueck, T.

    2017-10-01

    We have developed the Gkeyll code to carry out 3D2V full-F gyrokinetic simulations of electrostatic plasma turbulence in open-field-line geometries, using special versions of discontinuous-Galerkin algorithms to help with the computational challenges of the edge region. (Higher-order algorithms can also be helpful for exascale computing as they reduce the ratio of communications to computations.) Our first simulations with straight field lines were done for LAPD-type cases. Here we extend this to a helical model of an SOL plasma and show results for NSTX-type parameters. These simulations include the basic elements of a scrape-off layer: bad-curvature/interchange drive of instabilities, narrow sources to model plasma leaking from the core, and parallel losses with model sheath boundary conditions (our model allows currents to flow in and out of the walls). The formation of blobs is observed. By reducing the strength of the poloidal magnetic field, the heat flux at the divertor plate is observed to broaden. Supported by the Max-Planck/Princeton Center for Plasma Physics, the SciDAC Center for the Study of Plasma Microturbulence, and DOE Contract DE-AC02-09CH11466.

  43. Supporting Undergraduate Computer Architecture Students Using a Visual MIPS64 CPU Simulator

    ERIC Educational Resources Information Center

    Patti, D.; Spadaccini, A.; Palesi, M.; Fazzino, F.; Catania, V.

    2012-01-01

    The topics of computer architecture are always taught using an Assembly dialect as an example. The most commonly used textbooks in this field use the MIPS64 Instruction Set Architecture (ISA) to help students in learning the fundamentals of computer architecture because of its orthogonality and its suitability for real-world applications. This…

  44. Computer Literacy for Life Sciences: Helping the Digital-Era Biology Undergraduates Face Today's Research

    ERIC Educational Resources Information Center

    Smolinski, Tomasz G.

    2010-01-01

    Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of…

  45. Ambient Assisted Living spaces validation by services and devices simulation.

    PubMed

    Fernández-Llatas, Carlos; Mocholí, Juan Bautista; Sala, Pilar; Naranjo, Juan Carlos; Pileggi, Salvatore F; Guillén, Sergio; Traver, Vicente

    2011-01-01

    The design of Ambient Assisted Living (AAL) products is a very demanding challenge. AAL product creation is a complex iterative process that must satisfy exhaustive prerequisites for accessibility and usability. In this process the early detection of errors is crucial to creating cost-effective systems. Computer-assisted tools can be a vital help to usability designers in avoiding design errors. Specifically, computer simulation of products in AAL environments can be used in all the design phases to support validation. In this paper, a computer simulation tool for supporting usability designers in the creation of innovative AAL products is presented. This application will benefit their work by saving time and improving the final system's functionality.

  46. Relation of Parallel Discrete Event Simulation algorithms with physical models

    NASA Astrophysics Data System (ADS)

    Shchur, L. N.; Shchur, L. V.

    2015-09-01

    We extend the concept of local simulation times in parallel discrete event simulation (PDES) in order to take into account the architecture of current hardware and software in high-performance computing. We briefly review previous research on the mapping of PDES onto physical problems, and emphasise how physical results may help to predict the behaviour of parallel algorithms.
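
    To make the notion of local simulation times concrete, here is a minimal Python sketch (a toy conservative scheme, not the authors' algorithm): each logical process (LP) keeps its own event queue and clock, and the clocks advance asynchronously as events are executed in timestamp order.

      import heapq

      # Each logical process (LP) has its own event queue and local clock.
      queues = {0: [(1.0, "a0"), (4.0, "a1")], 1: [(2.0, "b0"), (3.0, "b1")]}
      for q in queues.values():
          heapq.heapify(q)
      local_time = {0: 0.0, 1: 0.0}

      # Conservative toy rule: always execute the globally smallest timestamp,
      # so no LP can ever receive an event from its own past.
      while any(queues.values()):
          lp = min((k for k in queues if queues[k]), key=lambda k: queues[k][0][0])
          ts, ev = heapq.heappop(queues[lp])
          local_time[lp] = ts
          print(f"LP{lp} executes {ev} at t={ts}; local clocks: {local_time}")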

  47. Games, Simulations, and Visual Metaphors in Education: Antagonism between Enjoyment and Learning

    ERIC Educational Resources Information Center

    Rieber, Lloyd P.; Noah, David

    2008-01-01

    The purpose of this study was to investigate the influence of game-like activities on adult learning during a computer-based simulation. This research also studied the use of visual metaphors as graphic organizers to help make the underlying science principles explicit without interfering with the interactive nature of the simulation. A total of…

  48. CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences

    NASA Technical Reports Server (NTRS)

    Slotnick, Jeffrey; Khodadoust, Abdollah; Alonso, Juan; Darmofal, David; Gropp, William; Lurie, Elizabeth; Mavriplis, Dimitri

    2014-01-01

    This report documents the results of a study to address the long-range strategic planning required by NASA's Revolutionary Computational Aerosciences (RCA) program in the area of computational fluid dynamics (CFD), including future software and hardware requirements for High Performance Computing (HPC). Specifically, the "Vision 2030" CFD study is to provide a knowledge-based forecast of the future computational capabilities required for turbulent, transitional, and reacting flow simulations across a broad Mach number regime, and to lay the foundation for the development of a future framework and/or environment where physics-based, accurate predictions of complex turbulent flows, including flow separation, can be accomplished routinely and efficiently in cooperation with other physics-based simulations to enable multi-physics analysis and design. Specific technical requirements from the aerospace industrial and scientific communities were obtained to determine critical capability gaps, anticipated technical challenges, and impediments to achieving the target CFD capability in 2030. A preliminary development plan and roadmap were created to focus investments in technology development and help achieve the CFD vision by 2030.

  49. Development of Multimedia Computer Applications for Clinical Pharmacy Training.

    ERIC Educational Resources Information Center

    Schlict, John R.; Livengood, Bruce; Shepherd, John

    1997-01-01

    Computer simulations in clinical pharmacy education help expose students to clinical patient management earlier and enable training of large numbers of students outside conventional clinical practice sites. Multimedia instruction and its application to pharmacy training are described, the general process for developing multimedia presentations is…

  50. Conversations with AutoTutor Help Students Learn

    ERIC Educational Resources Information Center

    Graesser, Arthur C.

    2016-01-01

    AutoTutor helps students learn by holding a conversation in natural language. AutoTutor is adaptive to the learners' actions, verbal contributions, and in some systems their emotions. Many of AutoTutor's conversation patterns simulate human tutoring, but other patterns implement ideal pedagogies that open the door to computer tutors eclipsing…

  51. Fields of Fuel

    ERIC Educational Resources Information Center

    Russ, Rosemary S.; Wangen, Steve; Nye, D. Leith; Shapiro, R. Benjamin; Strinz, Will; Ferris, Michael

    2015-01-01

    To help teachers engage students in discussions about sustainability, the authors designed Fields of Fuel, a multiplayer, web-based simulation game that allows players to explore the environmental and economic trade-offs of a realistic sustainable system. Computer-based simulations of real-world phenomena engage students and have been shown to…

  52. University Macro Analytic Simulation Model.

    ERIC Educational Resources Information Center

    Baron, Robert; Gulko, Warren

    The University Macro Analytic Simulation System (UMASS) has been designed as a forecasting tool to help university administrators make budgeting decisions. Alternative budgeting strategies can be tested on a computer model, and an operational alternative can then be selected on the basis of the most desirable projected outcome. UMASS uses readily…

  53. Interactive Computation for Undergraduates: The Next Generation

    NASA Astrophysics Data System (ADS)

    Kolan, Amy J.

    2017-05-01

    A generation ago (29 years ago), Leo Kadanoff and Michael Vinson created the Computers, Chaos, and Physics course. A major pedagogical thrust of this course was to help students form and test hypotheses via computer simulation of small problems in physics. Recently, this aspect of the 1987 course has been revived for use with first year physics undergraduate students at St. Olaf College.

  54. Remediating Physics Misconceptions Using an Analogy-Based Computer Tutor. Draft.

    ERIC Educational Resources Information Center

    Murray, Tom; And Others

    Described is a computer tutor designed to help students gain a qualitative understanding of important physics concepts. The tutor simulates a teaching strategy called "bridging analogies" that previous research has demonstrated to be successful in one-on-one tutoring and written explanation studies. The strategy is designed to remedy…

  55. Using Computers for Research into Social Relations.

    ERIC Educational Resources Information Center

    Holden, George W.

    1988-01-01

    Discusses computer-presented social situations (CPSS), i.e., microcomputer-based simulations developed to provide a new methodological tool for social scientists interested in the study of social relations. Two CPSSs are described: DaySim, used to help identify types of parenting; and DateSim, used to study interpersonal attraction. (21…

  56. KINPLOT: An Interactive Pharmacokinetics Graphics Program for Digital Computers.

    ERIC Educational Resources Information Center

    Wilson, Robert C.; And Others

    1982-01-01

    Inability to see the relevance of mathematics to understanding the time course of drugs in the body may discourage interest in pharmacokinetics. A UNC-developed computer graphics simulation program helps visualize the nature of pharmacokinetic-patient interactions, generates classroom handouts, and is used in the pharmaceuticals industry to…
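
    KINPLOT itself is not reproduced in the record; the kind of relationship it helps students visualize can be sketched with the standard one-compartment IV bolus model C(t) = (D/V)·e^(-kt), here in Python with illustrative parameter values:

      import numpy as np

      def concentration(t, dose=500.0, volume=40.0, k_elim=0.17):
          """One-compartment IV bolus model: C(t) = (D/V) * exp(-k*t).
          Dose in mg, volume of distribution in L, k in 1/h (illustrative)."""
          return (dose / volume) * np.exp(-k_elim * t)

      t = np.linspace(0, 24, 9)                       # hours
      for ti, ci in zip(t, concentration(t)):
          print(f"t = {ti:5.1f} h   C = {ci:6.2f} mg/L")
      # The half-life follows from k: t_1/2 = ln(2)/k, about 4.1 h here.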

  57. Design and Performance Frameworks for Constructing Problem-Solving Simulations

    ERIC Educational Resources Information Center

    Stevens, Rons; Palacio-Cayetano, Joycelin

    2003-01-01

    Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks…

  58. Kennedy Space Center ITC-1 Internship Overview

    NASA Technical Reports Server (NTRS)

    Ni, Marcus

    2011-01-01

    As an intern for Priscilla Elfrey in the ITC-1 department, I was involved in many activities that have helped me to develop many new skills. I supported four different projects during my internship, which included the Center for Life Cycle Design (CfLCD), SISO Space Interoperability Smackdown, RTI Teacher Mentor Program, and the Discrete Event Simulation Integrated Visualization Environment Team (DIVE). I provided the CfLCD with web based research on cyber security initiatives involving simulation, education for young children, cloud computing, Otronicon, and Science, Technology, Engineering, and Mathematics (STEM) education initiatives. I also attended STEM meetings regarding simulation courses, and educational course enhancements. To further improve the SISO Simulation event, I provided observation feedback to the technical advisory board. I also helped to set up a chat federation for HLA. The third project involved the RTI Teacher Mentor program, which I helped to organize. Last, but not least, I worked with the DIVE team to develop new software to help visualize discrete event simulations. All of these projects have provided experience on an interdisciplinary level ranging from speech and communication to solving complex problems using math and science.

  59. Multi-scale modeling in cell biology

    PubMed Central

    Meier-Schellersheim, Martin; Fraser, Iain D. C.; Klauschen, Frederick

    2009-01-01

    Biomedical research frequently involves performing experiments and developing hypotheses that link different scales of biological systems such as, for instance, the scales of intracellular molecular interactions to the scale of cellular behavior and beyond to the behavior of cell populations. Computational modeling efforts that aim at exploring such multi-scale systems quantitatively with the help of simulations have to incorporate several different simulation techniques due to the different time and space scales involved. Here, we provide a non-technical overview of how different scales of experimental research can be combined with the appropriate computational modeling techniques. We also show that current modeling software permits building and simulating multi-scale models without having to become involved with the underlying technical details of computational modeling. PMID:20448808

  60. Spacecraft orbit/earth scan derivations, associated APL program, and application to IMP-6

    NASA Technical Reports Server (NTRS)

    Smith, G. A.

    1971-01-01

    The derivation of a time-shared, remote-site, demand-processed computer program is discussed. The computer program analyzes the effects of selected orbit, attitude, and spacecraft parameters on earth sensor detections of the earth. For prelaunch analysis, the program may be used to simulate effects in nominal parameters which are used in preparing attitude data processing programs. After launch, comparison of results from a simulation and from satellite data will produce deviations helpful in isolating problems.

  61. Development of a Web Based Simulating System for Earthquake Modeling on the Grid

    NASA Astrophysics Data System (ADS)

    Seber, D.; Youn, C.; Kaiser, T.

    2007-12-01

    Existing cyberinfrastructure-based information, data and computational networks now allow development of state-of-the-art, user-friendly simulation environments that democratize access to high-end computational environments and provide new research opportunities for many research and educational communities. Within the Geosciences cyberinfrastructure network, GEON, we have developed the SYNSEIS (SYNthetic SEISmogram) toolkit to enable efficient computations of 2D and 3D seismic waveforms for a variety of research purposes, especially to help analyze EarthScope's USArray seismic data in a speedy and efficient environment. The underlying simulation software in SYNSEIS is a finite-difference code, E3D, developed by LLNL (S. Larsen). The code is embedded within the SYNSEIS portlet environment, and our toolkit uses it to simulate seismic waveforms of earthquakes at regional distances (<1000 km). Architecturally, SYNSEIS uses both Web Service and Grid computing resources in a portal-based work environment and has a built-in access mechanism to connect to national supercomputer centers as well as to a dedicated, small-scale compute cluster for its runs. Even though Grid computing is well established in many computing communities, its use among domain scientists is still not trivial because of the multiple levels of complexity encountered. We grid-enabled E3D using our own XML input dialect, whose inputs include geological models that are accessible through standard Web services within the GEON network. The XML inputs for this application contain structural geometries, source parameters, seismic velocity, density, attenuation values, the number of time steps to compute, and the number of stations. By enabling portal-based access to such a computational environment, coupled with its dynamic user interface, we enable a large user community to take advantage of such high-end calculations in their research and educational activities. Our system can be used to promote an efficient and effective modeling environment to help scientists as well as educators in their daily activities and speed up the scientific discovery process.
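
    E3D itself is a full 2D/3D elastic finite-difference code; as a minimal illustration of the finite-difference principle underlying such simulations, the Python sketch below steps the 1-D scalar wave equation with the standard second-order scheme (grid spacing, wave speed, and source wavelet are illustrative):

      import numpy as np

      # 1-D wave equation u_tt = c^2 u_xx, explicit second-order differences.
      nx, nt, c, dx = 300, 600, 3000.0, 10.0      # points, steps, m/s, m
      dt = 0.5 * dx / c                           # CFL condition: c*dt/dx <= 1
      u_prev, u, u_next = (np.zeros(nx) for _ in range(3))

      for n in range(nt):
          u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                          + (c * dt / dx) ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
          u_next[nx // 2] += np.exp(-((n * dt - 0.05) / 0.01) ** 2)  # source
          u_prev, u = u.copy(), u_next.copy()

      print("peak amplitude on the grid:", float(np.abs(u).max()))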

  62. A Teaching--Learning Sequence on Free Fall Motion

    ERIC Educational Resources Information Center

    Borghi, L.; De Ambrosis, A.; Lamberti, N.; Mascheretti, P.

    2005-01-01

    A teaching--learning sequence is presented that is designed to help high school pupils gain awareness about the independence of the vertical and horizontal components of free fall motion. The approach we propose is based on the use of experimental activities and computer simulations designed specifically to help pupils reflect on the experiments…

  63. A computational workflow for designing silicon donor qubits

    DOE PAGES

    Humble, Travis S.; Ericson, M. Nance; Jakowski, Jacek; ...

    2016-09-19

    Developing devices that can reliably and accurately demonstrate the principles of superposition and entanglement is an on-going challenge for the quantum computing community. Modeling and simulation offer attractive means of testing early device designs and establishing expectations for operational performance. However, the complex integrated material systems required by quantum device designs are not captured by any single existing computational modeling method. We examine the development and analysis of a multi-staged computational workflow that can be used to design and characterize silicon donor qubit systems with modeling and simulation. Our approach integrates quantum chemistry calculations with electrostatic field solvers to perform detailed simulations of a phosphorus dopant in silicon. We show how atomistic details can be synthesized into an operational model for the logical gates that define quantum computation in this particular technology. In conclusion, the resulting computational workflow realizes a design tool for silicon donor qubits that can help verify and validate current and near-term experimental devices.

  64. Ultrasonic Phased Array Inspection Experiments and Simulations for an Isogrid Structural Element with Cracks

    NASA Astrophysics Data System (ADS)

    Roth, D. J.; Tokars, R. P.; Martin, R. E.; Rauser, R. W.; Aldrin, J. C.; Schumacher, E. J.

    2010-02-01

    In this investigation, a T-shaped aluminum alloy isogrid stiffener element used in aerospace applications was inspected with ultrasonic phased array methods. The isogrid stiffener element had various crack configurations emanating from bolt holes. Computational simulation methods were used to mimic the experiments in order to help understand experimental results. The results of this study indicate that it is at least partly feasible to interrogate this type of geometry with the given flaw configurations using phased array ultrasonics. The simulation methods were critical in helping explain the experimental results and, with some limitation, can be used to predict inspection results.

  65. Meso-scale Computational Investigation of Polyurea Microstructure and Its Role in Shockwave Attenuation/dispersion

    DTIC Science & Technology

    2015-07-01

    To help identify these phenomena and processes, meso-scale coarse-grained simulations of the formation of the meso-segregated microstructure and its interaction with the shockwave are analyzed in the present work … of shockwave-induced hard-domain densification. Keywords: Polyurea; Meso-scale; Coarse-grained simulations; Shockwave attenuation; shockwave…

  66. The SCEC/USGS dynamic earthquake rupture code verification exercise

    USGS Publications Warehouse

    Harris, R.A.; Barall, M.; Archuleta, R.; Dunham, E.; Aagaard, Brad T.; Ampuero, J.-P.; Bhat, H.; Cruz-Atienza, Victor M.; Dalguer, L.; Dawson, P.; Day, S.; Duan, B.; Ely, G.; Kaneko, Y.; Kase, Y.; Lapusta, N.; Liu, Yajing; Ma, S.; Oglesby, D.; Olsen, K.; Pitarka, A.; Song, S.; Templeton, E.

    2009-01-01

    Numerical simulations of earthquake rupture dynamics are now common, yet it has been difficult to test the validity of these simulations because there have been few field observations and no analytic solutions with which to compare the results. This paper describes the Southern California Earthquake Center/U.S. Geological Survey (SCEC/USGS) Dynamic Earthquake Rupture Code Verification Exercise, where codes that simulate spontaneous rupture dynamics in three dimensions are evaluated and the results produced by these codes are compared using Web-based tools. This is the first time that a broad and rigorous examination of numerous spontaneous rupture codes has been performed—a significant advance in this science. The automated process developed to attain this achievement provides for a future where testing of codes is easily accomplished.Scientists who use computer simulations to understand earthquakes utilize a range of techniques. Most of these assume that earthquakes are caused by slip at depth on faults in the Earth, but hereafter the strategies vary. Among the methods used in earthquake mechanics studies are kinematic approaches and dynamic approaches.The kinematic approach uses a computer code that prescribes the spatial and temporal evolution of slip on the causative fault (or faults). These types of simulations are very helpful, especially since they can be used in seismic data inversions to relate the ground motions recorded in the field to slip on the fault(s) at depth. However, these kinematic solutions generally provide no insight into the physics driving the fault slip or information about why the involved fault(s) slipped that much (or that little). In other words, these kinematic solutions may lack information about the physical dynamics of earthquake rupture that will be most helpful in forecasting future events.To help address this issue, some researchers use computer codes to numerically simulate earthquakes and construct dynamic, spontaneous rupture (hereafter called “spontaneous rupture”) solutions. For these types of numerical simulations, rather than prescribing the slip function at each location on the fault(s), just the friction constitutive properties and initial stress conditions are prescribed. The subsequent stresses and fault slip spontaneously evolve over time as part of the elasto-dynamic solution. Therefore, spontaneous rupture computer simulations of earthquakes allow us to include everything that we know, or think that we know, about earthquake dynamics and to test these ideas against earthquake observations.

  67. Simulating the fate of fall- and spring-applied poultry litter nitrogen in corn production

    USDA-ARS?s Scientific Manuscript database

    Monitoring the fate of N derived from manures applied to fertilize crops is difficult, time consuming, and relatively expensive. But computer simulation models can help understand the interactions among various N processes in the soil-plant system and determine the fate of applied N. The RZWQM2 was ...

  68. Genetic data simulators and their applications: an overview

    PubMed Central

    Peng, Bo; Chen, Huann-Sheng; Mechanic, Leah E.; Racine, Ben; Clarke, John; Gillanders, Elizabeth; Feuer, Eric J.

    2016-01-01

    Computer simulations have played an indispensable role in the development and application of statistical models and methods for genetic studies across multiple disciplines. The need to simulate complex evolutionary scenarios and pseudo-datasets for various studies has fueled the development of dozens of computer programs with varying reliability, performance, and application areas. To help researchers compare and choose the most appropriate simulators for their studies, we have created the Genetic Simulation Resources (GSR) website, which allows authors of simulation software to register their applications and describe them with more than 160 defined attributes. This article summarizes the properties of 93 simulators currently registered at GSR and provides an overview of the development and applications of genetic simulators. Unlike other review articles that address technical issues or compare simulators for particular application areas, we focus on software development, maintenance, and features of simulators, often from a historical perspective. Publications that cite these simulators are used to summarize both the applications of genetic simulations and the utilization of simulators. PMID:25504286
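
    None of the 93 registered simulators is reproduced here, but the simplest thing many of them do, simulating genetic drift, fits in a few lines of Python (a Wright-Fisher sketch with illustrative population size and starting frequency):

      import random

      def wright_fisher(pop_size=100, freq=0.5, generations=200):
          """Track one allele's frequency under pure drift: each generation,
          2N gametes are drawn from the current allele frequency."""
          for _ in range(generations):
              count = sum(random.random() < freq for _ in range(2 * pop_size))
              freq = count / (2 * pop_size)
              if freq in (0.0, 1.0):       # allele lost or fixed
                  break
          return freq

      runs = [wright_fisher() for _ in range(500)]
      print("fixed:", sum(f == 1.0 for f in runs),
            "lost:", sum(f == 0.0 for f in runs))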

  69. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    NASA Astrophysics Data System (ADS)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  70. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure that numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  11. A configurable distributed high-performance computing framework for satellite's TDI-CCD imaging simulation

    NASA Astrophysics Data System (ADS)

    Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang

    2010-11-01

    This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to accommodate multiple algorithms, helping to decrease simulation time at low expense. Imaging simulation for a TDI-CCD mounted on a satellite involves four processes: 1) degradation caused by the atmosphere, 2) degradation caused by the optical system, 3) degradation and re-sampling caused by the TDI-CCD electronics, and 4) data integration. Processes 1) to 3) use diverse data-intensive algorithms such as FFT, convolution, and Lagrange interpolation, which demand substantial CPU power. Even on an Intel Xeon X5550 processor, a conventional serial method takes more than 30 hours for a simulation whose result image size is 1500 * 1462. A literature survey found no mature distributed HPC framework in this field. We therefore developed a distributed computing framework for TDI-CCD imaging simulation, based on WCF [1], that uses a client/server (C/S) architecture and harnesses the free CPU resources in a LAN. The server pushes the tasks of processes 1) to 3) to this free computing capacity, achieving HPC at low cost. In a computing experiment with 4 symmetric nodes and 1 server, the framework reduced simulation time by about 74%, and adding more asymmetric nodes to the computing network decreased the time further. In conclusion, this framework can provide virtually unlimited computing capacity, provided that the network and the task-management server are affordable, and it offers a new HPC solution for TDI-CCD imaging simulation and similar applications.
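
    The framework above is WCF/.NET-specific, but the underlying task-farming pattern is easy to illustrate. The sketch below is a minimal Python analogue under invented assumptions (the tile sizes and the toy FFT-based degradation stage are made up for the example); it is not the paper's implementation.

    ```python
    # Minimal sketch of the task-farming idea (Python analogue, not the WCF code):
    # split the frame into independent tiles and push the data-intensive
    # degradation stage out to every free local core.
    from concurrent.futures import ProcessPoolExecutor

    import numpy as np

    def degrade_tile(tile):
        """Stand-in for one data-intensive stage (here, a toy FFT-based blur)."""
        spectrum = np.fft.fft2(tile)
        spectrum *= np.exp(-0.01 * np.arange(tile.shape[0]))[:, None]  # toy roll-off
        return np.real(np.fft.ifft2(spectrum))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        tiles = [rng.random((256, 256)) for _ in range(16)]  # invented frame tiling
        with ProcessPoolExecutor() as pool:                  # one worker per core
            results = list(pool.map(degrade_tile, tiles))
        print(len(results), results[0].shape)
    ```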

  12. Interval sampling methods and measurement error: a computer simulation.

    PubMed

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
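
    As a concrete illustration of the method being simulated (not the authors' program; every parameter below is invented), the sketch scores one synthetic behavior stream with the three interval methods and compares the estimates with the true proportion of time occupied. It reproduces the known tendencies: partial-interval recording overestimates, whole-interval recording underestimates.

    ```python
    # Sketch of interval-sampling error: momentary time sampling (MTS),
    # partial-interval (PIR), and whole-interval (WIR) recording applied to
    # randomly placed events on a 0.1-s timeline.
    import numpy as np

    rng = np.random.default_rng(1)
    OBS, INTERVAL, N_EVENTS, DUR = 600, 10, 20, 8      # assumed values, seconds
    t = np.zeros(OBS * 10, dtype=bool)                 # 0.1-s resolution timeline
    for s in rng.uniform(0, OBS - DUR, N_EVENTS):
        t[int(s * 10):int((s + DUR) * 10)] = True      # mark event occurrence

    true = t.mean()                                    # true proportion of time
    chunks = t.reshape(-1, INTERVAL * 10)              # one row per interval
    mts = chunks[:, -1].mean()                         # scored at interval's end
    pir = chunks.any(axis=1).mean()                    # any occurrence -> scored
    wir = chunks.all(axis=1).mean()                    # full occupancy -> scored
    print(f"true={true:.3f}  MTS={mts:.3f}  PIR={pir:.3f}  WIR={wir:.3f}")
    ```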

  13. Understanding Cryptic Pocket Formation in Protein Targets by Enhanced Sampling Simulations.

    PubMed

    Oleinikovas, Vladimiras; Saladino, Giorgio; Cossins, Benjamin P; Gervasio, Francesco L

    2016-11-02

    Cryptic pockets, that is, sites on protein targets that only become apparent when drugs bind, provide a promising alternative to classical binding sites for drug development. Here, we investigate the nature and dynamical properties of cryptic sites in four pharmacologically relevant targets, while comparing the efficacy of various simulation-based approaches in discovering them. We find that the studied cryptic sites do not correspond to local minima in the computed conformational free energy landscape of the unliganded proteins. They thus promptly close in all of the molecular dynamics simulations performed, irrespective of the force field used. Temperature-based enhanced sampling approaches, such as Parallel Tempering, do not improve the situation, as the entropic term does not help in the opening of the sites. The use of fragment probes helps, as in long simulations it occasionally leads to the opening of, and binding to, the cryptic sites. Our observed mechanism of cryptic site formation is suggestive of an interplay between two classical mechanisms: induced fit and conformational selection. Employing this insight, we developed a novel Hamiltonian Replica Exchange-based method, "SWISH" (Sampling Water Interfaces through Scaled Hamiltonians), which, combined with probes, resulted in a promising general approach for cryptic site discovery. We also addressed the issue of "false positives" and propose a simple approach to distinguish them from druggable cryptic pockets. Our simulations, whose cumulative sampling time was more than 200 μs, help in clarifying the molecular mechanism of pocket formation, providing a solid basis for the choice of an efficient computational method.
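
    For readers unfamiliar with the replica-exchange machinery mentioned above, the toy below shows the core swap rule of temperature parallel tempering on a one-dimensional double well. It is only a schematic of that generic technique, not of SWISH, which scales protein-water interactions in the Hamiltonian rather than the temperature; the potential and parameters are invented.

    ```python
    # Schematic parallel tempering: neighbour swaps are accepted with
    # probability min(1, exp[(beta_i - beta_j)(E_i - E_j)]).
    import numpy as np

    U = lambda x: (x**2 - 1.0)**2              # double well, minima at x = +/-1
    BETAS = np.array([8.0, 4.0, 2.0, 1.0])     # cold ... hot replicas
    rng = np.random.default_rng(2)
    x = np.full(len(BETAS), -1.0)              # all replicas start in the left well

    for step in range(20000):
        prop = x + rng.normal(0.0, 0.3, len(x))    # Metropolis move per replica
        acc = rng.random(len(x)) < np.exp(np.minimum(0.0, -BETAS * (U(prop) - U(x))))
        x = np.where(acc, prop, x)
        if step % 10 == 0:                         # periodic neighbour-swap attempt
            i = rng.integers(len(BETAS) - 1)
            d = (BETAS[i] - BETAS[i + 1]) * (U(x[i]) - U(x[i + 1]))
            if rng.random() < np.exp(min(0.0, d)):
                x[i], x[i + 1] = x[i + 1], x[i]
    print("cold replica ended near x =", round(float(x[0]), 2))
    ```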

  14. Effects of Computer-Based Simulations Teaching Approach on Students' Achievement in the Learning of Chemistry among Secondary School Students in Nakuru Sub County, Kenya

    ERIC Educational Resources Information Center

    Mihindo, W. Jane; Wachanga, S.W.; Anditi, Z. O.

    2017-01-01

    Science education should help develop students' interest in science, as today's society depends largely on the output of science and technology. Chemistry is one of the branches of science. Chemistry education helps to expand the pupil's knowledge of the universe and of his/her position in it. It helps in the appreciation and enjoyment of nature and…

  15. Effectiveness of Dry Cell Microscopic Simulation (DCMS) to Promote Conceptual Understanding about Battery

    NASA Astrophysics Data System (ADS)

    Catur Wibowo, Firmanul; Suhandi, Andi; Rusdiana, Dadi; Samsudin, Achmad; Rahmi Darman, Dina; Faizin, M. Noor; Wiyanto; Supriyatman; Permanasari, Anna; Kaniawati, Ida; Setiawan, Wawan; Karyanto, Yudi; Linuwih, Suharto; Fatah, Abdul; Subali, Bambang; Hasani, Aceng; Hidayat, Sholeh

    2017-07-01

    Electricity is an abstract concept that is difficult to observe directly: one can feel an electric shock, for example, but cannot see the movement of electric current, so students find the topic difficult. A computer simulation was therefore designed to improve understanding of how the dry cell (battery) works. The study was conducted with 82 students (aged 18-20 years) in an experimental group who learned using the Dry Cell Microscopic Simulation (DCMS). The post-test results show a statistically significant improvement in students' conceptual understanding of how batteries work. The implication is that computer simulations designed to address difficulties in conceptual understanding can effectively help students achieve conceptual change.

  16. On-Track Testing as a Validation Method of Computational Fluid Dynamic Simulations of a Formula SAE Vehicle

    NASA Astrophysics Data System (ADS)

    Weingart, Robert

    This thesis is about the validation of a computational fluid dynamics simulation of a ground vehicle by means of a low-budget coast-down test. The vehicle is built to the standards of the 2014 Formula SAE rules. It is equipped with large wings in the front and rear of the car; the vertical loads on the tires are measured by specifically calibrated shock potentiometers. The coast-down test was performed on a runway of a local airport and is used to determine vehicle-specific coefficients such as drag, downforce, aerodynamic balance, and rolling resistance for different aerodynamic setups. The test results are then compared to the respective simulated results. The drag deviates by about 5% between the simulated and the measured results, while the downforce numbers show deviations of up to 18%. Moreover, a sensitivity analysis of inlet velocities, ride heights, and pitch angles was performed with the help of the computational simulation.
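
    To make the coast-down idea concrete: with a simple longitudinal model m·dv/dt = -(ρ·Cd·A·v²/2 + Crr·m·g), deceleration is linear in v², so a least-squares fit recovers drag and rolling-resistance coefficients. The sketch below uses invented numbers and ignores the downforce-dependent rolling term, so it illustrates the principle only, not the thesis' procedure.

    ```python
    # Fit a = c0 + c1*v^2 to coast-down data, where c1 = -rho*Cd*A/(2m)
    # and c0 = -Crr*g. The "measured" run is faked from assumed true values.
    import numpy as np

    RHO, MASS, AREA, G = 1.20, 300.0, 1.1, 9.81   # air density, mass, frontal area
    Cd_true, Crr_true = 1.45, 0.015               # "unknowns" used to fake the run

    v = np.linspace(30, 5, 40)                    # speeds during the coast, m/s
    a = -(0.5 * RHO * Cd_true * AREA * v**2 / MASS + Crr_true * G)
    a += np.random.default_rng(3).normal(0, 0.02, v.size)   # sensor noise

    c1, c0 = np.polyfit(v**2, a, 1)               # linear least squares in v^2
    print(f"Cd  = {-2 * MASS * c1 / (RHO * AREA):.2f} (true {Cd_true})")
    print(f"Crr = {-c0 / G:.4f} (true {Crr_true})")
    ```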

  17. 50 Years of Army Computing From ENIAC to MSRC

    DTIC Science & Technology

    2000-09-01

    processing capability. The scientific visualization program was started in 1984 to provide tools and expertise to help researchers graphically...and materials, forces modeling, nanoelectronics, electromagnetics and acoustics, signal and image processing, and simulation and modeling. The ARL...mechanical and electrical calculating equipment, punch card data processing equipment, analog computers, and early digital machines. Before beginning, we

  18. Dynamic properties of epidemic spreading on finite size complex networks

    NASA Astrophysics Data System (ADS)

    Li, Ying; Liu, Yang; Shan, Xiu-Ming; Ren, Yong; Jiao, Jian; Qiu, Ben

    2005-11-01

    The Internet presents a complex topological structure, on which computer viruses can easily spread. By using theoretical analysis and computer simulation methods, the dynamic process of disease spreading on finite size networks with complex topological structure is investigated. On finite size networks, the spreading process of the SIS (susceptible-infected-susceptible) model is a finite Markov chain with an absorbing state. Two parameters, the survival probability and the conditional infecting probability, are introduced to describe the dynamic properties of disease spreading on finite size networks. Our results can help in understanding computer virus epidemics and other spreading phenomena on communication and social networks. Also, knowledge about the dynamic character of virus spreading is helpful for designing immunization policies.
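
    A minimal version of the described process is easy to reproduce. The sketch below (all parameters illustrative) runs the SIS chain on a small Erdős-Rényi graph and estimates the survival probability as the fraction of runs not yet absorbed in the all-susceptible state at time T.

    ```python
    # SIS epidemic on a finite random graph; absorption = infection dies out.
    import numpy as np

    rng = np.random.default_rng(4)
    N, P_EDGE, BETA, GAMMA, T = 200, 0.03, 0.06, 0.1, 400
    adj = rng.random((N, N)) < P_EDGE
    adj = np.triu(adj, 1); adj = adj | adj.T       # symmetric, no self-loops

    def run():
        infected = np.zeros(N, dtype=bool); infected[0] = True
        for _ in range(T):
            pressure = adj[:, infected].sum(axis=1)        # infected neighbours
            catch = rng.random(N) < 1 - (1 - BETA) ** pressure
            cure = rng.random(N) < GAMMA
            infected = (infected & ~cure) | (~infected & catch)
            if not infected.any():
                return False                               # absorbed: epidemic died
        return True

    survival = np.mean([run() for _ in range(200)])
    print("estimated survival probability:", survival)
    ```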

  19. Using multi-criteria analysis of simulation models to understand complex biological systems

    Treesearch

    Maureen C. Kennedy; E. David Ford

    2011-01-01

    Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...
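
    The core of the Pareto comparison can be stated in a few lines: a candidate parameter set is kept if no other candidate is at least as good on every criterion and strictly better on at least one. A tiny sketch with placeholder data (not the authors' code):

    ```python
    # Identify the Pareto-optimal candidates among model runs scored on
    # several error criteria (lower is better).
    import numpy as np

    rng = np.random.default_rng(5)
    errors = rng.random((50, 3))               # 50 candidates x 3 output criteria

    def is_dominated(i):
        others = np.delete(errors, i, axis=0)
        return np.any(np.all(others <= errors[i], axis=1) &
                      np.any(others < errors[i], axis=1))

    front = [i for i in range(len(errors)) if not is_dominated(i)]
    print("Pareto-optimal candidates:", front)
    ```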

  20. Advantages of Computer Simulation in Enhancing Students' Learning about Landform Evolution: A Case Study Using the Grand Canyon

    ERIC Educational Resources Information Center

    Luo, Wei; Pelletier, Jon; Duffin, Kirk; Ormand, Carol; Hung, Wei-chen; Shernoff, David J.; Zhai, Xiaoming; Iverson, Ellen; Whalley, Kyle; Gallaher, Courtney; Furness, Walter

    2016-01-01

    The long geological time needed for landform development and evolution poses a challenge for understanding and appreciating the processes involved. The Web-based Interactive Landform Simulation Model--Grand Canyon (WILSIM-GC, http://serc.carleton.edu/landform/) is an educational tool designed to help students better understand such processes,…

  1. Nonadiabatic holonomic quantum computation using Rydberg blockade

    NASA Astrophysics Data System (ADS)

    Kang, Yi-Hao; Chen, Ye-Hong; Shi, Zhi-Cheng; Huang, Bi-Hua; Song, Jie; Xia, Yan

    2018-04-01

    In this paper, we propose a scheme for realizing nonadiabatic holonomic computation assisted by two atoms and the shortcuts to adiabaticity (STA). The blockade effect induced by the strong Rydberg-mediated interaction between two Rydberg atoms makes it possible to simplify the dynamics of the system, and the STA helps us design pulses for implementing the holonomic computation with high fidelity. Numerical simulations show that the scheme is noise immune and decoherence resistant. Therefore, the current scheme may provide some useful perspectives for realizing nonadiabatic holonomic computation.

  2. Noise in Neuronal and Electronic Circuits: A General Modeling Framework and Non-Monte Carlo Simulation Techniques.

    PubMed

    Kilinc, Deniz; Demir, Alper

    2017-08-01

    The brain is extremely energy efficient and remarkably robust in what it does despite the considerable variability and noise caused by the stochastic mechanisms in neurons and synapses. Computational modeling is a powerful tool that can help us gain insight into this important aspect of brain mechanism. A deep understanding and computational design tools can help develop robust neuromorphic electronic circuits and hybrid neuroelectronic systems. In this paper, we present a general modeling framework for biological neuronal circuits that systematically captures the nonstationary stochastic behavior of ion channels and synaptic processes. In this framework, fine-grained, discrete-state, continuous-time Markov chain models of both ion channels and synaptic processes are treated in a unified manner. Our modeling framework features a mechanism for the automatic generation of the corresponding coarse-grained, continuous-state, continuous-time stochastic differential equation models for neuronal variability and noise. Furthermore, we repurpose non-Monte Carlo noise analysis techniques, which were previously developed for analog electronic circuits, for the stochastic characterization of neuronal circuits both in time and frequency domain. We verify that the fast non-Monte Carlo analysis methods produce results with the same accuracy as computationally expensive Monte Carlo simulations. We have implemented the proposed techniques in a prototype simulator, where both biological neuronal and analog electronic circuits can be simulated together in a coupled manner.

  3. Learning Reverse Engineering and Simulation with Design Visualization

    NASA Technical Reports Server (NTRS)

    Hemsworth, Paul J.

    2018-01-01

    The Design Visualization (DV) group supports work at the Kennedy Space Center by utilizing metrology data with Computer-Aided Design (CAD) models and simulations to provide accurate visual representations that aid in decision-making. The capability to measure and simulate objects in real time helps to predict and avoid potential problems before they become expensive in addition to facilitating the planning of operations. I had the opportunity to work on existing and new models and simulations in support of DV and NASA’s Exploration Ground Systems (EGS).

  4. Meet EPA Environmental Engineer Terra Haxton, Ph.D.

    EPA Pesticide Factsheets

    EPA Environmental Engineer Terra Haxton, Ph.D., uses computer simulation models to protect drinking water. She investigates approaches to help water utilities be better prepared to respond to contamination incidents in their distribution systems.

  5. Modeling a Wireless Network for International Space Station

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Yaprak, Ece; Lamouri, Saad

    2000-01-01

    This paper describes the application of wireless local area network (LAN) simulation modeling methods to the hybrid LAN architecture designed for supporting crew-computing tools aboard the International Space Station (ISS). These crew-computing tools, such as wearable computers and portable advisory systems, will provide crew members with real-time vehicle and payload status information and access to digital technical and scientific libraries, significantly enhancing human capabilities in space. A wireless network, therefore, will provide wearable computers and remote instruments with the high performance computational power needed by next-generation 'intelligent' software applications. Wireless network performance in such simulated environments is characterized by the sustainable throughput of data under different traffic conditions. This data will be used to help plan the addition of more access points supporting new modules and more nodes for increased network capacity as the ISS grows.

  6. The viability of ADVANTG deterministic method for synthetic radiography generation

    NASA Astrophysics Data System (ADS)

    Bingham, Andrew; Lee, Hyoung K.

    2018-07-01

    Fast simulation techniques for generating high-resolution synthetic radiographic images are helpful when new radiation imaging systems are designed. However, the standard stochastic approach requires lengthy run times, with poorer statistics at higher resolution. The viability of a deterministic approach to synthetic radiography image generation was therefore explored, with the aim of quantifying the computational time savings over the stochastic method. ADVANTG was compared to MCNP in multiple scenarios, including a small radiography system prototype, to simulate high-resolution radiography images. Using the ADVANTG deterministic code to simulate radiography images, computation time was found to decrease by a factor of 10 to 13 compared with the MCNP stochastic approach while retaining image quality.

  7. Ultrasonic Phased Array Inspection for an Isogrid Structural Element with Cracks

    NASA Technical Reports Server (NTRS)

    Roth, D. J.; Tokars, R. P.; Martin, R. E.; Rauser, R. W.; Aldrin, J. C.; Schumacher, E. J.

    2010-01-01

    In this investigation, a T-shaped aluminum alloy isogrid stiffener element used in aerospace applications was inspected with ultrasonic phased array methods. The isogrid stiffener element had various crack configurations emanating from bolt holes. Computational simulation methods were used to mimic the experiments in order to help understand experimental results. The results of this study indicate that it is at least partly feasible to interrogate this type of geometry with the given flaw configurations using phased array ultrasonics. The simulation methods were critical in helping explain the experimental results and, with some limitation, can be used to predict inspection results.

  8. Application of Psychological Theories in Agent-Based Modeling: The Case of the Theory of Planned Behavior.

    PubMed

    Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo

    2018-01-01

    It is likely that computer simulations will assume a greater role in the near future in investigating and understanding reality (Rand & Rust, 2011). In particular, agent-based models (ABMs) represent a method of investigating social phenomena that blends the knowledge of the social sciences with the advantages of virtual simulations. Within this context, the development of algorithms able to recreate the reasoning engine of autonomous virtual agents is one of the most fragile aspects, and it is crucial to ground such models in well-supported psychological theoretical frameworks. For this reason, the present work discusses the application of the theory of planned behavior (TPB; Ajzen, 1991) in the context of agent-based modeling: It is argued that this framework might be more helpful than others for developing a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to applying the model proposed by the TPB inside computer simulations and suggests potential solutions, with the hope of helping to narrow the distance between the fields of psychology and computer science.
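
    As one hedged illustration of what such an application might look like, the sketch below encodes a common linear reading of the TPB: intention as a weighted sum of attitude, subjective norm, and perceived behavioral control (PBC), with PBC also moderating enactment. The weights and threshold are invented for the example, not drawn from the article.

    ```python
    # A TPB-driven decision rule for an agent in a toy ABM population.
    import random
    from dataclasses import dataclass

    @dataclass
    class Agent:
        attitude: float          # evaluation of the behaviour, 0..1
        norm: float              # perceived social pressure, 0..1
        pbc: float               # perceived behavioural control, 0..1

        def intention(self, w=(0.4, 0.3, 0.3)):
            return w[0] * self.attitude + w[1] * self.norm + w[2] * self.pbc

        def acts(self, threshold=0.5):
            # PBC moderates enactment: strong intention can still fail
            # when perceived control is low.
            return self.intention() > threshold and random.random() < self.pbc

    random.seed(6)
    pop = [Agent(random.random(), random.random(), random.random())
           for _ in range(1000)]
    print("share acting:", sum(a.acts() for a in pop) / len(pop))
    ```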

  9. Evaluating Computer-Based Simulations, Multimedia and Animations that Help Integrate Blended Learning with Lectures in First Year Statistics

    ERIC Educational Resources Information Center

    Neumann, David L.; Neumann, Michelle M.; Hood, Michelle

    2011-01-01

    The discipline of statistics seems well suited to the integration of technology in a lecture as a means to enhance student learning and engagement. Technology can be used to simulate statistical concepts, create interactive learning exercises, and illustrate real world applications of statistics. The present study aimed to better understand the…

  10. Adaptive time steps in trajectory surface hopping simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spörkel, Lasse, E-mail: spoerkel@kofo.mpg.de; Thiel, Walter, E-mail: thiel@kofo.mpg.de

    2016-05-21

    Trajectory surface hopping (TSH) simulations are often performed in combination with active-space multi-reference configuration interaction (MRCI) treatments. Technical problems may arise in such simulations if active and inactive orbitals strongly mix and switch in some particular regions. We propose to use adaptive time steps when such regions are encountered in TSH simulations. For this purpose, we present a computational protocol that is easy to implement and increases the computational effort only in the critical regions. We test this procedure through TSH simulations of a GFP chromophore model (OHBI) and a light-driven rotary molecular motor (F-NAIBP) on semiempirical MRCI potential energy surfaces, by comparing the results from simulations with adaptive time steps to analogous ones with constant time steps. For both test molecules, the number of successful trajectories without technical failures rises significantly, from 53% to 95% for OHBI and from 25% to 96% for F-NAIBP. The computed excited-state lifetime remains essentially the same for OHBI and increases somewhat for F-NAIBP, and there is almost no change in the computed quantum efficiency for internal rotation in F-NAIBP. We recommend the general use of adaptive time steps in TSH simulations with active-space CI methods because this will help to avoid technical problems, increase the overall efficiency and robustness of the simulations, and allow for a more complete sampling.

  11. Adaptive time steps in trajectory surface hopping simulations

    NASA Astrophysics Data System (ADS)

    Spörkel, Lasse; Thiel, Walter

    2016-05-01

    Trajectory surface hopping (TSH) simulations are often performed in combination with active-space multi-reference configuration interaction (MRCI) treatments. Technical problems may arise in such simulations if active and inactive orbitals strongly mix and switch in some particular regions. We propose to use adaptive time steps when such regions are encountered in TSH simulations. For this purpose, we present a computational protocol that is easy to implement and increases the computational effort only in the critical regions. We test this procedure through TSH simulations of a GFP chromophore model (OHBI) and a light-driven rotary molecular motor (F-NAIBP) on semiempirical MRCI potential energy surfaces, by comparing the results from simulations with adaptive time steps to analogous ones with constant time steps. For both test molecules, the number of successful trajectories without technical failures rises significantly, from 53% to 95% for OHBI and from 25% to 96% for F-NAIBP. The computed excited-state lifetime remains essentially the same for OHBI and increases somewhat for F-NAIBP, and there is almost no change in the computed quantum efficiency for internal rotation in F-NAIBP. We recommend the general use of adaptive time steps in TSH simulations with active-space CI methods because this will help to avoid technical problems, increase the overall efficiency and robustness of the simulations, and allow for a more complete sampling.
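
    The protocol itself is tied to active-space MRCI surfaces, but the generic adaptive-step idea can be shown on a model ODE: when a local error diagnostic exceeds a tolerance, the step is retried at half size and regrown once the critical region is passed. A sketch under these simplifying assumptions (forward Euler with step doubling as the diagnostic; not the authors' protocol):

    ```python
    # Adaptive time stepping on y' = -10*y: compare one full step against two
    # half steps; shrink dt when they disagree, regrow it when they agree.
    import math

    def f(t, y):
        return -10.0 * y

    t, y, dt, TOL = 0.0, 1.0, 0.1, 1e-4
    while t < 1.0:
        h = min(dt, 1.0 - t)                               # don't overshoot t = 1
        full = y + h * f(t, y)                             # one Euler step
        half = y + h / 2 * f(t, y)
        two_half = half + h / 2 * f(t + h / 2, half)       # two half steps
        if abs(full - two_half) > TOL:                     # trouble detected:
            dt = h / 2                                     # shrink and retry
            continue
        t, y, dt = t + h, two_half, min(dt * 1.3, 0.1)     # regrow when safe
    print(f"y(1) ~ {y:.5f}  (exact {math.exp(-10):.5f})")
    ```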

  12. How the Geothermal Community Upped the Game for Computer Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The Geothermal Technologies Office Code Comparison Study brought 11 research institutions together to collaborate on coupled thermal, hydrologic, geomechanical, and geochemical numerical simulators. These codes have the potential to help facilitate widespread geothermal energy development.

  13. Demonstrating Newton's Third Law: Changing Aristotelian Viewpoints.

    ERIC Educational Resources Information Center

    Roach, Linda E.

    1992-01-01

    Suggests techniques to help eliminate students' misconceptions involving Newton's Third Law. Approaches suggested include teaching physics from a historical perspective, using computer programs with simulations, rewording the law, drawing free-body diagrams, and using demonstrations and examples. (PR)

  14. Computer Simulation Is an Undervalued Tool for Genetic Analysis: A Historical View and Presentation of SHIMSHON – A Web-Based Genetic Simulation Package

    PubMed Central

    Greenberg, David A.

    2011-01-01

    Computer simulation methods are under-used tools in genetic analysis because simulation approaches have been portrayed as inferior to analytic methods. Even when simulation is used, its advantages are not fully exploited. Here, I present SHIMSHON, our package of genetic simulation programs that have been developed, tested, used for research, and used to generate data for Genetic Analysis Workshops (GAW). These simulation programs, now web-accessible, can be used by anyone to answer questions about designing and analyzing genetic disease studies for locus identification. This work has three foci: (1) the historical context of SHIMSHON's development, suggesting why simulation has not been more widely used so far. (2) Advantages of simulation: computer simulation helps us to understand how genetic analysis methods work. It has advantages for understanding disease inheritance and methods for gene searches. Furthermore, simulation methods can be used to answer fundamental questions that either cannot be answered by analytical approaches or cannot even be defined until the problems are identified and studied, using simulation. (3) I argue that, because simulation was not accepted, there was a failure to grasp the meaning of some simulation-based studies of linkage. This may have contributed to perceived weaknesses in linkage analysis; weaknesses that did not, in fact, exist. PMID:22189467

  15. Activation pathway of Src kinase reveals intermediate states as novel targets for drug design

    PubMed Central

    Shukla, Diwakar; Meng, Yilin; Roux, Benoît; Pande, Vijay S.

    2014-01-01

    Unregulated activation of Src kinases leads to aberrant signaling, uncontrolled growth, and differentiation of cancerous cells. Reaching a complete mechanistic understanding of large scale conformational transformations underlying the activation of kinases could greatly help in the development of therapeutic drugs for the treatment of these pathologies. In principle, the nature of conformational transition could be modeled in silico via atomistic molecular dynamics simulations, although this is very challenging due to the long activation timescales. Here, we employ a computational paradigm that couples transition pathway techniques and Markov state model-based massively distributed simulations for mapping the conformational landscape of c-src tyrosine kinase. The computations provide the thermodynamics and kinetics of kinase activation for the first time, and help identify key structural intermediates. Furthermore, the presence of a novel allosteric site in an intermediate state of c-src that could be potentially utilized for drug design is predicted. PMID:24584478
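
    As a toy illustration of the Markov-state-model machinery referenced above (not the paper's pipeline), the sketch below counts transitions in a synthetic three-state trajectory, row-normalizes them into a transition matrix, and extracts the slowest implied timescale from its second eigenvalue.

    ```python
    # Markov state model basics: count transitions at a lag time, estimate the
    # transition matrix, and compute the slowest implied timescale.
    import numpy as np

    rng = np.random.default_rng(7)
    T_true = np.array([[0.95, 0.04, 0.01],     # inactive
                       [0.05, 0.90, 0.05],     # intermediate
                       [0.01, 0.04, 0.95]])    # active
    traj = [0]
    for _ in range(50000):                     # synthetic discrete trajectory
        traj.append(rng.choice(3, p=T_true[traj[-1]]))

    LAG = 1
    counts = np.zeros((3, 3))
    for a, b in zip(traj[:-LAG], traj[LAG:]):  # transition counting
        counts[a, b] += 1
    T_est = counts / counts.sum(axis=1, keepdims=True)

    eig = np.sort(np.abs(np.linalg.eigvals(T_est)))[::-1]
    print("slowest implied timescale:", -LAG / np.log(eig[1]), "steps")
    ```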

  16. Multi-scale computation methods: Their applications in lithium-ion battery research and development

    NASA Astrophysics Data System (ADS)

    Siqi, Shi; Jian, Gao; Yue, Liu; Yan, Zhao; Qu, Wu; Wangwei, Ju; Chuying, Ouyang; Ruijuan, Xiao

    2016-01-01

    Based upon advances in theoretical algorithms, modeling and simulations, and computer technologies, the rational design of materials, cells, devices, and packs in the field of lithium-ion batteries is being realized incrementally and will at some point trigger a paradigm revolution by combining calculations and experiments linked by a big shared database, enabling accelerated development of the whole industrial chain. Theory and multi-scale modeling and simulation, as supplements to experimental efforts, can help greatly to close some of the current experimental and technological gaps, as well as predict path-independent properties and help to fundamentally understand path-independent performance on multiple spatial and temporal scales. Project supported by the National Natural Science Foundation of China (Grant Nos. 51372228 and 11234013), the National High Technology Research and Development Program of China (Grant No. 2015AA034201), and Shanghai Pujiang Program, China (Grant No. 14PJ1403900).

  17. The transesophageal echocardiography simulator based on computed tomography images.

    PubMed

    Piórkowski, Adam; Kempny, Aleksander

    2013-02-01

    Simulators are a new tool in education in many fields, including medicine, where they greatly improve familiarity with medical procedures, reduce costs, and, importantly, cause no harm to patients. This is so in the case of transesophageal echocardiography (TEE), in which the use of a simulator facilitates spatial orientation and helps in case studies. The aim of the project described in this paper is to simulate an examination by TEE. This research makes use of available computed tomography data to simulate the corresponding echocardiographic view. This paper describes the essential characteristics that distinguish these two modalities and the key principles of the wave phenomena that should be considered in the simulation process, taking into account the conditions specific to echocardiography. The construction of the CT2TEE (Web-based TEE simulator) is also presented. The considerations include ray-tracing and ray-casting techniques in the context of ultrasound beam and artifact simulation. Important aspects of the interaction with the user are also discussed.
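
    A bare-bones sketch of the ray-casting component only (CT2TEE also models the wave phenomena and artifacts discussed above, which this toy omits): march a ray through a stand-in CT volume, emitting an attenuated echo per sample. The volume and coefficients are placeholders.

    ```python
    # Toy ray casting through a volume: reflection from local intensity,
    # transmission attenuated with depth.
    import numpy as np

    rng = np.random.default_rng(8)
    ct = rng.random((64, 64, 64))              # stand-in for a Hounsfield volume

    def cast_ray(origin, direction, n_steps=100, step=0.8, atten=0.02):
        pos = np.array(origin, float)
        d = np.array(direction, float); d /= np.linalg.norm(d)
        echo, transmitted = [], 1.0
        for _ in range(n_steps):
            i, j, k = np.clip(pos.astype(int), 0, 63)
            echo.append(transmitted * ct[i, j, k])      # echo ~ local "impedance"
            transmitted *= 1.0 - atten * ct[i, j, k]    # beam loses energy
            pos += step * d
        return np.array(echo)                           # one scanline of the fan

    line = cast_ray(origin=(32, 32, 0), direction=(0, 0, 1))
    print(line.shape, line[:5].round(3))
    ```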

  18. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and of modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and to address a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry also has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  19. Multi-d CFD Modeling of a Free-piston Stirling Convertor at NASA Glenn

    NASA Technical Reports Server (NTRS)

    Wilson, Scott D.; Dyson, Rodger W.; Tew, Roy C.; Ibrahim, Mounir B.

    2004-01-01

    A high efficiency Stirling Radioisotope Generator (SRG) is being developed for possible use in long duration space science missions. NASA's advanced technology goals for next generation Stirling convertors include increasing the Carnot efficiency and the percentage of Carnot efficiency achieved. To help achieve these goals, a multidimensional Computational Fluid Dynamics (CFD) code is being developed to numerically model unsteady fluid flow and heat transfer phenomena of the oscillating working gas inside Stirling convertors. Simulations of the Stirling convertors for the SRG will help characterize the thermodynamic losses resulting from fluid flow and heat transfer between the working gas and solid walls. The current CFD simulation represents an approximate two-dimensional convertor geometry. The simulation solves the Navier-Stokes equations for an ideal helium gas oscillating at low speeds. The current simulation results are discussed.

  20. High-order hydrodynamic algorithms for exascale computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, Nathaniel Ray

    Hydrodynamic algorithms are at the core of many laboratory missions ranging from simulating ICF implosions to climate modeling. The hydrodynamic algorithms commonly employed at the laboratory and in industry (1) typically lack requisite accuracy for complex multi-material vortical flows and (2) are not well suited for exascale computing due to poor data locality and poor FLOP/memory ratios. Exascale computing requires advances in both computer science and numerical algorithms. We propose to research the second requirement and create a new high-order hydrodynamic algorithm that has superior accuracy, excellent data locality, and excellent FLOP/memory ratios. This proposal will impact a broad range of research areas including numerical theory, discrete mathematics, vorticity evolution, gas dynamics, interface instability evolution, turbulent flows, fluid dynamics and shock driven flows. If successful, the proposed research has the potential to radically transform simulation capabilities and help position the laboratory for computing at the exascale.

  1. Development of the KOSMS management simulation training system and its application

    NASA Astrophysics Data System (ADS)

    Takatsu, Yoshiki

    The use of games which simulate actual corporate management has recently become more common, and such games are now utilized in various ways for in-house corporate training courses. KOSMS (Kobe Steel Management Simulation System), a training system designed to help improve the management skills of senior management staff, is a unique management simulation training system in which the participants, using personal computers, must make decisions concerning a variety of management activities, in simulated competition with other corporations. This report outlines the KOSMS system, describes the basic structure and detailed contents of the management simulation models, and presents an actual application of KOSMS management simulation training.

  2. Pain Assessment and Management in Nursing Education Using Computer-based Simulations.

    PubMed

    Romero-Hall, Enilda

    2015-08-01

    It is very important for nurses to have a clear understanding of the patient's pain experience and of management strategies. However, a review of the nursing literature shows that one of the main barriers to proper pain management practice is lack of knowledge. Nursing schools are in a unique position to address the gap in pain management knowledge by facilitating the acquisition and use of knowledge by the next generation of nurses. The purpose of this article is to discuss the role of computer-based simulations as a reliable educational technology strategy that can enhance the learning experience of nursing students acquiring pain management knowledge and practice. Computer-based simulations provide a significant number of learning affordances that can help change nursing students' attitudes and behaviors toward and practice of pain assessment and management. Copyright © 2015 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  3. Modeling aspects of human memory for scientific study.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caudell, Thomas P.; Watson, Patrick; McDaniel, Mark A.

    Working with leading experts in the field of cognitive neuroscience and computational intelligence, SNL has developed a computational architecture that represents neurocognitive mechanisms associated with how humans remember experiences in their past. The architecture represents how knowledge is organized and updated through information from individual experiences (episodes) via the cortical-hippocampal declarative memory system. We compared the simulated behavioral characteristics with those of humans measured under well established experimental standards, controlling for unmodeled aspects of human processing, such as perception. We used this knowledge to create robust simulations of human memory behaviors that should help move the scientific community closer to understanding how humans remember information. These behaviors were experimentally validated against actual human subjects, and the results were published. An important outcome of the validation process will be the joining of specific experimental testing procedures from the field of neuroscience with computational representations from the field of cognitive modeling and simulation.

  4. Breaking the Supermassive Black Hole Speed Limit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smidt, Joseph

    A new computer simulation helps explain the existence of puzzling supermassive black holes observed in the early universe. The simulation is based on a computer code used to understand the coupling of radiation and certain materials. “Supermassive black holes have a speed limit that governs how fast and how large they can grow,” said Joseph Smidt of the Theoretical Design Division at Los Alamos National Laboratory. “The relatively recent discovery of supermassive black holes in the early development of the universe raised a fundamental question, how did they get so big so fast?” Using computer codes developed at Los Alamos for modeling the interaction of matter and radiation related to the Lab’s stockpile stewardship mission, Smidt and colleagues created a simulation of collapsing stars that resulted in supermassive black holes forming in less time than expected, cosmologically speaking, in the first billion years of the universe.

  5. Condor-COPASI: high-throughput computing for biochemical networks

    PubMed Central

    2012-01-01

    Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945

  6. Use of computer modeling to investigate a dynamic interaction problem in the Skylab TACS quad-valve package

    NASA Technical Reports Server (NTRS)

    Hesser, R. J.; Gershman, R.

    1975-01-01

    A valve opening-response problem encountered during development of a control valve for the Skylab thruster attitude control system (TACS) is described. The problem involved effects of dynamic interaction among valves in the quad-redundant valve package. Also described is a detailed computer simulation of the quad-valve package which was helpful in resolving the problem.

  7. Optical 1's and 2's complement devices using lithium-niobate-based waveguide

    NASA Astrophysics Data System (ADS)

    Pal, Amrindra; Kumar, Santosh; Sharma, Sandeep

    2016-12-01

    Optical 1's and 2's complement devices are proposed with the help of lithium-niobate-based Mach-Zehnder interferometers. These devices can switch an optical signal from one port to the other with the help of an electrical control signal. The paper presents the optical conversion scheme using sets of optical switches. 2's complement representation is common in computer systems, where it is used for binary subtraction and logical manipulation. The operation of the circuits is studied theoretically and analyzed through numerical simulations. The truth tables of these complement methods are verified with the beam propagation method and MATLAB® simulation results.
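
    For reference, the arithmetic that the proposed device realizes optically is simple to state in software; a short sketch of n-bit 1's and 2's complement:

    ```python
    # 1's complement flips every bit; 2's complement flips and adds one
    # (modulo 2^bits), giving the negative in two's-complement arithmetic.
    def ones_complement(x, bits=8):
        return x ^ ((1 << bits) - 1)

    def twos_complement(x, bits=8):
        return (ones_complement(x, bits) + 1) % (1 << bits)

    x = 0b00101101                           # 45
    print(f"{ones_complement(x):08b}")       # 11010010
    print(f"{twos_complement(x):08b}")       # 11010011 == -45 in 8-bit two's compl.
    ```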

  8. Hypersonic Combustor Model Inlet CFD Simulations and Experimental Comparisons

    NASA Technical Reports Server (NTRS)

    Venkatapathy, E.; TokarcikPolsky, S.; Deiwert, G. S.; Edwards, Thomas A. (Technical Monitor)

    1995-01-01

    Numerous two- and three-dimensional computational simulations were performed for the inlet associated with the combustor model for the hypersonic propulsion experiment in the NASA Ames 16-Inch Shock Tunnel. The inlet was designed to produce a combustor-inlet flow that is nearly two-dimensional and of sufficient mass flow rate for large scale combustor testing. The three-dimensional simulations demonstrated that the inlet design met all the design objectives and that the inlet produced a very nearly two-dimensional combustor inflow profile. Numerous two-dimensional simulations were performed with various levels of approximation, such as in the choice of chemical and physical models, as well as numerical approximations. Parametric studies were conducted to better understand and to characterize the inlet flow. Results from the two- and three-dimensional simulations were used to predict the mass flux entering the combustor, and a mass flux correlation as a function of facility stagnation pressure was developed. Surface heat flux and pressure measurements were compared with the computed results and good agreement was found. The computational simulations helped determine the inlet flow characteristics in the high enthalpy environment, the important parameters that affect the combustor-inlet flow, and the sensitivity of the inlet flow to various modeling assumptions.

  9. Simulating smokers' acceptance of modifications in a cessation program.

    PubMed Central

    Spoth, R

    1992-01-01

    Recent research has underscored the importance of assessing barriers to smokers' acceptance of cessation programs. This paper illustrates the use of computer simulations to gauge smokers' response to program modifications which may produce barriers to participation. It also highlights methodological issues encountered in conducting this work. Computer simulations were based on conjoint analysis, a consumer research method which enables measurement of smokers' relative preference for various modifications of cessation programs. Results from two studies are presented in this paper. The primary study used a randomly selected sample of 218 adult smokers who participated in a computer-assisted phone interview. Initially, the study assessed smokers' relative utility rating of 30 features of cessation programs. Utility data were used in computer-simulated comparisons of a low-cost, self-help oriented program under development and five other existing programs. A baseline version of the program under development and two modifications (for example, use of a support group with a higher level of cost) were simulated. Both the baseline version and the modifications received a favorable response vis-à-vis comparison programs. Modifications requiring higher program costs were, however, associated with moderately reduced levels of favorable consumer response. The second study used a sample of 70 smokers who responded to an expanded set of smoking cessation program features focusing on program packaging. This secondary study incorporated in-person, computer-assisted interviews at a shopping mall, with smokers viewing an artist's mock-up of various program options on display. A similar pattern of responses to simulated program modifications emerged, with monetary cost apparently playing a key role. The significance of conjoint-based computer simulation as a tool in program development or dissemination, salient methodological issues, and implications for further research are discussed. PMID:1738813

  10. Simulating smokers' acceptance of modifications in a cessation program.

    PubMed

    Spoth, R

    1992-01-01

    Recent research has underscored the importance of assessing barriers to smokers' acceptance of cessation programs. This paper illustrates the use of computer simulations to gauge smokers' response to program modifications which may produce barriers to participation. It also highlights methodological issues encountered in conducting this work. Computer simulations were based on conjoint analysis, a consumer research method which enables measurement of smokers' relative preference for various modifications of cessation programs. Results from two studies are presented in this paper. The primary study used a randomly selected sample of 218 adult smokers who participated in a computer-assisted phone interview. Initially, the study assessed smokers' relative utility rating of 30 features of cessation programs. Utility data were used in computer-simulated comparisons of a low-cost, self-help oriented program under development and five other existing programs. A baseline version of the program under development and two modifications (for example, use of a support group with a higher level of cost) were simulated. Both the baseline version and the modifications received a favorable response vis-à-vis comparison programs. Modifications requiring higher program costs were, however, associated with moderately reduced levels of favorable consumer response. The second study used a sample of 70 smokers who responded to an expanded set of smoking cessation program features focusing on program packaging. This secondary study incorporated in-person, computer-assisted interviews at a shopping mall, with smokers viewing an artist's mock-up of various program options on display. A similar pattern of responses to simulated program modifications emerged, with monetary cost apparently playing a key role. The significance of conjoint-based computer simulation as a tool in program development or dissemination, salient methodological issues, and implications for further research are discussed.
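
    The conjoint-based simulation logic described in these two records can be sketched compactly: each respondent's part-worth utilities are summed over a program's features, and a first-choice rule turns the utilities into a simulated share of preference. All numbers below are fabricated placeholders, not the study's data.

    ```python
    # Conjoint share-of-preference simulation with invented part-worths.
    import numpy as np

    rng = np.random.default_rng(9)
    N_SMOKERS, N_FEATURES = 218, 30
    partworths = rng.normal(0, 1, (N_SMOKERS, N_FEATURES))  # "from the interviews"

    baseline = np.zeros(N_FEATURES); baseline[[0, 5, 9]] = 1       # self-help, low cost
    modified = np.zeros(N_FEATURES); modified[[0, 5, 12, 20]] = 1  # + group, higher cost

    utils = np.stack([partworths @ baseline, partworths @ modified])
    choice = utils.argmax(axis=0)             # first-choice rule per respondent
    print("share preferring modification:", (choice == 1).mean())
    ```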

  11. Advanced Techniques for Simulating the Behavior of Sand

    NASA Astrophysics Data System (ADS)

    Clothier, M.; Bailey, M.

    2009-12-01

    Computer graphics and visualization techniques continue to provide untapped research opportunities, particularly when working with earth science disciplines. Through collaboration with the Oregon Space Grant and IGERT Ecosystem Informatics programs, we are developing new techniques for simulating sand. This collaboration has also put us in communication with the Jet Propulsion Laboratory (JPL), allowing us to exchange ideas and gain feedback on our work. More specifically, JPL's DARTS Laboratory specializes in planetary vehicle simulation, such as the Mars rovers. This simulation utilizes a virtual "sand box" to test how planetary rovers respond to different terrains while traversing them. Unfortunately, this simulation is unable to fully mimic the harsh, sandy environments found on Mars. Ideally, these simulations should allow a rover to interact with the sand beneath it, particularly for different sand granularities and densities. In particular, there may be situations where a rover becomes stuck in sand due to a lack of friction between the sand and its wheels. In fact, the Spirit rover became stuck in the Martian sand in May 2009, providing additional motivation for this research. High performance computing will play a very important role in developing a new sand simulation model. More specifically, graphics processing units (GPUs) are useful due to their ability to run general purpose algorithms and to perform massively parallel computations. In prior research, simulating vast quantities of sand in real time has been difficult due to the computational complexity of many colliding particles. With the use of GPUs, however, particle collisions can be parallelized, allowing for a dramatic performance increase. In addition, spatial partitioning provides a further speed boost by limiting the number of particle collision calculations. However, since the goal of this research is to simulate the look and behavior of sand, this work will go beyond simple particle collision. In particular, we can apply our parallel algorithms not only to single particles but also to particle “clumps” that consist of multiple combined particles. Since sand is typically not spherical in nature, these clumps help to simulate the coarse, polygonal, and granular nature of sand grains, so that a diversity of sand particles can be generated and the interactions between them parallelized using GPU hardware. As such, this research will investigate different graphics and physics techniques and determine the tradeoffs in performance and visual quality for sand simulation. An enhanced sand model built on high performance computing and GPUs has great potential to impact research for both earth and space scientists. Interaction with JPL has provided an opportunity for us to refine our simulation techniques, which can ultimately be used for their vehicle simulator. As an added benefit of this work, advancements in simulating sand can also benefit scientists here on earth, especially in regard to understanding landslides and debris flows.
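
    The spatial-partitioning step mentioned above is the easiest part to make concrete. The sketch below is a CPU illustration (the GPU variant parallelizes over cells or pairs): hash particles into grid cells sized to the particle diameter, then test only pairs that share a neighbouring cell. Positions and radius are placeholders.

    ```python
    # Spatial-hash broad phase: with cell size = particle diameter, any pair
    # in contact must lie in the same or an adjacent cell, so only the 3x3
    # neighbourhood needs pairwise distance tests.
    from collections import defaultdict
    from itertools import product

    import numpy as np

    rng = np.random.default_rng(10)
    R = 0.01                                       # particle radius
    pts = rng.random((5000, 2))                    # positions in a unit box

    cells = defaultdict(list)
    for idx, p in enumerate(pts):
        cells[tuple((p // (2 * R)).astype(int))].append(idx)

    contacts = 0
    for (cx, cy), members in cells.items():
        for dx, dy in product((-1, 0, 1), repeat=2):
            for j in cells.get((cx + dx, cy + dy), ()):
                for i in members:
                    if i < j and np.sum((pts[i] - pts[j])**2) < (2 * R)**2:
                        contacts += 1                  # each pair counted once
    print("colliding pairs:", contacts)
    ```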

  12. Investigating AI with BASIC and Logo: Helping the Computer to Understand INPUTS.

    ERIC Educational Resources Information Center

    Mandell, Alan; Lucking, Robert

    1988-01-01

    Investigates using the microcomputer to develop a sentence parser to simulate intelligent conversation used in artificial intelligence applications. Compares the ability of LOGO and BASIC for this use. Lists and critiques several LOGO and BASIC parser programs. (MVL)

  13. Computational techniques for ECG analysis and interpretation in light of their contribution to medical advances

    PubMed Central

    Mincholé, Ana; Martínez, Juan Pablo; Laguna, Pablo; Rodriguez, Blanca

    2018-01-01

    Widely developed for clinical screening, electrocardiogram (ECG) recordings capture the cardiac electrical activity from the body surface. ECG analysis can therefore be a crucial first step to help diagnose, understand and predict cardiovascular disorders responsible for 30% of deaths worldwide. Computational techniques, and more specifically machine learning techniques and computational modelling are powerful tools for classification, clustering and simulation, and they have recently been applied to address the analysis of medical data, especially ECG data. This review describes the computational methods in use for ECG analysis, with a focus on machine learning and 3D computer simulations, as well as their accuracy, clinical implications and contributions to medical advances. The first section focuses on heartbeat classification and the techniques developed to extract and classify abnormal from regular beats. The second section focuses on patient diagnosis from whole recordings, applied to different diseases. The third section presents real-time diagnosis and applications to wearable devices. The fourth section highlights the recent field of personalized ECG computer simulations and their interpretation. Finally, the discussion section outlines the challenges of ECG analysis and provides a critical assessment of the methods presented. The computational methods reported in this review are a strong asset for medical discoveries and their translation to the clinical world may lead to promising advances. PMID:29321268
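
    As a deliberately tiny illustration of the heartbeat-classification step described above, the sketch below uses synthetic features (not real ECG data) and a nearest-centroid rule standing in for the machine learning methods the review surveys.

    ```python
    # Classify beats by two simple features (RR interval, QRS width) with a
    # nearest-centroid rule; all numbers are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(11)
    normal = rng.normal([0.80, 0.09], 0.02, (100, 2))    # RR ~0.8 s, narrow QRS
    ectopic = rng.normal([0.55, 0.14], 0.02, (100, 2))   # premature, wide QRS
    X = np.vstack([normal, ectopic])
    y = np.array([0] * 100 + [1] * 100)

    centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

    def classify(beat):
        return int(np.argmin(((centroids - beat) ** 2).sum(axis=1)))

    test = np.array([0.58, 0.13])
    print("beat class:", "ectopic" if classify(test) else "normal")
    ```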

  14. DSMC Simulations of Hypersonic Flows and Comparison With Experiments

    NASA Technical Reports Server (NTRS)

    Moss, James N.; Bird, Graeme A.; Markelov, Gennady N.

    2004-01-01

    This paper presents computational results obtained with the direct simulation Monte Carlo (DSMC) method for several biconic test cases in which shock interactions and flow separation-reattachment are key features of the flow. Recent ground-based experiments have been performed for several biconic configurations, and surface heating rate and pressure measurements have been proposed for code validation studies. The present focus is to expand on the current validation activities for a relatively new DSMC code called DS2V that Bird (second author) has developed. Comparisons with experiments and other computations help clarify the agreement currently being achieved between computations and experiments and identify the range of measurement variability of the proposed validation data when benchmarked with respect to the current computations. For the test cases with significant vibrational nonequilibrium, the effect of the vibrational energy surface accommodation on heating and other quantities is demonstrated.

  15. Using high-fidelity computational fluid dynamics to help design a wind turbine wake measurement experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Churchfield, M.; Wang, Q.; Scholbrock, A.

    Here, we describe the process of using large-eddy simulations of wind turbine wake flow to help design a wake measurement campaign. The main goal of the experiment is to measure wakes and wake deflection that result from intentional yaw misalignment under a variety of atmospheric conditions at the Scaled Wind Farm Technology facility operated by Sandia National Laboratories in Lubbock, Texas. Prior simulation studies have shown that wake deflection may be used for wind-plant control that maximizes plant power output. In this study, simulations are performed to characterize wake deflection and general behavior before the experiment is performed to ensure better upfront planning. Beyond characterizing the expected wake behavior, we also use the large-eddy simulation to test a virtual version of the lidar we plan to use to measure the wake and better understand our lidar scan strategy options. This work is an excellent example of a 'simulation-in-the-loop' measurement campaign.

  16. Using High-Fidelity Computational Fluid Dynamics to Help Design a Wind Turbine Wake Measurement Experiment

    NASA Astrophysics Data System (ADS)

    Churchfield, M.; Wang, Q.; Scholbrock, A.; Herges, T.; Mikkelsen, T.; Sjöholm, M.

    2016-09-01

    We describe the process of using large-eddy simulations of wind turbine wake flow to help design a wake measurement campaign. The main goal of the experiment is to measure wakes and wake deflection that result from intentional yaw misalignment under a variety of atmospheric conditions at the Scaled Wind Farm Technology facility operated by Sandia National Laboratories in Lubbock, Texas. Prior simulation studies have shown that wake deflection may be used for wind-plant control that maximizes plant power output. In this study, simulations are performed to characterize wake deflection and general behavior before the experiment is performed to ensure better upfront planning. Beyond characterizing the expected wake behavior, we also use the large-eddy simulation to test a virtual version of the lidar we plan to use to measure the wake and better understand our lidar scan strategy options. This work is an excellent example of a “simulation-in-the-loop” measurement campaign.

  17. Using high-fidelity computational fluid dynamics to help design a wind turbine wake measurement experiment

    DOE PAGES

    Churchfield, M.; Wang, Q.; Scholbrock, A.; ...

    2016-10-03

    Here, we describe the process of using large-eddy simulations of wind turbine wake flow to help design a wake measurement campaign. The main goal of the experiment is to measure wakes and wake deflection that result from intentional yaw misalignment under a variety of atmospheric conditions at the Scaled Wind Farm Technology facility operated by Sandia National Laboratories in Lubbock, Texas. Prior simulation studies have shown that wake deflection may be used for wind-plant control that maximizes plant power output. In this study, simulations are performed to characterize wake deflection and general behavior before the experiment is performed to ensure better upfront planning. Beyond characterizing the expected wake behavior, we also use the large-eddy simulation to test a virtual version of the lidar we plan to use to measure the wake and better understand our lidar scan strategy options. This work is an excellent example of a 'simulation-in-the-loop' measurement campaign.
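
    As a rough illustration of the "virtual lidar" idea described in these three records, the sketch below samples a synthetic, wake-like velocity field along a beam and records a line-of-sight velocity per range gate. The field, scan geometry, and all parameters are invented for illustration and are not the SWiFT configuration or the authors' code.

    ```python
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Synthetic stand-in for an LES snapshot: u(x, y, z) with a deflected wake deficit.
    x = np.linspace(0.0, 1000.0, 101)   # m, streamwise
    y = np.linspace(-200.0, 200.0, 41)  # m, cross-stream
    z = np.linspace(0.0, 200.0, 21)     # m, vertical
    X, Y, Z = np.meshgrid(x, y, z, indexing="ij")
    u = 8.0 - 3.0 * np.exp(-((Y - 0.02 * X) ** 2) / (2 * 40.0**2))  # toy wake
    v = np.zeros_like(u)
    w = np.zeros_like(u)

    interp_u = RegularGridInterpolator((x, y, z), u)
    interp_v = RegularGridInterpolator((x, y, z), v)
    interp_w = RegularGridInterpolator((x, y, z), w)

    def line_of_sight_velocity(origin, direction, ranges):
        """Project the velocity field onto the beam direction at each range gate."""
        d = np.asarray(direction, dtype=float)
        d /= np.linalg.norm(d)
        pts = np.asarray(origin) + ranges[:, None] * d
        vel = np.stack([interp_u(pts), interp_v(pts), interp_w(pts)], axis=1)
        return vel @ d  # radial (line-of-sight) velocity per gate

    # One horizontal scan: sweep the beam across the wake at hub height.
    gates = np.linspace(50.0, 900.0, 40)
    for az in np.deg2rad(np.linspace(-10, 10, 5)):
        vlos = line_of_sight_velocity((0.0, 0.0, 100.0),
                                      (np.cos(az), np.sin(az), 0.0), gates)
        print(f"azimuth {np.rad2deg(az):+5.1f} deg: min LOS velocity {vlos.min():.2f} m/s")
    ```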

  18. Computer Simulation Shows the Effect of Communication on Day of Surgery Patient Flow.

    PubMed

    Taaffe, Kevin; Fredendall, Lawrence; Huynh, Nathan; Franklin, Jennifer

    2015-07-01

    To improve patient flow in a surgical environment, practitioners and academicians often use process mapping and simulation as tools to evaluate and recommend changes. We used simulations to help staff visualize the effect of communication and coordination delays that occur on the day of surgery. Perioperative services staff participated in tabletop exercises in which they chose the delays that were most important to eliminate. Using a day-of-surgery computer simulation model, the elimination of delays was tested and the results were shared with the group. This exercise, repeated for multiple groups of staff, provided an understanding of not only the dynamic events taking place, but also how small communication delays can contribute to a significant loss in efficiency and the ability to provide timely care. Survey results confirmed these understandings. Copyright © 2015 AORN, Inc. Published by Elsevier Inc. All rights reserved.
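
    The compounding effect of small communication delays can be reproduced with a few lines of discrete-event simulation. The sketch below, using the SimPy library, is a minimal stand-in for the kind of day-of-surgery model described above; the case schedule, turnover time, and delay values are hypothetical, not the study's data.

    ```python
    import simpy

    CASE_DURATIONS = [90, 60, 120, 75]  # minutes, hypothetical schedule for one OR
    TURNOVER = 25                        # minutes to clean and set up the room
    COMM_DELAY = 10                      # minutes waiting for a "room ready" notification

    def run_day(env, comm_delay, finish_times):
        for dur in CASE_DURATIONS:
            yield env.timeout(dur)         # surgery
            finish_times.append(env.now)
            yield env.timeout(comm_delay)  # staff learn the room is free
            yield env.timeout(TURNOVER)    # turnover before the next case

    for delay in (0, COMM_DELAY):
        env = simpy.Environment()
        finishes = []
        env.process(run_day(env, delay, finishes))
        env.run()
        print(f"comm delay {delay:2d} min -> last case ends at t={finishes[-1]} min")
    # A 10-minute delay repeated at each handoff pushes the final case 30 minutes later.
    ```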

  19. Neurophysiological model of the normal and abnormal human pupil

    NASA Technical Reports Server (NTRS)

    Krenz, W.; Robin, M.; Barez, S.; Stark, L.

    1985-01-01

    Anatomical, experimental, and computer simulation studies were used to determine the structure of the neurophysiological model of the pupil size control system. The computer simulation of this model demonstrates the role played by each of the elements in the neurological pathways influencing the size of the pupil. Simulations of the effect of drugs and common abnormalities in the system help to illustrate the workings of the pathways and processes involved. The simulation program allows the user to select the pupil condition (normal or an abnormality), a specific site along the neurological pathway (retina, hypothalamus, etc.), the drug class input (barbiturate, narcotic, etc.), the stimulus/response mode, the display mode, the stimulus type and input waveform, the stimulus or background intensity and frequency, the input and output conditions, and the response at the neuroanatomical site. The model can be used as a teaching aid or as a tool for testing hypotheses regarding the system.

  20. Gravity Behaves Like That?

    NASA Astrophysics Data System (ADS)

    Pazmino, John

    2007-02-01

    Many concepts of chaotic action in astrodynamics can be appreciated through simulations with home computers and software. Many astrodynamical cases are illustrated. Although chaos theory is now applied to spaceflight trajectories, this presentation employs only inert bodies with no onboard impulse, e.g., from rockets or outgassing. Other nongravitational effects are also ignored, such as atmospheric drag, solar pressure, and radiation. The ability to simulate gravity behavior, even if not completely rigorous, on small mass-market computers allows a fuller understanding of the new approach to astrodynamics by home astronomers, scientists outside orbital mechanics, and students in middle and high school. The simulations can also help a lay audience visualize gravity behavior during press conferences, briefings, and public lectures. No review, evaluation, or critique of the programs shown in this presentation is intended. The results from these simulations are not valid for - and must not be used for - making predictions of Earth collisions.

  1. Biochemical simulations: stochastic, approximate stochastic and hybrid approaches.

    PubMed

    Pahle, Jürgen

    2009-01-01

    Computer simulations have become an invaluable tool to study the sometimes counterintuitive temporal dynamics of (bio-)chemical systems. In particular, stochastic simulation methods have attracted increasing interest recently. In contrast to the well-known deterministic approach based on ordinary differential equations, they can capture effects that occur due to the underlying discreteness of the systems and random fluctuations in molecular numbers. Numerous stochastic, approximate stochastic and hybrid simulation methods have been proposed in the literature. In this article, they are systematically reviewed in order to guide the researcher and help her find the appropriate method for a specific problem.
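
    The exact stochastic approach this review surveys is typified by Gillespie's direct method. Below is a minimal sketch for a toy birth-death process; the species, rates, and parameter values are illustrative and not taken from the article.

    ```python
    import random

    def gillespie_birth_death(k_birth=10.0, k_death=0.1, n0=0, t_end=100.0, seed=1):
        random.seed(seed)
        t, n = 0.0, n0
        trajectory = [(t, n)]
        while t < t_end:
            a_birth = k_birth               # propensity of 0 -> X (production)
            a_death = k_death * n           # propensity of X -> 0 (degradation)
            a_total = a_birth + a_death
            t += random.expovariate(a_total)         # exponential time to next event
            if random.random() * a_total < a_birth:  # choose which reaction fires
                n += 1
            else:
                n -= 1
            trajectory.append((t, n))
        return trajectory

    traj = gillespie_birth_death()
    print("final copy number:", traj[-1][1])  # fluctuates around k_birth/k_death = 100
    ```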

  2. Biochemical simulations: stochastic, approximate stochastic and hybrid approaches

    PubMed Central

    2009-01-01

    Computer simulations have become an invaluable tool to study the sometimes counterintuitive temporal dynamics of (bio-)chemical systems. In particular, stochastic simulation methods have attracted increasing interest recently. In contrast to the well-known deterministic approach based on ordinary differential equations, they can capture effects that occur due to the underlying discreteness of the systems and random fluctuations in molecular numbers. Numerous stochastic, approximate stochastic and hybrid simulation methods have been proposed in the literature. In this article, they are systematically reviewed in order to guide the researcher and help her find the appropriate method for a specific problem. PMID:19151097

  3. Problem-Solving Rules for Genetics.

    ERIC Educational Resources Information Center

    Collins, Angelo

    The categories and applications of strategic knowledge as these relate to problem solving in the area of transmission genetics are examined in this research study. The role of computer simulations in helping students acquire the strategic knowledge necessary to solve realistic transmission genetics problems was emphasized. The Genetics…

  4. Science. [SITE 2001 Section].

    ERIC Educational Resources Information Center

    Roach, Linda E., Ed.

    This document contains the following papers on science from the SITE (Society for Information Technology & Teacher Education) 2001 conference: (1) "Using a Computer Simulation before Dissection To Help Students Learn Anatomy" (Joseph Paul Akpan and Thomas Andre); (2) "EARTH2CLASS: A Unique Workshop/On-Line/Distance-Learning…

  5. A programming language for composable DNA circuits

    PubMed Central

    Phillips, Andrew; Cardelli, Luca

    2009-01-01

    Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing. PMID:19535415

  6. A programming language for composable DNA circuits.

    PubMed

    Phillips, Andrew; Cardelli, Luca

    2009-08-06

    Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing.
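
    To give a flavor of the domain-level abstraction such a language works at, the sketch below models a single toehold-mediated strand-displacement step. The domain names, the Gate structure, and the matching rule are invented simplifications for illustration; they are not the authors' syntax or semantics.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Gate:
        toehold: str     # exposed single-stranded toehold domain
        branch: tuple    # double-stranded domains crossed by branch migration
        output: tuple    # strand released if displacement completes

    def displace(input_strand, gate):
        """Input binds the toehold, migrates through the branch, releases the output."""
        needed = (gate.toehold,) + gate.branch
        if tuple(input_strand[: len(needed)]) == needed:
            return gate.output   # displacement succeeds; output strand is freed
        return None              # toehold/branch mismatch: no reaction

    gate = Gate(toehold="t", branch=("x", "y"), output=("y", "z"))
    print(displace(("t", "x", "y"), gate))   # ('y', 'z') released
    print(displace(("t", "x", "w"), gate))   # None: branch migration stalls
    ```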

  7. All-optical 4-bit binary to binary coded decimal converter with the help of semiconductor optical amplifier-assisted Sagnac switch

    NASA Astrophysics Data System (ADS)

    Bhattachryya, Arunava; Kumar Gayen, Dilip; Chattopadhyay, Tanay

    2013-04-01

    An all-optical 4-bit binary to binary-coded decimal (BCD) converter is proposed and described in this manuscript, built with semiconductor optical amplifier (SOA)-assisted Sagnac interferometric switches. The paper describes an all-optical conversion scheme using a set of all-optical switches. BCD is common in computer systems that display numeric values, especially those consisting solely of digital logic with no microprocessor. In many personal computers, the basic input/output system (BIOS) keeps the date and time in BCD format. The operations of the circuit are studied theoretically and analyzed through numerical simulations. The model accounts for the SOA small-signal gain, linewidth enhancement factor and carrier lifetime, the switching pulse energy and width, and the Sagnac loop asymmetry. By undertaking a detailed numerical simulation, the influence of these key parameters on the metrics that determine the quality of switching is thoroughly investigated.

  8. Long-range interactions and parallel scalability in molecular simulations

    NASA Astrophysics Data System (ADS)

    Patra, Michael; Hyvönen, Marja T.; Falck, Emma; Sabouri-Ghomi, Mohsen; Vattulainen, Ilpo; Karttunen, Mikko

    2007-01-01

    Typical biomolecular systems such as cellular membranes, DNA, and protein complexes are highly charged. Thus, efficient and accurate treatment of electrostatic interactions is of great importance in computational modeling of such systems. We have employed the GROMACS simulation package to perform extensive benchmarking of different commonly used electrostatic schemes on a range of computer architectures (Pentium-4, IBM Power 4, and Apple/IBM G5) for single-processor and parallel performance up to 8 nodes. We have also tested scalability on four different networks: InfiniBand, Gigabit Ethernet, Fast Ethernet, and a nearly uniform memory architecture in which CPUs communicate by directly reading from or writing to other CPUs' local memory. It turns out that the particle-mesh Ewald method (PME) performs surprisingly well and offers competitive performance unless parallel runs on PC hardware with older network infrastructure are needed. Lipid bilayers of 128, 512, and 2048 lipid molecules were used as test systems representing typical cases encountered in biomolecular simulations. Our results enable an accurate prediction of computational speed on most current computing systems, both for serial and parallel runs. These results should be helpful in, for example, choosing the most suitable configuration for a small departmental computer cluster.
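
    The accuracy gap between a plain cutoff and a full Coulomb sum, which motivates Ewald-type schemes such as PME, can be illustrated with a toy non-periodic calculation. The system size, charges, and cutoff radii below are arbitrary; this is a sketch of the idea, not a GROMACS benchmark.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 400
    pos = rng.uniform(0.0, 5.0, size=(n, 3))        # nm; no periodic images in this toy
    q = rng.choice([-1.0, 1.0], size=n)             # elementary charges

    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)                    # unique pairs only
    pair_e = q[iu[0]] * q[iu[1]] / d[iu]            # Coulomb energy, arbitrary units

    exact = pair_e.sum()                            # full direct sum over all pairs
    for rc in (0.9, 1.2, 1.8):                      # typical cutoff radii, nm
        truncated = pair_e[d[iu] < rc].sum()        # plain spherical cutoff
        print(f"cutoff {rc} nm: relative energy error "
              f"{abs(truncated - exact) / abs(exact):.1%}")
    ```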

  9. Comparisons of some large scientific computers

    NASA Technical Reports Server (NTRS)

    Credeur, K. R.

    1981-01-01

    In 1975, the National Aeronautics and Space Administration (NASA) began studies to assess the technical and economic feasibility of developing a computer having a sustained computational speed of one billion floating point operations per second and a working memory of at least 240 million words. Such a powerful computer would allow computational aerodynamics to play a major role in aeronautical design and advanced fluid dynamics research. Based on favorable results from these studies, NASA proceeded with developmental plans. The computer was named the Numerical Aerodynamic Simulator (NAS). To help ensure that the estimated cost, schedule, and technical scope were realistic, a brief study was made of past large scientific computers. Large discrepancies between inception and operation in scope, cost, or schedule were studied so that they could be minimized with NASA's proposed new computer. The main computers studied were the ILLIAC IV, STAR 100, Parallel Element Processor Ensemble (PEPE), and Shuttle Mission Simulator (SMS) computer. Comparison data on memory and speed were also obtained on the IBM 650, 704, 7090, 360-50, 360-67, 360-91, and 370-195; the CDC 6400, 6600, 7600, CYBER 203, and CYBER 205; CRAY 1; and the Advanced Scientific Computer (ASC). A few lessons learned conclude the report.

  10. Forecasting techno-social systems: how physics and computing help to fight off global pandemics

    NASA Astrophysics Data System (ADS)

    Vespignani, Alessandro

    2010-03-01

    The crucial issue when planning adequate public health interventions to mitigate the spread and impact of epidemics is risk evaluation and forecast. This amounts to anticipating where, when, and how strongly the epidemic will strike. In the last decade, advances in computer performance, data acquisition, statistical physics, and complex-network theory have allowed the generation of sophisticated simulations on supercomputer infrastructures to anticipate the spreading pattern of a pandemic. For the first time we are in a position to generate real-time forecasts of epidemic spreading. I will review the history of the current H1N1 pandemic, the major roadblocks the community has faced in its containment and mitigation, and how physics and computing provide predictive tools that help us to battle epidemics.
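
    The simplest building block of such epidemic forecasts is the compartmental SIR model. A minimal deterministic sketch follows, with illustrative rates; real forecasting systems layer mobility networks and stochasticity on top of this kind of core.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    beta, gamma = 0.35, 0.14          # transmission and recovery rates (1/day), illustrative
    N = 1e6                           # population size

    def sir(t, y):
        S, I, R = y
        dS = -beta * S * I / N        # susceptibles become infectious
        dI = beta * S * I / N - gamma * I
        return [dS, dI, -dS - dI]     # recovered balance the other two flows

    sol = solve_ivp(sir, (0, 200), [N - 10, 10, 0], dense_output=True)
    t = np.linspace(0, 200, 201)
    I = sol.sol(t)[1]
    print(f"R0 = {beta/gamma:.1f}, epidemic peaks near day {t[I.argmax()]:.0f} "
          f"with {I.max()/N:.1%} of the population infectious")
    ```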

  11. Runtime visualization of the human arterial tree.

    PubMed

    Insley, Joseph A; Papka, Michael E; Dong, Suchuan; Karniadakis, George; Karonis, Nicholas T

    2007-01-01

    Large-scale simulation codes typically execute for extended periods of time and often on distributed computational resources. Because these simulations can run for hours, or even days, scientists like to get feedback about the state of the computation and the validity of its results as it runs. It is also important that these capabilities be made available with little impact on the performance and stability of the simulation. Visualizing and exploring data in the early stages of the simulation can help scientists identify problems early, potentially avoiding a situation where a simulation runs for several days, only to discover that an error with an input parameter caused both time and resources to be wasted. We describe an application that aids in the monitoring and analysis of a simulation of the human arterial tree. The application provides researchers with high-level feedback about the state of the ongoing simulation and enables them to investigate particular areas of interest in greater detail. The application also offers monitoring information about the amount of data produced and data transfer performance among the various components of the application.

  12. The ReaxFF reactive force-field: Development, applications, and future directions

    DOE PAGES

    Senftle, Thomas; Hong, Sungwook; Islam, Md Mahbubul; ...

    2016-03-04

    The reactive force-field (ReaxFF) interatomic potential is a powerful computational tool for exploring, developing and optimizing material properties. Methods based on the principles of quantum mechanics (QM), while offering valuable theoretical guidance at the electronic level, are often too computationally intense for simulations that consider the full dynamic evolution of a system. Alternatively, empirical interatomic potentials that are based on classical principles require significantly fewer computational resources, which enables simulations to better describe dynamic processes over longer timeframes and on larger scales. Such methods, however, typically require a predefined connectivity between atoms, precluding simulations that involve reactive events. The ReaxFF method was developed to help bridge this gap. Approaching the gap from the classical side, ReaxFF casts the empirical interatomic potential within a bond-order formalism, thus implicitly describing chemical bonding without expensive QM calculations. This article provides an overview of the development, application, and future directions of the ReaxFF method.

  13. Engaging Undergraduate Math Majors in Geoscience Research using Interactive Simulations and Computer Art

    NASA Astrophysics Data System (ADS)

    Matott, L. S.; Hymiak, B.; Reslink, C. F.; Baxter, C.; Aziz, S.

    2012-12-01

    As part of the NSF-sponsored 'URGE (Undergraduate Research Group Experiences) to Compute' program, Dr. Matott has been collaborating with talented Math majors to explore the design of cost-effective systems to safeguard groundwater supplies from contaminated sites. Such activity is aided by a combination of groundwater modeling, simulation-based optimization, and high-performance computing - disciplines largely unfamiliar to the students at the outset of the program. To help train and engage the students, a number of interactive and graphical software packages were utilized. Examples include: (1) a tutorial for exploring the behavior of evolutionary algorithms and other heuristic optimizers commonly used in simulation-based optimization; (2) an interactive groundwater modeling package for exploring alternative pump-and-treat containment scenarios at a contaminated site in Billings, Montana; (3) the R software package for visualizing various concepts related to subsurface hydrology; and (4) a job visualization tool for exploring the behavior of numerical experiments run on a large distributed computing cluster. Further engagement and excitement in the program was fostered by entering (and winning) a computer art competition run by the Coalition for Academic Scientific Computation (CASC). The winning submission visualizes an exhaustively mapped optimization cost surface and dramatically illustrates the phenomenon of artificial minima - valley locations that correspond to designs whose costs are only partially optimal.

  14. A Wireless Communications Systems Laboratory Course

    ERIC Educational Resources Information Center

    Guzelgoz, Sabih; Arslan, Huseyin

    2010-01-01

    A novel wireless communications systems laboratory course is introduced. The course teaches students how to design, test, and simulate wireless systems using modern instrumentation and computer-aided design (CAD) software. One of the objectives of the course is to help students understand the theoretical concepts behind wireless communication…

  15. A New Java Animation in Peer-Reviewed "JCE" Webware

    ERIC Educational Resources Information Center

    Coleman, William F.; Fedosky, Edward W.

    2006-01-01

    "Computer Simulations of Salt Solubility" by Victor M. S. Gil provides an animated, visual interpretation of the different solubilities of related salts based on simple entropy changes associated with dissolution such as configurational disorder and thermal disorder. This animation can help improve students' conceptual understanding of…

  16. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.

  17. Using Multistate Reweighting to Rapidly and Efficiently Explore Molecular Simulation Parameters Space for Nonbonded Interactions.

    PubMed

    Paliwal, Himanshu; Shirts, Michael R

    2013-11-12

    Multistate reweighting methods such as the multistate Bennett acceptance ratio (MBAR) can predict free energies and expectation values of thermodynamic observables at poorly sampled or unsampled thermodynamic states using simulations performed at only a few sampled states combined with single-point energy reevaluations of these samples at the unsampled states. In this study, we demonstrate the power of this general reweighting formalism by exploring the effect of simulation parameters controlling Coulomb and Lennard-Jones cutoffs on free energy calculations and other observables. Using multistate reweighting, we can quickly identify, with very high sensitivity, the computationally least expensive nonbonded parameters required to obtain a specified accuracy in observables compared to the answer obtained using an expensive "gold standard" set of parameters. We specifically examine free energy estimates of three molecular transformations in a benchmark molecular set as well as the enthalpy of vaporization of TIP3P. The results demonstrate the power of this multistate reweighting approach for measuring changes in free energy differences or other estimators with respect to simulation or model parameters with very high precision and/or very low computational effort. The results also help identify which simulation parameters affect free energy calculations and provide guidance for determining which parameter choices are both appropriate and computationally efficient in general.
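
    The MBAR estimator at the heart of this approach can be written down compactly. The sketch below implements the self-consistent MBAR iteration for two one-dimensional harmonic states where the exact reduced free energy difference is 0.5·ln(k2/k1); it illustrates the estimator itself and is not the authors' workflow or code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    k = np.array([1.0, 4.0])                  # spring constants of the two states
    N_k = np.array([2000, 2000])              # samples drawn from each state
    # Boltzmann sampling of u_i(x) = k_i x^2 / 2 is a Gaussian with sigma = 1/sqrt(k_i)
    x = np.concatenate([rng.normal(0, 1 / np.sqrt(ki), ni) for ki, ni in zip(k, N_k)])

    u_kn = 0.5 * k[:, None] * x[None, :] ** 2  # reduced energy of every sample in every state

    f = np.zeros(2)                            # reduced free energies; f[0] is the reference
    for _ in range(200):
        # log of the MBAR mixture denominator, sum_k N_k exp(f_k - u_k(x_n))
        log_denom = np.logaddexp.reduce(np.log(N_k)[:, None] + f[:, None] - u_kn, axis=0)
        # f_i = -log sum_n exp(-u_i(x_n)) / denom(x_n)
        f_new = -np.logaddexp.reduce(-u_kn - log_denom, axis=1)
        f = f_new - f_new[0]                   # fix the gauge

    print(f"MBAR estimate: {f[1]:.3f}, exact: {0.5 * np.log(k[1] / k[0]):.3f}")
    ```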

  18. Visualizing ultrasound through computational modeling

    NASA Technical Reports Server (NTRS)

    Guo, Theresa W.

    2004-01-01

    The Doppler Ultrasound Hematocrit Project (DHP) hopes to find non-invasive methods of determining a person's blood characteristics. Because of the limits of microgravity and the space travel environment, it is important to find non-invasive methods of evaluating the health of persons in space. Presently, there is no well-developed method of determining blood composition non-invasively. This project hopes to use ultrasound and Doppler signals to evaluate hematocrit, the percentage by volume of red blood cells within whole blood. These non-invasive techniques may also be developed for use on Earth for trauma patients where invasive measures might be detrimental. Computational modeling is a useful tool for collecting preliminary information and predictions for the laboratory research. We hope to find and develop a computer program that will be able to simulate the ultrasound signals the project will work with. Simulated models of test conditions will more easily show what might be expected from laboratory results, thus helping the research group make informed decisions before and during experimentation. There are several existing MATLAB-based computer programs available, designed to interpret and simulate ultrasound signals. These programs will be evaluated to find which is best suited for the project needs. The criteria of evaluation are: 1) the program must be able to specify transducer properties and transmitting and receiving signals; 2) the program must be able to simulate ultrasound signals through different attenuating media; 3) the program must be able to process moving targets in order to simulate the Doppler effects associated with blood flow; 4) the program should be user friendly and adaptable to various models. After a computer program is chosen, two simulation models will be constructed. These models will simulate and interpret an RF data signal and a Doppler signal.

  19. Fast image-based mitral valve simulation from individualized geometry.

    PubMed

    Villard, Pierre-Frederic; Hammer, Peter E; Perrin, Douglas P; Del Nido, Pedro J; Howe, Robert D

    2018-04-01

    Common surgical procedures on the mitral valve of the heart include modifications to the chordae tendineae. Such interventions are used when there is extensive leaflet prolapse caused by chordae rupture or elongation. Understanding the role of individual chordae tendineae before operating could be helpful to predict whether the mitral valve will be competent at peak systole. Biomechanical modelling and simulation can achieve this goal. We present a method to semi-automatically build a computational model of a mitral valve from micro CT (computed tomography) scans: after manually picking chordae fiducial points, the leaflets are segmented and the boundary conditions as well as the loading conditions are automatically defined. Fast finite element method (FEM) simulation is carried out using Simulation Open Framework Architecture (SOFA) to reproduce leaflet closure at peak systole. We develop three metrics to evaluate simulation results: (i) point-to-surface error with the ground truth reference extracted from the CT image, (ii) coaptation surface area of the leaflets and (iii) an indication of whether the simulated closed leaflets leak. We validate our method on three explanted porcine hearts and show that our model predicts the closed valve surface with point-to-surface error of approximately 1 mm, a reasonable coaptation surface area, and absence of any leak at peak systole (maximum closed pressure). We also evaluate the sensitivity of our model to changes in various parameters (tissue elasticity, mesh accuracy, and the transformation matrix used for CT scan registration). We also measure the influence of the positions of the chordae tendineae on simulation results and show that marginal chordae have a greater influence on the final shape than intermediate chordae. The mitral valve simulation can help the surgeon understand valve behaviour and anticipate the outcome of a procedure. Copyright © 2018 John Wiley & Sons, Ltd.
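
    The first metric, point-to-surface error, is straightforward to compute when the ground-truth surface is represented as a dense point cloud. A minimal sketch with invented toy geometry (not the porcine CT data) follows.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def point_to_surface_error(simulated_pts, ground_truth_pts):
        """Mean/max distance from simulated nodes to the CT-derived surface cloud."""
        tree = cKDTree(ground_truth_pts)
        dists, _ = tree.query(simulated_pts)   # nearest ground-truth point per node
        return dists.mean(), dists.max()

    # Toy data: a flat 10 cm "CT surface" patch sampled every 0.2 mm, and 500
    # simulated nodes displaced out of plane by roughly 1 mm (units: metres).
    g = np.linspace(0.0, 0.1, 501)
    gx, gy = np.meshgrid(g, g)
    truth = np.column_stack([gx.ravel(), gy.ravel(), np.zeros(gx.size)])

    rng = np.random.default_rng(0)
    sim = np.column_stack([rng.uniform(0.01, 0.09, 500),
                           rng.uniform(0.01, 0.09, 500),
                           rng.normal(scale=0.001, size=500)])

    mean_e, max_e = point_to_surface_error(sim, truth)
    print(f"mean error {mean_e*1000:.2f} mm, max {max_e*1000:.2f} mm")
    ```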

  20. Coupling MD Simulations and X-ray Absorption Spectroscopy to Study Ions in Solution

    NASA Astrophysics Data System (ADS)

    Marcos, E. Sánchez; Beret, E. C.; Martínez, J. M.; Pappalardo, R. R.; Ayala, R.; Muñoz-Páez, A.

    2007-12-01

    The structure of ionic solutions is a key point in understanding the physicochemical properties of electrolyte solutions. Among the small number of experimental techniques that can supply direct information on the ion environment, X-ray Absorption techniques (XAS) have gained importance during the last decades, although they are not free of the difficulties in data analysis required to obtain reliable structures. Computer simulation of ions in solution is a theoretical alternative for providing information on the solvation structure. Thus, the use of computational chemistry can increase the understanding of these systems, although an accurate description of ionic solvation phenomena still represents a significant challenge to theoretical chemistry. We present: (a) the assignment of features in the XANES spectrum to well-defined structural motifs in the ion environment, (b) MD-based evaluation of EXAFS parameters used in the fitting procedure to ease the structural resolution, and (c) the use of the agreement between experimental and simulated XANES spectra to help in the choice of a given intermolecular potential for computer simulations. Chemical problems examined are: (a) the identification of the second hydration shell in dilute aqueous solutions of highly charged cations, such as Cr3+, Rh3+, Ir3+, (b) the invisibility by XAS of certain structures characterized by computer simulations but exhibiting highly dynamical behavior, and (c) the solvation of Br- in acetonitrile.

  1. Coupling MD Simulations and X-ray Absorption Spectroscopy to Study Ions in Solution

    NASA Astrophysics Data System (ADS)

    Marcos, E. Sánchez; Beret, E. C.; Martínez, J. M.; Pappalardo, R. R.; Ayala, R.; Muñoz-Páez, A.

    2007-11-01

    The structure of ionic solutions is a key point in understanding the physicochemical properties of electrolyte solutions. Among the small number of experimental techniques that can supply direct information on the ion environment, X-ray Absorption techniques (XAS) have gained importance during the last decades, although they are not free of the difficulties in data analysis required to obtain reliable structures. Computer simulation of ions in solution is a theoretical alternative for providing information on the solvation structure. Thus, the use of computational chemistry can increase the understanding of these systems, although an accurate description of ionic solvation phenomena still represents a significant challenge to theoretical chemistry. We present: (a) the assignment of features in the XANES spectrum to well-defined structural motifs in the ion environment, (b) MD-based evaluation of EXAFS parameters used in the fitting procedure to ease the structural resolution, and (c) the use of the agreement between experimental and simulated XANES spectra to help in the choice of a given intermolecular potential for computer simulations. Chemical problems examined are: (a) the identification of the second hydration shell in dilute aqueous solutions of highly charged cations, such as Cr3+, Rh3+, Ir3+, (b) the invisibility by XAS of certain structures characterized by computer simulations but exhibiting highly dynamical behavior, and (c) the solvation of Br- in acetonitrile.

  2. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments

    PubMed Central

    Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H.A.; Hlavacek, William S.; Posner, Richard G.

    2016-01-01

    Summary: Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. Availability and implementation: BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary information: Supplementary data are available at Bioinformatics online. Contact: bionetgen.help@gmail.com PMID:26556387

  3. Computer simulation of the human respiratory system for educational purposes.

    PubMed

    Botsis, Taxiarhis; Halkiotis, Stelios-Chris; Kourlaba, Georgia

    2004-01-01

    The main objective of this study was the development of a computer simulation system for the human respiratory system, in order to educate students of nursing. This approach was based on existing mathematical models and on our own constructed specific functions. For the development of this educational tool, appropriate software packages were used according to the special demands of this process. This system is called ReSim (Respiratory Simulation) and consists of two parts: the first part deals with pulmonary volumes and the second represents the mechanical behavior of the lungs. The target group evaluated ReSim. The outcomes of the evaluation process were positive and helped us identify the system characteristics that needed improvement. Our basic conclusion is that the extended use of such systems supports the educational process and offers new potential for learning.

  4. DEVELOPING A CAPE-OPEN COMPLIANT METAL FINISHING FACILITY POLLUTION PREVENTION TOOL (CO-MFFP2T)

    EPA Science Inventory

    The USEPA is developing a Computer Aided Process Engineering (CAPE) software tool for the metal finishing industry that helps users design efficient metal finishing processes that are less polluting to the environment. Metal finishing process lines can be simulated and evaluated...

  5. IMPLEMENTATION OF A CAPE-OPEN COMPLIANT PROCESS SIMULATOR USING MICROSOFT'S VISUAL STUDIO.NET AND THE .NET FRAMEWORK

    EPA Science Inventory

    The United States Environmental Protection Agency is developing a Computer Aided Process Engineering (CAPE) software tool for the metal finishing industry that helps users design efficient metal finishing processes that are less polluting to the environment. Metal finish...

  6. Graphics processing unit based computation for NDE applications

    NASA Astrophysics Data System (ADS)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to reduce the cost of numerical simulation. Breakthroughs in Graphical Processing Unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general-purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes applicable to two problems of interest to the NDE community, namely heat diffusion and elastic wave propagation. The implementations are two-dimensional. Performance improvement of the GPU implementation against a serial CPU implementation is then discussed.
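
    The appeal of GPUs for such stencil codes is that every grid point updates independently. The sketch below shows the explicit finite-difference update for 2-D heat diffusion in vectorized NumPy form; a CUDA kernel would assign one thread to the same arithmetic per grid point. The grid size and coefficients are illustrative, not the paper's configuration.

    ```python
    import numpy as np

    nx = ny = 256
    alpha, dx = 1.0, 1.0
    dt = 0.2 * dx * dx / alpha           # comfortably below the dx^2/(4*alpha) stability limit

    T = np.zeros((nx, ny))
    T[nx // 2, ny // 2] = 1000.0         # hot spot in the middle of the plate

    for _ in range(500):
        # 5-point Laplacian over all interior points at once
        lap = (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
               - 4.0 * T[1:-1, 1:-1]) / (dx * dx)
        T[1:-1, 1:-1] += alpha * dt * lap   # one data-parallel update per grid point

    print(f"peak temperature after 500 steps: {T.max():.1f}")
    ```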

  7. Face and construct validity of a computer-based virtual reality simulator for ERCP.

    PubMed

    Bittner, James G; Mellinger, John D; Imam, Toufic; Schade, Robert R; Macfadyen, Bruce V

    2010-02-01

    Background: Currently, little evidence supports computer-based simulation for ERCP training. Objective: To determine face and construct validity of a computer-based simulator for ERCP and assess its perceived utility as a training tool. Design: Novice and expert endoscopists completed 2 simulated ERCP cases by using the GI Mentor II. Setting: Virtual Education and Surgical Simulation Laboratory, Medical College of Georgia. Main outcome measurements: Times to complete the procedure, reach the papilla, and use fluoroscopy; attempts to cannulate the papilla, pancreatic duct, and common bile duct; and number of contrast injections and complications. Subjects also assessed simulator graphics, procedural accuracy, difficulty, haptics, overall realism, and training potential. Results: Only when performance data from cases A and B were combined did the GI Mentor II differentiate novices and experts, based on times to complete the procedure, reach the papilla, and use fluoroscopy. Across skill levels, overall opinions were similar regarding graphics (moderately realistic), accuracy (similar to clinical ERCP), difficulty (similar to clinical ERCP), overall realism (moderately realistic), and haptics. Most participants (92%) claimed that the simulator has definite training potential or should be required for training. Limitations: Small sample size, single institution. Conclusions: The GI Mentor II demonstrated construct validity for ERCP based on select metrics. Most subjects thought that the simulated graphics, procedural accuracy, and overall realism exhibit face validity. Subjects deemed it a useful training tool. Study repetition involving more participants and cases may help confirm results and establish the simulator's ability to differentiate skill levels based on ERCP-specific metrics.
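
    Construct validity of this kind is typically checked by testing whether expert and novice performance metrics differ. A minimal sketch with invented timing data (not the study's measurements) follows.

    ```python
    from scipy.stats import mannwhitneyu

    novice_min = [34.5, 29.1, 41.2, 37.8, 26.4, 33.0]   # hypothetical procedure times (min)
    expert_min = [18.2, 22.7, 16.9, 20.4, 19.8, 24.1]

    # One-sided test: novices take longer than experts
    stat, p = mannwhitneyu(novice_min, expert_min, alternative="greater")
    print(f"U = {stat:.1f}, p = {p:.4f}")   # a small p-value supports construct validity
    ```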

  8. Unsteady flow simulations around complex geometries using stationary or rotating unstructured grids

    NASA Astrophysics Data System (ADS)

    Sezer-Uzol, Nilay

    In this research, the computational analysis of three-dimensional, unsteady, separated, vortical flows around complex geometries is studied using stationary or moving unstructured grids. Two main engineering problems are investigated. The first is the unsteady simulation of a ship airwake, in which helicopter operations become even more challenging, using stationary unstructured grids. The second is the unsteady simulation of wind turbine rotor flow fields using moving unstructured grids that rotate with the whole three-dimensional rigid rotor geometry. The three-dimensional, unsteady, parallel, unstructured, finite volume flow solver PUMA2 is used for the computational fluid dynamics (CFD) simulations considered in this research. The code is modified to have a moving-grid capability to perform three-dimensional, time-dependent rotor simulations. An instantaneous log-law wall model for Large Eddy Simulations is also implemented in PUMA2 to investigate the very high Reynolds number flow fields of rotating blades. To verify the code modifications, several sample test cases are also considered. In addition, interdisciplinary studies aiming to provide new tools and insights to the aerospace and wind energy scientific communities were carried out by focusing on the coupling of ship airwake CFD simulations with helicopter flight dynamics and control analysis, the coupling of wind turbine rotor CFD simulations with aeroacoustic analysis, and the analysis of these time-dependent and large-scale CFD simulations with the help of a computational monitoring, steering and visualization tool, POSSE.

  9. Draper Station Analysis Tool

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  10. Model-based surgical planning and simulation of cranial base surgery.

    PubMed

    Abe, M; Tabuchi, K; Goto, M; Uchino, A

    1998-11-01

    Plastic skull models of seven individual patients were fabricated by stereolithography from three-dimensional data based on computed tomography bone images. Skull models were utilized for neurosurgical planning and simulation in the seven patients with cranial base lesions that were difficult to remove. Surgical approaches and areas of craniotomy were evaluated using the fabricated skull models. In preoperative simulations, hand-made models of the tumors, major vessels and nerves were placed in the skull models. Step-by-step simulation of surgical procedures was performed using actual surgical tools. The advantages of using skull models to plan and simulate cranial base surgery include a better understanding of anatomic relationships, preoperative evaluation of the proposed procedure, increased understanding by the patient and family, and improved educational experiences for residents and other medical staff. The disadvantages of using skull models include the time and cost of making the models. The skull models provide a more realistic tool that is easier to handle than computer-graphic images. Surgical simulation using models facilitates difficult cranial base surgery and may help reduce surgical complications.

  11. Physical Processes and Applications of the Monte Carlo Radiative Energy Deposition (MRED) Code

    NASA Astrophysics Data System (ADS)

    Reed, Robert A.; Weller, Robert A.; Mendenhall, Marcus H.; Fleetwood, Daniel M.; Warren, Kevin M.; Sierawski, Brian D.; King, Michael P.; Schrimpf, Ronald D.; Auden, Elizabeth C.

    2015-08-01

    MRED is a Python-language scriptable computer application that simulates radiation transport. It is the computational engine for the on-line tool CRÈME-MC. MRED is based on C++ code from Geant4 with additional Fortran components to simulate electron transport and nuclear reactions with high precision. We provide a detailed description of the structure of MRED and the implementation of the simulation of physical processes used to simulate radiation effects in electronic devices and circuits. Extensive discussion and references are provided that illustrate the validation of models used to implement specific simulations of relevant physical processes. Several applications of MRED are summarized that demonstrate its ability to predict and describe basic physical phenomena associated with irradiation of electronic circuits and devices. These include effects from single particle radiation (including both direct ionization and indirect ionization effects), dose enhancement effects, and displacement damage effects. MRED simulations have also helped to identify new single event upset mechanisms not previously observed by experiment, but since confirmed, including upsets due to muons and energetic electrons.

  12. Image formation simulation for computer-aided inspection planning of machine vision systems

    NASA Astrophysics Data System (ADS)

    Irgenfried, Stephan; Bergmann, Stephan; Mohammadikaji, Mahsa; Beyerer, Jürgen; Dachsbacher, Carsten; Wörn, Heinz

    2017-06-01

    In this work, a simulation toolset for Computer Aided Inspection Planning (CAIP) of systems for automated optical inspection (AOI) is presented along with a versatile two-robot-setup for verification of simulation and system planning results. The toolset helps to narrow down the large design space of optical inspection systems in interaction with a system expert. The image formation taking place in optical inspection systems is simulated using GPU-based real time graphics and high quality off-line-rendering. The simulation pipeline allows a stepwise optimization of the system, from fast evaluation of surface patch visibility based on real time graphics up to evaluation of image processing results based on off-line global illumination calculation. A focus of this work is on the dependency of simulation quality on measuring, modeling and parameterizing the optical surface properties of the object to be inspected. The applicability to real world problems is demonstrated by taking the example of planning a 3D laser scanner application. Qualitative and quantitative comparison results of synthetic and real images are presented.

  13. Integrating GIS and ABM to Explore Spatiotemporal Dynamics

    NASA Astrophysics Data System (ADS)

    Sun, M.; Jiang, Y.; Yang, C.

    2013-12-01

    Agent-based modeling, as a methodology for bottom-up exploration that accounts for the adaptive behavior and heterogeneity of system components, can help discover the development and pattern of complex social and environmental systems. However, ABM is a computationally intensive process, especially when the number of system components becomes large and the agent-agent/agent-environment interactions are modeled in great detail. Most traditional ABM frameworks developed for CPUs do not have satisfying computing capacity. To address this problem, and with the emergence of advanced techniques, GPU computing with CUDA can provide a powerful parallel architecture to enable complex simulation of spatiotemporal dynamics. In this study, we first develop a GPU-based ABM system. Secondly, in order to visualize the dynamics generated from the movement of agents and the change of agent/environmental attributes during the simulation, we integrate GIS into the ABM system. Advanced geovisualization technologies can be utilized for representing spatiotemporal change events, such as proper 2D/3D maps with state-of-the-art symbols, space-time cubes, and multiple layers each of which presents the pattern in one time-stamp. Thirdly, visual analytics, including interactive tools (e.g. grouping, filtering, linking, etc.), is included in our ABM-GIS system to help users conduct real-time data exploration during the progress of a simulation. Analyses such as flow analysis and spatial cluster analysis can be integrated according to the geographical problem we want to explore.
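
    The reason ABM maps well onto CUDA is that agent updates are data-parallel. The NumPy sketch below mimics what a GPU kernel would do per thread, with every agent moving and sampling a GIS-like raster at once; the agent count, world size, and raster are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_agents, world = 100_000, 512
    pos = rng.uniform(0, world, size=(n_agents, 2))   # one row per agent
    resource = rng.random((world, world))             # toy environmental raster (GIS layer)

    for step in range(10):
        # every agent takes a random-walk step at once (one "thread" per row on a GPU)
        pos = (pos + rng.normal(scale=1.0, size=pos.shape)) % world
        cells = pos.astype(int)
        local = resource[cells[:, 0], cells[:, 1]]    # each agent reads its raster cell

    print("mean local resource sampled:", local.mean())
    ```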

  14. Study of eigenfrequencies with the help of Prony's method

    NASA Astrophysics Data System (ADS)

    Drobakhin, O. O.; Olevskyi, O. V.; Olevskyi, V. I.

    2017-10-01

    Eigenfrequencies can be crucial in the design of a structure. They determine many of the limiting parameters of the structure, and exceeding these values can lead to structural failure. This is especially important in the design of structures that support heavy equipment or are subjected to airflow forces. One of the most effective ways to acquire the frequency values is computer-based numerical simulation, but existing methods do not yield the whole range of needed parameters. It is well known that Prony's method is highly effective for the investigation of dynamic processes; thus, it is rational to adapt Prony's method for such investigation. Prony's method has an advantage over other numerical schemes in that it can process not only the results of numerical simulation, but also real experimental data. The research was carried out for a computer model of a steel plate. The input data were obtained using the Dassault Systèmes SolidWorks computer package with the Simulation add-on. We investigated the acquired input data with the help of Prony's method. The results of the numerical experiment show that Prony's method can be used to investigate mechanical eigenfrequencies with good accuracy. The output of Prony's method contains not only the values of the frequencies themselves, but also the amplitudes, initial phases, and decay factors of any given mode of oscillation, which can also be used in engineering.
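
    For reference, a compact implementation of the classical Prony fit (linear prediction, pole extraction, then least-squares amplitudes) is sketched below on a synthetic damped sinusoid. The signal and model order are illustrative, not the steel-plate data from the paper.

    ```python
    import numpy as np

    def prony(x, p, dt):
        # 1) linear prediction: x[n] = -sum_m a[m] * x[n-m]
        H = np.column_stack([x[p - m: len(x) - m] for m in range(1, p + 1)])
        a, *_ = np.linalg.lstsq(H, -x[p:], rcond=None)
        # 2) poles are the roots of z^p + a1 z^(p-1) + ... + ap
        z = np.roots(np.concatenate(([1.0], a)))
        # 3) complex amplitudes by least squares against the pole powers
        V = z[None, :] ** np.arange(len(x))[:, None]
        A, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
        freq = np.angle(z) / (2 * np.pi * dt)    # Hz
        decay = np.log(np.abs(z)) / dt           # 1/s (negative = damped)
        return freq, decay, A

    dt = 1e-3
    t = np.arange(1000) * dt
    x = np.exp(-2.0 * t) * np.cos(2 * np.pi * 50.0 * t)   # 50 Hz mode decaying at 2 1/s
    freq, decay, _ = prony(x, p=2, dt=dt)
    i = np.argmax(freq)                                    # pick the positive-frequency pole
    print(f"recovered: {freq[i]:.1f} Hz, decay {decay[i]:.2f} 1/s")
    ```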

  15. Strain System for the Motion Base Shuttle Mission Simulator

    NASA Technical Reports Server (NTRS)

    Huber, David C.; Van Vossen, Karl G.; Kunkel, Glenn W.; Wells, Larry W.

    2010-01-01

    The Motion Base Shuttle Mission Simulator (MBSMS) Strain System is an innovative engineering tool used to monitor the stresses applied to the MBSMS motion platform tilt pivot frames during motion simulations in real time. The Strain System comprises hardware and software produced by several different companies. The system utilizes a series of strain gages, accelerometers, orientation sensor, rotational meter, scanners, computer, and software packages working in unison. By monitoring and recording the inputs applied to the simulator, data can be analyzed if weld cracks or other problems are found during routine simulator inspections. This will help engineers diagnose problems as well as aid in repair solutions for both current as well as potential problems.

  16. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.

  17. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.

  18. Simulation software: engineer processes before reengineering.

    PubMed

    Lepley, C J

    2001-01-01

    People make decisions all the time using intuition. But what happens when you are asked: "Are you sure your predictions are accurate? How much will a mistake cost? What are the risks associated with this change?" Once a new process is engineered, it is difficult to analyze what would have been different if other options had been chosen. Simulating a process can help senior clinical officers solve complex patient flow problems and avoid wasted efforts. Simulation software can give you the data you need to make decisions. The author introduces concepts, methodologies, and applications of computer aided simulation to illustrate their use in making decisions to improve workflow design.

  19. Modeling Effects of RNA on Capsid Assembly Pathways via Coarse-Grained Stochastic Simulation

    PubMed Central

    Smith, Gregory R.; Xie, Lu; Schwartz, Russell

    2016-01-01

    The environment of a living cell is vastly different from that of an in vitro reaction system, an issue that presents great challenges to the use of in vitro models, or computer simulations based on them, for understanding biochemistry in vivo. Virus capsids make an excellent model system for such questions because they typically have few distinct components, making them amenable to in vitro and modeling studies, yet their assembly can involve complex networks of possible reactions that cannot be resolved in detail by any current experimental technology. We previously fit kinetic simulation parameters to bulk in vitro assembly data to yield a close match between simulated and real data, and then used the simulations to study features of assembly that cannot be monitored experimentally. The present work seeks to project how assembly in these simulations fit to in vitro data would be altered by computationally adding features of the cellular environment to the system, specifically the presence of nucleic acid about which many capsids assemble. The major challenge of such work is computational: simulating fine-scale assembly pathways on the scale and in the parameter domains of real viruses is far too computationally costly to allow for explicit models of nucleic acid interaction. We bypass that limitation by applying analytical models of nucleic acid effects to adjust kinetic rate parameters learned from in vitro data to see how these adjustments, singly or in combination, might affect fine-scale assembly progress. The resulting simulations exhibit surprising behavioral complexity, with distinct effects often acting synergistically to drive efficient assembly and alter pathways relative to the in vitro model. The work demonstrates how computer simulations can help us understand how assembly might differ between the in vitro and in vivo environments and what features of the cellular environment account for these differences. PMID:27244559

  20. Using a million cell simulation of the cerebellum: network scaling and task generality.

    PubMed

    Li, Wen-Ke; Hausknecht, Matthew J; Stone, Peter; Mauk, Michael D

    2013-11-01

    Several factors combine to make it feasible to build computer simulations of the cerebellum and to test them in biologically realistic ways. These simulations can be used to help understand the computational contributions of various cerebellar components, including the relevance of the enormous number of neurons in the granule cell layer. In previous work we used a simulation containing 12,000 granule cells to develop new predictions and to account for various aspects of eyelid conditioning, a form of motor learning mediated by the cerebellum. Here we demonstrate the feasibility of scaling up this simulation to over one million granule cells using parallel graphics processing unit (GPU) technology. We observe that this increase in the number of granule cells requires only twice the execution time of the smaller simulation on the GPU. We demonstrate that this simulation, like its smaller predecessor, can emulate certain basic features of conditioned eyelid responses, with a slight improvement in performance on one measure. We also use this simulation to examine the generality of the computational properties that we have derived from studying eyelid conditioning. We demonstrate that this scaled-up simulation can learn a high level of performance in a classic machine learning task, the cart-pole balancing task. These results suggest that this parallel GPU technology can be used to build very large-scale simulations whose connectivity ratios match those of the real cerebellum and that these simulations can be used to guide future studies on cerebellar-mediated tasks and on machine learning problems. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. RAVEN: a GUI and an Artificial Intelligence Engine in a Dynamic PRA Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. Rabiti; D. Mandelli; A. Alfonsi

    Increases in computational power and pressure for more accurate simulations and estimations of accident scenario consequences are driving the need for Dynamic Probabilistic Risk Assessment (PRA) [1] of very complex models. While more sophisticated algorithms and computational power address the back end of this challenge, the front end is still handled by engineers who need to extract meaningful information from large amounts of data and build these complex models. Compounding this problem is the difficulty of knowledge transfer and retention, and the increasing speed of software development. The issues described above would have negatively impacted deployment of the new high-fidelity plant simulator RELAP-7 (Reactor Excursion and Leak Analysis Program) at Idaho National Laboratory. Therefore, RAVEN, which was initially conceived as the plant controller for RELAP-7, will help mitigate future RELAP-7 software engineering risks. To accomplish this task, the Reactor Analysis and Virtual Control Environment (RAVEN) has been designed to provide an easy-to-use Graphical User Interface (GUI) for building plant models and to leverage artificial intelligence algorithms in order to reduce computational time, improve results, and help the user identify the behavioral patterns of Nuclear Power Plants (NPPs). In this paper we present the GUI implementation and its current capability status. We also introduce the support vector machine algorithms and present our evaluation of their potential for increasing the accuracy and reducing the computational costs of PRA analysis. In this evaluation we refer to preliminary studies performed under the Risk Informed Safety Margins Characterization (RISMC) project of the Light Water Reactors Sustainability (LWRS) campaign [3]. RISMC simulation needs and algorithm testing are currently used as guidance to prioritize RAVEN developments relevant to PRA.

  2. Advanced Computation in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Tang, William

    2001-10-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas, with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales, together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs). A good example is the effective use of the full power of multi-teraflop MPPs to produce three-dimensional, general-geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present-generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.

  3. Resources and Approaches for Teaching Quantitative and Computational Skills in the Geosciences and Allied Fields

    NASA Astrophysics Data System (ADS)

    Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.

    2016-12-01

    Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems—all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve this comfort and skill, computational and quantitative thinking must build over a 4-year degree program, across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB, to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models. The outcomes also include workshop syntheses that highlight best practices, a set of webpages to support teaching with software such as MATLAB, and an interest group actively discussing these issues in Geoscience and allied fields. Learn more and view the resources at http://serc.carleton.edu/matlab_computation2016/index.html

  4. Use of Simulation Learning Experiences in Physical Therapy Entry-to-Practice Curricula: A Systematic Review

    PubMed Central

    Carnahan, Heather; Herold, Jodi

    2015-01-01

    ABSTRACT Purpose: To review the literature on simulation-based learning experiences and to examine their potential to have a positive impact on physiotherapy (PT) learners' knowledge, skills, and attitudes in entry-to-practice curricula. Method: A systematic literature search was conducted in the MEDLINE, CINAHL, Embase Classic+Embase, Scopus, and Web of Science databases, using keywords such as physical therapy, simulation, education, and students. Results: A total of 820 abstracts were screened, and 23 articles were included in the systematic review. While there were few randomized controlled trials with validated outcome measures, some findings about simulation can positively inform the design of PT entry-to-practice curricula. Using simulators to provide specific output feedback can help students learn specific skills. Computer simulations can also augment students' learning experience. Human simulation experiences in managing the acute patient in the ICU are well received by students, positively influence their confidence, and decrease their anxiety. There is evidence that simulated learning environments can replace a portion of a full-time 4-week clinical rotation without impairing learning. Conclusions: Simulation-based learning activities are being effectively incorporated into PT curricula. More rigorously designed experimental studies that include a cost–benefit analysis are necessary to help curriculum developers make informed choices in curriculum design. PMID:25931672

  5. Monte Carlo errors with less errors

    NASA Astrophysics Data System (ADS)

    Wolff, Ulli; Alpha Collaboration

    2004-01-01

    We explain in detail how to estimate mean values and assess statistical errors for arbitrary functions of elementary observables in Monte Carlo simulations. The method is to estimate and sum the relevant autocorrelation functions, which is argued to produce more certain error estimates than binning techniques and hence to help toward a better exploitation of expensive simulations. An effective integrated autocorrelation time is computed, which is suitable for benchmarking the efficiency of simulation algorithms with regard to specific observables of interest. A Matlab code implementing the method is offered for download. It can also combine independent runs (replica), allowing the user to judge their consistency.
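
    The approach can be illustrated with a short numerical sketch. The code below estimates an integrated autocorrelation time by summing the normalized autocorrelation function up to a self-consistent window, then corrects the naive error of the mean. It is a simplified stand-in for the paper's Matlab implementation (the window constant, the dropped lag-count correction, and the AR(1) test series are assumptions for the example), not a reimplementation of it.

```python
import numpy as np

def integrated_autocorr_time(x, c=6.0):
    """Estimate tau_int of a Monte Carlo time series by summing the
    normalized autocorrelation function up to a self-consistent
    window W ~ c * tau_int (lag-count correction dropped for brevity)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    f = np.fft.rfft(x, n=2 * n)                  # zero-pad for linear ACF
    acf = np.fft.irfft(f * np.conj(f))[:n].real
    acf /= acf[0]
    tau = 0.5
    for w in range(1, n // 2):
        tau += acf[w]
        if w >= c * tau:                         # self-consistent window
            break
    return tau, w

# toy correlated series: AR(1) process with known tau_int ~ 9.5
rng = np.random.default_rng(0)
x = np.empty(100_000)
x[0] = 0.0
for t in range(1, len(x)):
    x[t] = 0.9 * x[t - 1] + rng.normal()

tau, w = integrated_autocorr_time(x)
err = np.sqrt(2.0 * tau * np.var(x) / len(x))    # error of the mean
print(f"tau_int ~ {tau:.1f} (window {w}), error of mean ~ {err:.4f}")
```

    The point of the correction is visible in the last line: ignoring autocorrelations would underestimate the error of the mean by a factor of roughly sqrt(2 * tau_int).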

  6. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  7. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  8. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    NASA Astrophysics Data System (ADS)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  9. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE PAGES

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  10. Symplectic multi-particle tracking on GPUs

    NASA Astrophysics Data System (ADS)

    Liu, Zhicong; Qiang, Ji

    2018-05-01

    A symplectic multi-particle tracking model is implemented on Graphics Processing Units (GPUs) using the Compute Unified Device Architecture (CUDA) language. The symplectic tracking model preserves phase-space structure and reduces non-physical effects in long-term simulations, which is important for beam property evaluation in particle accelerators. Though this model is computationally expensive, it is very suitable for parallelization and can be accelerated significantly using GPUs. In this paper, we optimized the implementation of the symplectic tracking model on both a single GPU and multiple GPUs. Using a single GPU processor, the code achieves a factor of 2-10 speedup for a range of problem sizes compared with a single state-of-the-art Central Processing Unit (CPU) node of similar power consumption and semiconductor technology. It also shows good scalability on a multi-GPU cluster at the Oak Ridge Leadership Computing Facility. In an application to beam dynamics simulation, the GPU implementation saves more than a factor of two in total computing time in comparison to the CPU implementation.
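
    The core of any symplectic tracking code is a map that alternates momentum kicks with position drifts. The sketch below is a minimal NumPy version of the second-order kick-drift-kick (velocity Verlet) map applied to a linear focusing channel; it illustrates why symplectic maps keep energy errors bounded in long-term tracking, and is not the paper's CUDA implementation.

```python
import numpy as np

def leapfrog_kick_drift(q, p, force, dt, steps):
    """Second-order symplectic (kick-drift-kick) integrator. Preserves
    phase-space structure, so energy errors stay bounded over long
    tracking runs instead of drifting secularly."""
    p = p + 0.5 * dt * force(q)          # opening half kick
    for _ in range(steps - 1):
        q = q + dt * p                   # drift (unit mass)
        p = p + dt * force(q)            # full kick
    q = q + dt * p
    p = p + 0.5 * dt * force(q)          # closing half kick
    return q, p

# linear focusing channel (harmonic oscillator), many particles at once
k = 2.0
force = lambda q: -k * q
rng = np.random.default_rng(1)
q = rng.normal(size=10_000)              # transverse positions
p = rng.normal(size=10_000)              # transverse momenta
E0 = np.mean(0.5 * (p**2 + k * q**2))
q, p = leapfrog_kick_drift(q, p, force, dt=0.01, steps=10_000)
E1 = np.mean(0.5 * (p**2 + k * q**2))
print(f"relative energy drift after 10k steps: {abs(E1 - E0) / E0:.2e}")
```

    Because every particle is advanced by the same map, the particle loop parallelizes trivially, which is what makes the GPU port described above so effective.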

  11. Simulation of the bimetal cast in the case of milling rolls

    NASA Astrophysics Data System (ADS)

    Mihut, G.; Popa, E.

    2015-06-01

    The main aim of this paper is to obtain a numerical simulation model that is generally valid and applicable to the particular cases of bimetal casting. With its help, the possibilities for optimizing the flow conditions of the liquid alloy, the distribution of the temperature field, the liquid phase, and the contraction during solidification can be studied on a computer, at minimum cost (covering only the software and computing equipment) and in a very short time.

  12. Data mining to support simulation modeling of patient flow in hospitals.

    PubMed

    Isken, Mark W; Rajagopalan, Balaji

    2002-04-01

    Spiraling health care costs in the United States are driving institutions to continually address the challenge of optimizing the use of scarce resources. One of the first steps towards optimizing resources is to utilize capacity effectively. For hospital capacity planning problems such as the allocation of inpatient beds, computer simulation is often the method of choice. One of the more difficult aspects of using simulation models for such studies is the creation of a manageable set of patient types to include in the model. The objective of this paper is to demonstrate the potential of using data mining techniques, specifically clustering techniques such as K-means, to help guide the development of patient type definitions for purposes of building computer simulation or analytical models of patient flow in hospitals. Using data from a hospital in the Midwest, this study brings forth several important issues that researchers need to address when applying clustering techniques in general, and specifically to hospital data.
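
    To make the clustering step concrete, here is a minimal sketch using scikit-learn's K-means on synthetic patient attributes; the features (length of stay, age, unit transfers) and all numbers are hypothetical stand-ins, since the paper's hospital data are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical patient attributes for 1,000 admissions
rng = np.random.default_rng(42)
patients = np.column_stack([
    rng.exponential(4.0, 1_000),     # length of stay (days)
    rng.normal(62, 15, 1_000),       # age (years)
    rng.poisson(1.5, 1_000),         # number of unit transfers
])

# Scale first: K-means uses Euclidean distance, so an unscaled feature
# with large variance would dominate the clustering.
X = StandardScaler().fit_transform(patients)
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)

# Each cluster becomes one candidate "patient type" for the flow model.
for label in range(km.n_clusters):
    group = patients[km.labels_ == label]
    print(f"type {label}: n={len(group)}, mean LOS={group[:, 0].mean():.1f} d")
```

    Issues the paper raises, such as choosing the number of clusters and scaling mixed-unit attributes, show up directly in the two commented choices above.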

  13. A comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements.

    PubMed

    Abdelgaied, A; Fisher, J; Jennings, L M

    2018-02-01

    A more robust pre-clinical wear simulation framework is required in order to simulate the wider and higher ranges of activities observed in different patient populations, such as younger, more active patients. Such a framework will help to understand and address the reported higher failure rates for younger and more active patients (National_Joint_Registry, 2016). The current study developed and validated a comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements (TKR). The input mechanical parameters (elastic modulus and Poisson's ratio) and wear parameters of the moderately cross-linked ultra-high molecular weight polyethylene (UHMWPE) bearing material were independently measured in experimental studies under realistic test conditions, similar to the loading conditions found in total knee replacements. The wear predictions from the computational wear simulation were validated against direct experimental wear measurements for size 3 Sigma curved total knee replacements (DePuy, UK) in an independent experimental wear simulation study under three different daily activities: walking, deep squat, and stair-ascending kinematic conditions. The measured compressive mechanical properties of the moderately cross-linked UHMWPE material were more than 20% lower than those reported in the literature under tensile test conditions. The pin-on-plate wear coefficient of moderately cross-linked UHMWPE was significantly dependent on the contact stress and the degree of cross-shear at the articulating surfaces. The computational wear predictions for the TKR from the current framework were consistent and in good agreement with the independent full-TKR experimental wear simulation measurements, with a coefficient of determination of 0.94 for the framework. In addition, the comprehensive combined experimental and computational framework was able to explain the complex experimental wear trends from the three different daily activities investigated. Therefore, such a framework can be adopted as a pre-clinical simulation approach to optimize different designs and materials, as well as patient-specific total knee replacements, for a range of activities. Copyright © 2017. Published by Elsevier Ltd.

  14. Constructing a patient-specific computer model of the upper airway in sleep apnea patients.

    PubMed

    Dhaliwal, Sandeep S; Hesabgar, Seyyed M; Haddad, Seyyed M H; Ladak, Hanif; Samani, Abbas; Rotenberg, Brian W

    2018-01-01

    The use of computer simulation to develop a high-fidelity model has been proposed as a novel and cost-effective alternative to help guide therapeutic intervention in sleep apnea surgery. We describe a computer model based on the patient-specific anatomy of obstructive sleep apnea (OSA) subjects in which the percentage and sites of upper airway collapse are compared to findings on drug-induced sleep endoscopy (DISE). Basic science computer model generation. Three-dimensional finite element techniques were used for model development in a pilot study of four OSA patients. Magnetic resonance imaging was used to capture patient anatomy, and software was employed to outline critical anatomical structures. A finite-element mesh was applied to the volume enclosed by each structure. Linear and hyperelastic soft-tissue properties for various subsites (tonsils, uvula, soft palate, and tongue base) were derived from surgical specimens using an inverse finite-element technique. Each model underwent computer simulation to determine the degree of displacement of various structures within the upper airway, and these findings were compared to DISE exams performed on the four study patients. Computer simulation predictions for the percentage of airway collapse and the site of maximal collapse show agreement with the results observed on endoscopic visualization. Modeling the upper airway in OSA patients is feasible and holds promise for aiding patient-specific surgical treatment. NA. Laryngoscope, 128:277-282, 2018. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  15. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments.

    PubMed

    Thomas, Brandon R; Chylek, Lily A; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H A; Hlavacek, William S; Posner, Richard G

    2016-03-01

    Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary data are available at Bioinformatics online. bionetgen.help@gmail.com. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. Computational Experiments for Science and Engineering Education

    NASA Technical Reports Server (NTRS)

    Xie, Charles

    2011-01-01

    How to integrate simulation-based engineering and science (SBES) into the science curriculum smoothly is a challenging question. For the importance of SBES to be appreciated, the core value of simulations (that they help people understand natural phenomena and solve engineering problems) must be taught. A strategy to achieve this goal is to introduce computational experiments to the science curriculum to replace or supplement textbook illustrations and exercises and to complement or frame hands-on or wet lab experiments. In this way, students will have an opportunity to learn about SBES without compromising other learning goals required by the standards, and teachers will welcome these tools as they strengthen what they are already teaching. This paper demonstrates this idea using a number of examples in physics, chemistry, and engineering. These exemplary computational experiments show that it is possible to create a curriculum that is both deeper and wider.

  17. Using Virtual Reality To Bring Your Instruction to Life.

    ERIC Educational Resources Information Center

    Gaddis, Tony

    Prepared by the manager of a virtual reality (VR) laboratory at North Carolina's Haywood Community College, the three papers collected in this document are designed to help instructors incorporate VR into their classes. The first paper reviews the characteristics of VR, defining it as a computer-generated simulation of a three-dimensional…

  18. Evaluating the Cognitive Consequences of Playing "Portal" for a Short Duration

    ERIC Educational Resources Information Center

    Adams, Deanne M.; Pilegard, Celeste; Mayer, Richard E.

    2016-01-01

    Learning physics often requires overcoming common misconceptions based on naïve interpretations of observations in the everyday world. One proposed way to help learners build appropriate physics intuitions is to expose them to computer simulations in which motion is based on Newtonian principles. In addition, playing video games that require…

  19. Development of Polarized UV Raman and Infrared Emission/Absorption Spectroscopy for Rocket Engine Applications

    NASA Technical Reports Server (NTRS)

    Osborne, Robin; Wehrmeyer, Joseph; Farmer, Richard; Trinh, Huu; Dobson, Chris; Eskridge, Richard; Cramer, John; Hartfield, Roy; Turner, Jim (Technical Monitor)

    2001-01-01

    The objective of this project is to provide measurements of species concentrations and temperature for hot-fire test articles at Test Stand 115 at NASA Marshall Space Flight Center. These measurements can be useful for comparison with computational fluid dynamics simulations and can help to evaluate combustion performance.

  20. Teaching the Meaning of Statistical Techniques with Microcomputer Simulation.

    ERIC Educational Resources Information Center

    Lee, Motoko Y.; And Others

    Students in an introductory statistics course are often preoccupied with learning the computational routines of specific summary statistics and thereby fail to develop an understanding of the meaning of those statistics or their conceptual basis. To help students develop a better understanding of the meaning of three frequently used statistics,…

  1. Blackbody Radiation from an Incandescent Lamp

    ERIC Educational Resources Information Center

    Ribeiro, C. I.

    2014-01-01

    In this article we propose an activity aimed at introductory students to help them understand the Stefan-Boltzmann and Wien's displacement laws. It only requires simple materials that are available at any school: an incandescent lamp, a variable dc energy supply, and a computer to run an interactive simulation of the blackbody spectrum.…
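
    Both laws targeted by the activity are easy to check numerically from Planck's law. The following sketch (not part of the published activity) locates the spectral peak and compares it with Wien's displacement law, then integrates the spectrum and compares with the Stefan-Boltzmann law, using an illustrative filament temperature.

```python
import numpy as np

h, c, kB = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

def planck(lam, T):
    """Blackbody spectral radiance at wavelength lam (m), temperature T (K)."""
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))

T = 2800.0                                  # typical incandescent filament
lam = np.linspace(100e-9, 100e-6, 200_000)
B = planck(lam, T)

# Wien's displacement law: lam_max * T = 2.898e-3 m K
lam_peak = lam[np.argmax(B)]
print(f"numerical peak {lam_peak * 1e9:.0f} nm, "
      f"Wien prediction {2.898e-3 / T * 1e9:.0f} nm")

# Stefan-Boltzmann law: pi * integral of B over lambda = sigma * T^4
sigma = 5.670e-8
total = np.pi * np.sum(0.5 * (B[1:] + B[:-1]) * np.diff(lam))
print(f"integrated exitance {total:.3e} W/m^2 vs sigma*T^4 = "
      f"{sigma * T**4:.3e} W/m^2")
```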

  2. Argonne News Brief: Making Sense of Noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The Argonne Leadership Computing Facility at Argonne National Laboratory helped Joe Nichols, of the University of Minnesota, to create high fidelity simulations of jet turbulence to determine how and where noise is produced. The results may lead to novel engineering designs that reduce noise over commercial flight paths and on aircraft carrier decks.

  3. Inquiry Style Interactive Virtual Experiments: A Case on Circular Motion

    ERIC Educational Resources Information Center

    Zhou, Shaona; Han, Jing; Pelz, Nathaniel; Wang, Xiaojun; Peng, Liangyu; Xiao, Hua; Bao, Lei

    2011-01-01

    Interest in computer-based learning, especially in the use of virtual reality simulations is increasing rapidly. While there are good reasons to believe that technologies have the potential to improve teaching and learning, how to utilize the technology effectively in teaching specific content difficulties is challenging. To help students develop…

  4. Planning and simulation of medical robot tasks.

    PubMed

    Raczkowsky, J; Bohner, P; Burghart, C; Grabowski, H

    1998-01-01

    Complex techniques for planning and performing surgery are revolutionizing medical interventions. Formerly, preoperative planning of interventions usually took place in the surgeon's mind. Today's computer techniques allow the surgeon to discuss various operation methods for a patient and to visualize them three-dimensionally. The use of computer-assisted surgical planning helps achieve better treatment results and supports the surgeon before and during the surgical intervention. In this paper we present our planning and simulation system for operations in maxillo-facial surgery. All phases of a surgical intervention are supported. Chapter 1 gives a description of the medical motivation for our planning system and its environment. In Chapter 2 the basic components are presented. The planning system is depicted in Chapter 3, and a simulation of a robot-assisted surgery can be found in Chapter 4. Chapter 5 concludes the paper and gives a survey of our future work.

  5. Multiple-body simulation with emphasis on integrated Space Shuttle vehicle

    NASA Technical Reports Server (NTRS)

    Chiu, Ing-Tsau

    1993-01-01

    The program to obtain intergrid communications, Pegasus, was enhanced to make better use of computing resources. Periodic block-tridiagonal and pentadiagonal routines in OVERFLOW were modified to use a better algorithm to speed up the calculation for grids with periodic boundary conditions. Several programs were added to the collar grid tools, and a user-friendly shell script was developed to help users generate collar grids. The user interface for HYPGEN was modified to cope with the changes in HYPGEN. ET/SRB attach hardware grids were added to the computational model for the space shuttle and are currently incorporated into the refined shuttle model jointly developed at Johnson Space Center and Ames Research Center. Flow simulation for the integrated space shuttle vehicle at flight Reynolds number was carried out and compared with flight data as well as the earlier simulation at wind tunnel Reynolds number.

  6. Controlling the error on target motion through real-time mesh adaptation: Applications to deep brain stimulation.

    PubMed

    Bui, Huu Phuoc; Tomar, Satyendra; Courtecuisse, Hadrien; Audette, Michel; Cotin, Stéphane; Bordas, Stéphane P A

    2018-05-01

    An error-controlled mesh refinement procedure for needle insertion simulations is presented. As an example, the procedure is applied to simulations of electrode implantation for deep brain stimulation. We take into account the brain shift phenomena occurring when a craniotomy is performed. We observe that the error in the computation of the displacement and stress fields is localised around the needle tip and the needle shaft during needle insertion simulation. By suitably and adaptively refining the mesh in this region, our approach enables us to control, and thus to reduce, the error whilst maintaining a coarser mesh in other parts of the domain. Through academic and practical examples we demonstrate that our adaptive approach, compared with a uniform coarse mesh, increases the accuracy of the displacement and stress fields around the needle shaft and, for a given accuracy, saves computational time with respect to a uniformly finer mesh. This facilitates real-time simulations. The proposed methodology has direct implications for increasing the accuracy, and controlling the computational expense, of the simulation of percutaneous procedures such as biopsy, brachytherapy, regional anaesthesia, or cryotherapy. Moreover, the proposed approach can be helpful in the development of robotic surgeries because the simulation taking place in the control loop of a robot needs to be accurate, and to occur in real time. Copyright © 2018 John Wiley & Sons, Ltd.

  7. Stochastic optimization of GeantV code by use of genetic algorithms

    DOE PAGES

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; ...

    2017-10-01

    GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter and a geometrical modeler library for describing the detector and locating the particles and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching the optimum set of parameters using only point-wise function evaluations. Here, the goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in case of resource expensive or time consuming evaluations of fitness functions, in order to speed-up the convergence of the black-box optimization problem.
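
    Stripped of the physics, the tuning procedure described above is black-box minimization by sampling. The sketch below shows a minimal evolution-strategy loop with a toy quadratic objective standing in for the expensive simulation-throughput measurement; it illustrates the idea only and is not the GeantV tuning code.

```python
import numpy as np

def evolution_strategy(fitness, x0, sigma=0.5, pop=32, iters=200, seed=0):
    """Minimal (mu, lambda)-style evolution strategy: sample a population
    around the current mean, keep the best half, move the mean to the
    elite average, and slowly shrink the step size."""
    rng = np.random.default_rng(seed)
    mean = np.array(x0, dtype=float)
    for _ in range(iters):
        cand = mean + sigma * rng.normal(size=(pop, mean.size))
        scores = np.array([fitness(c) for c in cand])   # point-wise only
        elite = cand[np.argsort(scores)[: pop // 2]]
        mean = elite.mean(axis=0)
        sigma *= 0.98                                   # step-size decay
    return mean, fitness(mean)

# toy stand-in for a simulation-throughput objective, optimum at (3, -2, 1)
target = np.array([3.0, -2.0, 1.0])
objective = lambda x: float(np.sum((x - target) ** 2))
best, score = evolution_strategy(objective, x0=[0.0, 0.0, 0.0])
print(best.round(3), f"score = {score:.2e}")
```

    The key property matching the abstract is that only point-wise fitness evaluations are used; no gradients of the simulation are ever required.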

  8. Stochastic optimization of GeantV code by use of genetic algorithms

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Behera, S. P.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Hariri, F.; Jun, S. Y.; Konstantinov, D.; Kumawat, H.; Ivantchenko, V.; Lima, G.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2017-10-01

    GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter and a geometrical modeler library for describing the detector and locating the particles and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching the optimum set of parameters using only point-wise function evaluations. The goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in case of resource expensive or time consuming evaluations of fitness functions, in order to speed-up the convergence of the black-box optimization problem.

  9. Stochastic optimization of GeantV code by use of genetic algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.

    GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter and a geometrical modeler library for describing the detector and locating the particles and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching the optimum set of parameters using only point-wise function evaluations. Here, the goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in case of resource expensive or time consuming evaluations of fitness functions, in order to speed-up the convergence of the black-box optimization problem.

  10. Mathematical model of a rotational bioreactor for the dynamic cultivation of scaffold-adhered human mesenchymal stem cells for bone regeneration

    NASA Astrophysics Data System (ADS)

    Ganimedov, V. L.; Papaeva, E. O.; Maslov, N. A.; Larionov, P. M.

    2017-09-01

    Development of cell-mediated scaffold technologies for the treatment of critical bone defects is very important for reparative bone regeneration. Today the properties of bioreactors for cell-seeded scaffold cultivation are the subject of intensive research. We used mathematical modeling of a rotational bioreactor and constructed a computational algorithm with the help of the ANSYS software package to develop this new procedure. The solution obtained with the constructed computational algorithm is in good agreement with Couette's analytical solution for the flow between two coaxial cylinders. A series of flow computations for different rotation frequencies (1, 0.75, 0.5, 0.33, 1.125 Hz) was performed in the laminar flow approximation. It was found that Taylor vortices appear in the annular gap between the cylinders in the simulated bioreactor. Shear stresses in the range of interest (0.002-0.1 Pa) arise on the outer surface of the inner cylinder when it rotates with a frequency not exceeding 0.8 Hz. The constructed mathematical model and the computational algorithm for calculating the flow parameters thus allow the shear stress and pressure to be predicted as functions of the rotation frequency and geometric parameters, and the operating mode of the bioreactor to be optimized.
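
    The Couette benchmark mentioned above has a closed-form solution, which makes it a convenient sanity check for any annular-flow solver. The sketch below evaluates the textbook profile and wall shear stress; the radii and viscosity are illustrative assumptions (the paper's actual bioreactor geometry is not given here), chosen so that the stresses land in the quoted 0.002-0.1 Pa range.

```python
import numpy as np

def couette_annulus(r, r_in, r_out, omega_in, omega_out=0.0, mu=1.0e-3):
    """Analytic Couette solution between coaxial cylinders: azimuthal
    velocity u(r) = A*r + B/r and shear stress tau = -2*mu*B/r^2
    (from tau_{r,theta} = mu * r * d/dr(u/r))."""
    A = (omega_out * r_out**2 - omega_in * r_in**2) / (r_out**2 - r_in**2)
    B = (omega_in - omega_out) * r_in**2 * r_out**2 / (r_out**2 - r_in**2)
    u = A * r + B / r
    tau = -2.0 * mu * B / r**2
    return u, tau

# assumed dimensions: 10 mm inner cylinder, 15 mm outer, water-like medium
r_in, r_out = 0.010, 0.015
for f in (0.33, 0.5, 0.75, 0.8, 1.0):     # rotation frequency, Hz
    _, tau = couette_annulus(np.array([r_in]), r_in, r_out,
                             omega_in=2.0 * np.pi * f)
    print(f"f = {f:4.2f} Hz -> |tau| at inner wall = {abs(tau[0]):.4f} Pa")
```

    With these assumed dimensions the inner-wall stress stays within the quoted range for frequencies up to about 1 Hz, consistent with the trend reported above.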

  11. Theory and Simulation of Multicomponent Osmotic Systems

    PubMed Central

    Karunaweera, Sadish; Gee, Moon Bae; Weerasinghe, Samantha; Smith, Paul E.

    2012-01-01

    Most cellular processes occur in systems containing a variety of components many of which are open to material exchange. However, computer simulations of biological systems are almost exclusively performed in systems closed to material exchange. In principle, the behavior of biomolecules in open and closed systems will be different. Here, we provide a rigorous framework for the analysis of experimental and simulation data concerning open and closed multicomponent systems using the Kirkwood-Buff (KB) theory of solutions. The results are illustrated using computer simulations for various concentrations of the solutes Gly, Gly2 and Gly3 in both open and closed systems, and in the absence or presence of NaCl as a cosolvent. In addition, KB theory is used to help rationalize the aggregation properties of the solutes. Here one observes that the picture of solute association described by the KB integrals, which are directly related to the solution thermodynamics, and that provided by more physical clustering approaches are different. It is argued that the combination of KB theory and simulation data provides a simple and powerful tool for the analysis of complex multicomponent open and closed systems. PMID:23329894
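
    The central quantity in the analysis above is the Kirkwood-Buff integral of a radial distribution function. The sketch below computes G_ij by direct numerical integration; the RDF is synthetic (a damped oscillation standing in for real simulation output), since the paper's data are not reproduced here.

```python
import numpy as np

def kirkwood_buff_integral(r, g, r_max=None):
    """G_ij = 4*pi * int_0^R (g_ij(r) - 1) r^2 dr from an RDF g(r) on
    grid r. In closed simulations the integral is truncated at r_max,
    where g(r) has converged to 1."""
    if r_max is not None:
        mask = r <= r_max
        r, g = r[mask], g[mask]
    integrand = (g - 1.0) * r**2
    return 4.0 * np.pi * np.sum(0.5 * (integrand[1:] + integrand[:-1])
                                * np.diff(r))

# synthetic solute-solute RDF: excluded core, then a damped oscillation
r = np.linspace(0.01, 3.0, 600)                  # nm
g = 1.0 + np.exp(-(r - 0.35) / 0.3) * np.cos(8.0 * (r - 0.35)) * (r > 0.3)
G = kirkwood_buff_integral(r, g, r_max=2.0)      # nm^3
print(f"G_ij ~ {G:.4f} nm^3 (positive -> net solute-solute affinity)")
```

    A positive G_ij signals a net excess of one species around the other, which is the thermodynamically grounded notion of association that the paper contrasts with more geometric clustering criteria.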

  12. GPU-accelerated computation of electron transfer.

    PubMed

    Höfinger, Siegfried; Acocella, Angela; Pop, Sergiu C; Narumi, Tetsu; Yasuoka, Kenji; Beu, Titus; Zerbetto, Francesco

    2012-11-05

    Electron transfer is a fundamental process that can be studied with the help of computer simulation. The underlying quantum mechanical description renders the problem a computationally intensive application. In this study, we probe the graphics processing unit (GPU) for suitability to this type of problem. Time-critical components are identified via profiling of an existing implementation and several different variants are tested involving the GPU at increasing levels of abstraction. A publicly available library supporting basic linear algebra operations on the GPU turns out to accelerate the computation approximately 50-fold with minor dependence on actual problem size. The performance gain does not compromise numerical accuracy and is of significant value for practical purposes. Copyright © 2012 Wiley Periodicals, Inc.
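
    The 2012 study offloaded dense linear algebra to a GPU library of its day; as a present-day stand-in, the sketch below shows the same pattern with CuPy (a NumPy-like GPU array library). It requires a CUDA-capable GPU and illustrates the approach only, not the paper's code.

```python
import time
import numpy as np
import cupy as cp                      # NumPy-like GPU array library

n = 4096
a = np.random.rand(n, n).astype(np.float32)

t0 = time.perf_counter()               # CPU reference
c_cpu = a @ a
t_cpu = time.perf_counter() - t0

a_gpu = cp.asarray(a)                  # copy to device
cp.cuda.Stream.null.synchronize()
t0 = time.perf_counter()
c_gpu = a_gpu @ a_gpu                  # cuBLAS under the hood
cp.cuda.Stream.null.synchronize()      # wait for the kernel to finish
t_gpu = time.perf_counter() - t0

diff = float(cp.abs(cp.asarray(c_cpu) - c_gpu).max())
print(f"CPU {t_cpu:.3f} s, GPU {t_gpu:.3f} s, max |diff| = {diff:.2e}")
```

    As in the study above, the check on the final line matters: the speedup is only useful if it does not compromise numerical accuracy.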

  13. Computer simulation models as tools for identifying research needs: A black duck population model

    USGS Publications Warehouse

    Ringelman, J.K.; Longcore, J.R.

    1980-01-01

    Existing data on the mortality and production rates of the black duck (Anas rubripes) were used to construct a WATFIV computer simulation model. The yearly cycle was divided into 8 phases: hunting, wintering, reproductive, molt, post-molt, and juvenile dispersal mortality, and production from original and renesting attempts. The program computes population changes for sex and age classes during each phase. After completion of a standard simulation run with all variable default values in effect, a sensitivity analysis was conducted by changing each of 50 input variables, 1 at a time, to assess the responsiveness of the model to changes in each variable. Thirteen variables resulted in a substantial change in population level. Adult mortality factors were important during hunting and wintering phases. All production and mortality associated with original nesting attempts were sensitive, as was juvenile dispersal mortality. By identifying those factors which invoke the greatest population change, and providing an indication of the accuracy required in estimating these factors, the model helps to identify those variables which would be most profitable topics for future research.
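
    To make the structure of such a phase-based model concrete, here is a deliberately simplified sketch with invented parameter values (the paper's rates and full eight-phase detail are not reproduced): survival is applied phase by phase, a production step adds recruits, and a one-at-a-time perturbation mimics the sensitivity analysis described above.

```python
import numpy as np

# Invented, illustrative rates; not the paper's estimates.
base = {
    "hunting_surv": 0.80,
    "winter_surv": 0.90,
    "breeding_surv": 0.95,
    "molt_surv": 0.97,
    "dispersal_surv": 0.85,    # juveniles only
    "young_per_adult": 1.10,   # recruits fledged per adult
}

def simulate(params, n0=1000.0, years=10):
    """Project population size through a simplified annual cycle."""
    n = n0
    for _ in range(years):
        n *= params["hunting_surv"] * params["winter_surv"]
        young = n * params["young_per_adult"] * params["dispersal_surv"]
        n = n * params["breeding_surv"] * params["molt_surv"] + young
    return n

# one-at-a-time sensitivity analysis: perturb each input by +10%
ref = simulate(base)
for key in base:
    p = dict(base)
    p[key] *= 1.10
    if key.endswith("_surv"):
        p[key] = min(p[key], 1.0)      # survival rates cannot exceed 1
    change = 100.0 * (simulate(p) - ref) / ref
    print(f"{key:>16}: {change:+7.1f}% change in final population")
```

    Ranking the outputs of such a loop is exactly how the model points to the variables most worth refining with field research.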

  14. Computational Modeling and Simulations of Bioparticle Internalization Through Clathrin-mediated Endocytosis

    NASA Astrophysics Data System (ADS)

    Deng, Hua; Dutta, Prashanta; Liu, Jin

    2016-11-01

    Clathrin-mediated endocytosis (CME) is one of the most important endocytic pathways for the internalization of bioparticles at the lipid membrane of cells, and it plays crucial roles in the fundamental understanding of viral infections and of intracellular/transcellular targeted drug delivery. During CME, the highly dynamic clathrin-coated pit (CCP), formed by the growth of ordered clathrin lattices, is the key scaffolding component that drives the deformation of the plasma membrane. Experimental studies have shown that the CCP alone can provide sufficient membrane curvature to facilitate membrane invagination. However, currently there is no computational model that couples cargo-receptor binding with the membrane invagination process, nor are there simulations of the dynamic growth process of the CCP. We develop a stochastic computational model of clathrin-mediated endocytosis based on Metropolis Monte Carlo simulations. In our model, the energetic costs of bending the membrane and the CCP are linked with antigen-antibody interactions. The assembly of clathrin lattices is a dynamic process that correlates with antigen-antibody bond formation. This model helps study membrane deformation and the effects of the CCP during the internalization of functionalized bioparticles through CME. This work is supported by NSF Grants CBET-1250107 and CBET-1604211.
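
    The engine of the model described above is the Metropolis acceptance rule. The sketch below shows that rule in isolation, with a toy one-dimensional double-well energy standing in for the model's membrane-bending and bond-formation energetics, which are not reproduced here.

```python
import numpy as np

def metropolis_step(state, energy, propose, beta, rng):
    """Accept a proposed move with probability min(1, exp(-beta * dE))."""
    new = propose(state, rng)
    dE = energy(new) - energy(state)
    if dE <= 0.0 or rng.random() < np.exp(-beta * dE):
        return new
    return state

# toy double-well energy standing in for the membrane/CCP energetics
energy = lambda x: (x**2 - 1.0) ** 2
propose = lambda x, rng: x + 0.3 * rng.normal()

rng = np.random.default_rng(7)
x, samples = 2.0, []
for _ in range(20_000):
    x = metropolis_step(x, energy, propose, beta=3.0, rng=rng)
    samples.append(x)
print(f"mean |x| ~ {np.mean(np.abs(samples)):.2f} (energy wells at x = ±1)")
```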

  15. Study of Wind Effects on Unique Buildings

    NASA Astrophysics Data System (ADS)

    Olenkov, V.; Puzyrev, P.

    2017-11-01

    The article deals with a numerical simulation of wind effects on the building of the Church of the Intercession of the Holy Virgin in the village of Bulzi, Chelyabinsk region. We present a calculation algorithm and the resulting pressure fields, velocity fields, fields of kinetic energy of the wind stream, and streamlines. Computational fluid dynamics (CFD) evolved three decades ago at the interface of computational mathematics and theoretical hydromechanics and has become a separate branch of science whose subject is the numerical simulation of different fluid and gas flows and the solution of the arising problems with methods that involve computer systems. This scientific field, which is of great practical value, is developing intensively. The growth in CFD calculations is driven by the improvement of computer technologies and the creation of multipurpose, easy-to-use CFD packages that are available to a wide group of researchers and cope with various tasks. Such programs are not only competitive with physical experiments but sometimes provide the only opportunity to answer the research questions. The following advantages of computer simulation can be pointed out: (a) reduction in the time spent on the design and development of a model in comparison with a real experiment (variation of boundary conditions); (b) a numerical experiment allows the simulation of conditions that are not reproducible in environmental tests (use of an ideal gas as the medium); (c) computational gas dynamics methods provide the researcher with the complete and ample information necessary to fully describe the different processes of the experiment; (d) the economic efficiency of computer calculations is more attractive than that of an experiment; (e) a computational model can be modified, which ensures efficient timing (changing the size of wall-layer cells in accordance with the chosen turbulence model).

  16. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  17. Towards an integrative computational model for simulating tumor growth and response to radiation therapy

    NASA Astrophysics Data System (ADS)

    Marrero, Carlos Sosa; Aubert, Vivien; Ciferri, Nicolas; Hernández, Alfredo; de Crevoisier, Renaud; Acosta, Oscar

    2017-11-01

    Understanding the response to irradiation in cancer radiotherapy (RT) may help devise new strategies with improved tumor local control. Computational models may allow us to unravel the underlying radiosensitivity mechanisms intervening in the dose-response relationship. Through extensive simulations, a wide range of parameters may be evaluated, providing insights into tumor response and generating useful data to plan modified treatments. We propose in this paper a computational model of tumor growth and radiation response which allows a whole RT protocol to be simulated. Proliferation of tumor cells, the cell life-cycle, oxygen diffusion, radiosensitivity, RT response, and resorption of killed cells were implemented in a multiscale framework. The model was developed in C++, using the Multi-formalism Modeling and Simulation Library (M2SL). Radiosensitivity parameters extracted from the literature enabled us to simulate prostate cell tissue on a regular (voxel-wise) grid. Histopathological specimens with different aggressiveness levels, extracted from patients after prostatectomy, were used to initialize the in silico simulations. Results on tumor growth exhibit good agreement with data from in vitro studies. Moreover, a standard fractionation of 2 Gy/fraction with a total dose of 80 Gy, as in a real RT treatment, was applied with varying radiosensitivity and oxygen diffusion parameters. As expected, the high influence of these parameters was observed by measuring the percentage of surviving tumor cells after RT. This work paves the way to further models allowing the simulation of increased doses in modified hypofractionated schemes and the development of new patient-specific combined therapies.
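
    The abstract does not state its cell-kill law, but the linear-quadratic (LQ) model is the conventional reference for the fractionation scheme it mentions. The sketch below computes surviving fraction under that assumption with illustrative prostate-like parameters; both the model choice and the numbers are assumptions for the example.

```python
import numpy as np

def lq_survival(alpha, beta, dose_per_fraction, n_fractions):
    """Linear-quadratic surviving fraction after n fractions of d Gy:
    S = exp(-n * (alpha*d + beta*d^2))."""
    d = dose_per_fraction
    return np.exp(-n_fractions * (alpha * d + beta * d**2))

# illustrative prostate-like radiosensitivity (alpha/beta = 3 Gy)
alpha, beta = 0.15, 0.05               # Gy^-1, Gy^-2

conventional = lq_survival(alpha, beta, 2.0, 40)   # 80 Gy in 2 Gy fractions
hypofx = lq_survival(alpha, beta, 3.0, 20)         # 60 Gy in 3 Gy fractions
print(f"surviving fraction, 40 x 2 Gy: {conventional:.2e}")
print(f"surviving fraction, 20 x 3 Gy: {hypofx:.2e}")
```

    For a low alpha/beta tissue the hypofractionated scheme reaches a comparable cell kill at a lower total dose, which is the kind of trade-off the simulations above are built to explore mechanistically.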

  18. The importance of employing computational resources for the automation of drug discovery.

    PubMed

    Rosales-Hernández, Martha Cecilia; Correa-Basurto, José

    2015-03-01

    The application of computational tools to drug discovery helps researchers to design and evaluate new drugs swiftly and with reduced economic resources. To discover new potential drugs, computational chemistry incorporates automatization for obtaining biological data such as absorption, distribution, metabolism, excretion and toxicity (ADMET), as well as drug mechanisms of action. This editorial looks at examples of these computational tools, including docking, molecular dynamics simulation, virtual screening, quantum chemistry, quantitative structure-activity relationships, principal component analysis and drug screening workflow systems. The authors then provide their perspectives on the importance of these techniques for drug discovery. Computational tools help researchers to design and discover new drugs for the treatment of several human diseases without side effects, allowing for the evaluation of millions of compounds at a reduced cost in both time and economic resources. The problem is that operating each program is difficult; one is required to use several programs and understand each of the properties being tested. In the future, it is possible that a single computer and software program will be capable of evaluating the complete properties (mechanisms of action and ADMET properties) of ligands. It is also possible that, after submitting one target, this software will be capable of suggesting potential compounds along with ways to synthesize them, and of presenting biological models for testing.

  19. Following the Ions through a Mass Spectrometer with Atmospheric Pressure Interface: Simulation of Complete Ion Trajectories from Ion Source to Mass Analyzer.

    PubMed

    Zhou, Xiaoyu; Ouyang, Zheng

    2016-07-19

    Ion trajectory simulation is an important and useful tool in instrumentation development for mass spectrometry. Accurate simulation of ion motion through a mass spectrometer with an atmospheric pressure ionization source has been extremely challenging, due to the complexity of the gas hydrodynamic flow field across a wide pressure range as well as the computational burden. In this study, we developed a method of generating the gas flow field for an entire mass spectrometer with an atmospheric pressure interface. In combination with the electric force, this has for the first time enabled simulation of ion trajectories from an atmospheric pressure ion source to a mass analyzer in vacuum. A stage-by-stage ion repopulation method has also been implemented for the simulation, which helps avoid an intolerable computational burden in the high-pressure regions while allowing statistically meaningful results to be obtained for the mass analyzer. The method also proved suitable for identifying a joint point at which the individually solved high- and low-pressure fields can be combined. Experimental characterization was also performed to validate the new simulation method. Good agreement was obtained between simulated and experimental results for ion transfer through an atmospheric pressure interface with a curtain gas.

  20. Simulating the x-ray image contrast to setup techniques with desired flaw detectability

    NASA Astrophysics Data System (ADS)

    Koshti, Ajay M.

    2015-04-01

    The paper provides simulation data extending previous work by the author on a model for estimating the detectability of crack-like flaws in radiography. The methodology was developed to help implement the NASA Special x-ray radiography qualification, but is generically applicable to radiography. The paper describes a method for characterizing the detector resolution, and the applicability of ASTM E 2737 resolution requirements to the model is also discussed. A model for simulating the detector resolution is described, and a computer calculator application, discussed here, performs predicted contrast and signal-to-noise ratio calculations. Results of various simulation runs calculating the x-ray flaw size parameter and image contrast for varying input parameters, such as crack depth, crack width, part thickness, x-ray angle, part-to-detector distance, part-to-source distance, source size, and detector sensitivity and resolution, are given as 3D surfaces. These results demonstrate the effect of the input parameters on the flaw size parameter and on the simulated image contrast of the crack, and they demonstrate the utility of the flaw size parameter model in setting up x-ray techniques that provide the desired flaw detectability. The method is applicable to film radiography, computed radiography, and digital radiography.
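
    The paper's contrast model is not reproduced in the abstract; as a hedged stand-in, the subject contrast of a crack can be estimated from Beer-Lambert attenuation, treating the crack as an air gap that locally removes attenuating material along the beam. Detector blur and noise, which the author's calculator also accounts for, are ignored here; all values are hypothetical.

        import numpy as np

        def subject_contrast(mu, part_thickness, crack_depth):
            """Beer-Lambert estimate: a crack of the given depth along the
            beam raises the local transmitted intensity."""
            i_bg = np.exp(-mu * part_thickness)  # background signal
            i_crack = np.exp(-mu * (part_thickness - crack_depth))
            return (i_crack - i_bg) / i_bg  # relative subject contrast

        mu = 0.4  # 1/mm, hypothetical effective attenuation coefficient
        for depth_mm in (0.1, 0.5, 1.0):
            c = subject_contrast(mu, part_thickness=10.0, crack_depth=depth_mm)
            print(f"crack depth {depth_mm} mm: contrast = {c:.3f}")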

  1. The Influence of Complexity and Uncertainty on Self-Directed Team Learning

    ERIC Educational Resources Information Center

    Gray, David

    2012-01-01

    To help increase the effectiveness of self-directed teams, this paper studies the attitudes and behaviour of self-directed team members during the course of a computer-simulated marketing strategy game. Self-directed teams are used widely throughout organisations yet receive little scrutiny when they undertake a task which is subject to conditions…

  2. METCAN demonstration manual, version 1.0

    NASA Technical Reports Server (NTRS)

    Lee, H.-J.; Murthy, P. L. N.

    1992-01-01

    The various features of the Metal Matrix Composite Analyzer (METCAN) computer program, which simulates the high-temperature nonlinear behavior of continuous fiber reinforced metal matrix composites, are demonstrated. Different problems are used to demonstrate the capabilities of METCAN for both static and cyclic analyses. A complete description of the METCAN output file is also included to help interpret the results.

  3. Learning to Measure Biodiversity: Two Agent-Based Models that Simulate Sampling Methods & Provide Data for Calculating Diversity Indices

    ERIC Educational Resources Information Center

    Jones, Thomas; Laughlin, Thomas

    2009-01-01

    Nothing could be more effective than a wilderness experience to demonstrate the importance of conserving biodiversity. When that is not possible, though, there are computer models with several features that are helpful in understanding how biodiversity is measured. These models are easily used when natural resources, transportation, and time…

  4. Understanding the Theory and Practice of Molecular Spectroscopy: The Effects of Spectral Bandwidth

    ERIC Educational Resources Information Center

    Hirayama, Satoshi; Steer, Ronald P.

    2010-01-01

    The near-UV spectrum of benzene is used to illustrate the effects of variations in instrument spectral bandwidth on absorbance and molar absorptivity measurements and on the independence of values of quantities such as the oscillator strength that are based on integrated absorptivity. Excel-based computer simulations are provided that help develop…

  5. Reinforcing Concepts of Transient Heat Conduction and Convection with Simple Experiments and COMSOL Simulations

    ERIC Educational Resources Information Center

    Mendez, Sergio; AungYong, Lisa

    2014-01-01

    To help students connect the concepts of heat conduction and convection to real-world phenomena, we developed a combined experimental and computational module that can be incorporated into lecture or lab courses. The experimental system we present requires materials and apparatus that are readily accessible, and the procedure…

  6. Analysis of MD5 authentication in various routing protocols using simulation tools

    NASA Astrophysics Data System (ADS)

    Dinakaran, M.; Darshan, K. N.; Patel, Harsh

    2017-11-01

    Authentication is an important paradigm of security: computer networks require secure paths so that the flow of data through them is protected by security protocols. MD-5 (Message Digest 5) helps provide data integrity for the data sent through it and authentication for the network devices involved. This paper gives a brief introduction to MD-5 and simulates networks that include MD-5 authentication under various routing protocols, namely OSPF, EIGRP and RIPv2. GNS3 is used to simulate the scenarios. An analysis of the MD-5 authentication is given in the later sections of the paper.
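
    For background, routing protocols such as OSPFv2 and RIPv2 use keyed MD5: the digest is computed over the packet with a shared secret appended, and the receiver recomputes it to verify integrity and origin. A minimal Python sketch of that idea (illustrative only; it does not reproduce any protocol's exact wire format):

        import hashlib

        def md5_auth_digest(packet: bytes, key: bytes) -> bytes:
            """Keyed MD5 in the style of OSPFv2/RIPv2 authentication:
            hash the packet with the shared secret appended."""
            return hashlib.md5(packet + key).digest()

        key = b"s3cret-key-0000!"                      # shared secret (invented)
        packet = b"\x02\x01\x00\x30routing-update-payload"
        digest = md5_auth_digest(packet, key)

        # Receiver side: recompute and compare to detect tampering.
        assert md5_auth_digest(packet, key) == digest
        tampered = packet.replace(b"payload", b"PAYLOAD")
        print("tampered packet accepted?",
              md5_auth_digest(tampered, key) == digest)   # prints False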

  7. A 3-D Approach for Teaching and Learning about Surface Water Systems through Computational Thinking, Data Visualization and Physical Models

    NASA Astrophysics Data System (ADS)

    Caplan, B.; Morrison, A.; Moore, J. C.; Berkowitz, A. R.

    2017-12-01

    Understanding water is central to understanding environmental challenges. Scientists use "big data" and computational models to develop knowledge about the structure and function of complex systems, and to make predictions about changes in climate, weather, hydrology, and ecology. Large environmental systems-related data sets and simulation models are difficult for high school teachers and students to access and make sense of. Comp Hydro, a collaboration across four states and multiple school districts, integrates computational thinking and data-related science practices into water systems instruction to enhance development of scientific model-based reasoning, through curriculum, assessment and teacher professional development. Comp Hydro addresses the need for 1) teaching materials for using data and physical models of hydrological phenomena, 2) building teachers' and students' comfort or familiarity with data analysis and modeling, and 3) infusing the computational knowledge and practices necessary to model and visualize hydrologic processes into instruction. Comp Hydro teams in Baltimore, MD and Fort Collins, CO are integrating teaching about surface water systems into high school courses focusing on flooding (MD) and surface water reservoirs (CO). This interactive session will highlight the successes and challenges of our physical and simulation models in helping teachers and students develop proficiency with computational thinking about surface water. We also will share insights from comparing teacher-led vs. project-led development of curriculum and our simulations.

  8. Rehearsing decisions may help teenagers: an evaluation of a simulation game.

    PubMed

    Alemi, F; Cherry, F; Meffert, G

    1989-01-01

    This paper presents a new approach to preventing adolescent pregnancy. Information alone is not sufficient to prevent teenage pregnancy; the teenager's ability to choose and remain committed to a decision also needs to be developed. Because decision-making skills are best learned through practice in an environment with frequent feedback, we have developed a computer game which simulates the consequences of different sexual roles. In addition, the game is intended to increase communication about sex between teenagers and their role models (peers, teachers and/or parents). Increased communication is expected to reduce feelings of guilt and lead to either consistent abstention from sex or consistent contraceptive use. The paper reports on the development of the computer game and a preliminary evaluation of its impact.

  9. Computer Simulation in Predicting Biochemical Processes and Energy Balance at WWTPs

    NASA Astrophysics Data System (ADS)

    Drewnowski, Jakub; Zaborowska, Ewa; Hernandez De Vega, Carmen

    2018-02-01

    Nowadays, the use of mathematical models and computer simulation allows many different technological solutions to be analyzed, and various scenarios to be tested, in a short time and at low financial cost, in order to simulate typical conditions for the real system and to help find the best solution in the design or operation process. The aim of this study was to evaluate different concepts of biochemical process and energy balance modelling using the simulation platform GPS-x and the comprehensive model Mantis2. The paper presents an example of the calibration and validation process for the biological reactor, as well as scenarios showing the influence of operational parameters on the WWTP energy balance. The results of batch tests and a full-scale campaign obtained in earlier work were used to predict biochemical and operational parameters in a newly developed plant model. The model was extended with sludge treatment devices, including an anaerobic digester. Primary sludge removal efficiency was found to be a significant factor determining biogas production and hence renewable energy production in cogeneration. Water and wastewater utilities, which run and control WWTPs, are interested in optimizing the process in order to protect the environment, save their budget and decrease pollutant emissions to water and air. In this context, computer simulation can be the easiest and most useful tool to improve efficiency without interfering with the actual process performance.

  10. Model input and output files for the simulation of time of arrival of landfill leachate at the water table, Municipal Solid Waste Landfill Facility, U.S. Army Air Defense Artillery Center and Fort Bliss, El Paso County, Texas

    USGS Publications Warehouse

    Abeyta, Cynthia G.; Frenzel, Peter F.

    1999-01-01

    This report contains listings of model input and output files for the simulation of the time of arrival of landfill leachate at the water table from the Municipal Solid Waste Landfill Facility (MSWLF), about 10 miles northeast of downtown El Paso, Texas. The simulation was done by the U.S. Geological Survey in cooperation with the U.S. Department of the Army, U.S. Army Air Defense Artillery Center and Fort Bliss, El Paso, Texas. The U.S. Environmental Protection Agency's Hydrologic Evaluation of Landfill Performance (HELP) and Multimedia Exposure Assessment (MULTIMED) computer models were used to simulate the production of leachate by a landfill and the transport of that leachate to the water table. The model input data files used with the HELP and MULTIMED models, and the output files they generated, are provided in ASCII format on a 3.5-inch 1.44-megabyte IBM-PC compatible floppy disk.

  11. [Application of fluid mechanics and simulation: urinary tract and ureteral catheters].

    PubMed

    Gómez-Blanco, J C; Martínez-Reina, J; Cruz, D; Blas Pagador, J; Sánchez-Margallo, F M; Soria, F

    2016-10-01

    The mechanics of urine during its transport from the renal pelvis to the bladder is of great interest to urologists. Knowledge of the different physical variables and their interrelationships, both in physiological movements and in pathologies, will support better diagnosis and treatment. The objective of this chapter is to present the physical principles and their most relevant basic relations in urine transport, and to bring them closer to the clinical world. To that end, we explain the movement of urine during peristalsis, during ureteral obstruction, and in a ureter with a stent. The explanation is based on two tools used in bioengineering: theoretical analysis through the theory of continuous media and fluid mechanics, and computational simulation, which offers a practical solution for each scenario. Moreover, we review other contributions of bioengineering to the field of urology, such as physical simulation and additive and subtractive manufacturing techniques. Finally, we list the current limitations of these tools and the technological development lines with the most future projection. In this chapter we aim to help urologists understand some important concepts of bioengineering, promoting multidisciplinary cooperation to offer complementary tools that help in the diagnosis and treatment of disease.

  12. Markov Jump-Linear Performance Models for Recoverable Flight Control Computers

    NASA Technical Reports Server (NTRS)

    Zhang, Hong; Gray, W. Steven; Gonzalez, Oscar R.

    2004-01-01

    Single event upsets in digital flight control hardware induced by atmospheric neutrons can reduce system performance and possibly introduce a safety hazard. One method currently under investigation to help mitigate the effects of these upsets is NASA Langley's Recoverable Computer System. In this paper, a Markov jump-linear model is developed for a recoverable flight control system, to be validated using data from future experiments with simulated and real neutron environments. The method of tracking-error analysis and the plan for the experiments are also described.
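
    As a hedged illustration of the modeling idea (not the paper's validated model), a Markov jump-linear system switches its state-transition matrix according to a Markov chain, for example between a nominal mode and a recovery mode following an upset. A minimal Python sketch with invented dynamics and transition probabilities:

        import numpy as np

        rng = np.random.default_rng(1)

        # Mode 0: nominal closed loop; mode 1: degraded/recovering after an
        # upset. Both matrices and the transition probabilities are invented.
        A = [np.array([[0.95, 0.10], [0.00, 0.90]]),
             np.array([[1.00, 0.00], [0.00, 0.50]])]
        P = np.array([[0.999, 0.001],   # P[i, j] = Prob(mode i -> mode j)
                      [0.200, 0.800]])  # recovery is fast but not instantaneous

        x, mode = np.array([1.0, 0.0]), 0
        errors = []
        for k in range(5000):
            x = A[mode] @ x                      # jump-linear state update
            errors.append(np.linalg.norm(x))
            mode = rng.choice(2, p=P[mode])      # Markov mode switch
        print(f"mean tracking-error norm over the run: {np.mean(errors):.4f}")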

  13. Finite Element Analysis of Osteocytes Mechanosensitivity Under Simulated Microgravity

    NASA Astrophysics Data System (ADS)

    Yang, Xiao; Sun, Lian-Wen; Du, Cheng-Fei; Wu, Xin-Tong; Fan, Yu-Bo

    2018-04-01

    It has been found that the mechanosensitivity of osteocytes can be altered under simulated microgravity. However, how the mechanical stimuli that are the biomechanical origins of the bioresponse act on osteocytes under microgravity is still unclear. Computational studies may help explore the changes in the mechanical deformation of osteocytes under microgravity. In this paper we use computational simulation to investigate the mechanical behavior of osteocytes under simulated microgravity. To obtain the shape information of the osteocytes, a biological experiment was conducted under simulated microgravity prior to the numerical simulation. The cells were rotated in a clinostat for 6 hours or 5 days and then fixed; the cytoskeleton and the nucleus were immunofluorescence-stained and scanned, and the cell shape and fluorescent intensity were measured from the fluorescence images to obtain the dimensions of the osteocytes. 3D finite element (FE) cell models were then established from the scanned image stacks. Several components, including the actin cortex, the cytoplasm, the nucleus, and the cytoskeleton of F-actin and microtubules, were considered in the model. The cell models in both the 6-hour and 5-day groups were then subjected to three magnitudes (0.5, 1.0 and 1.5 Pa) of simulated fluid shear stress, and the total cell displacement and the deformation of the internal discrete components were calculated. The results showed that, under simulated microgravity: (1) the nuclear area and height increased statistically significantly, so that the ratio of membrane-cortex height to nucleus height decreased statistically significantly; (2) the fluid shear stress-induced maximum and average displacements of the whole cell decreased, with the largest decrease in deformation occurring at 1.5 Pa of fluid shear stress; (3) the fluid shear stress-induced deformation of the cell membrane-cortex and cytoskeleton decreased, while that of the nucleus increased. The results suggest that the mechanical behavior of the whole osteocyte body is suppressed by simulated microgravity, and that this decrement is enlarged by either an increasing amplitude of fluid shear stress or a longer duration of simulated microgravity. Moreover, the mechanical behavior of the membrane-cortex and cytoskeleton was suppressed by simulated microgravity, indicating that the mechanotransduction process in the cell body may be further inhibited. On the contrary, the deformation of the cell nucleus increased under simulated microgravity, which may be related to either the decreased amount of cytoskeleton or the increased proportion of the cell volume occupied by the nucleus. The numerical results support our previous biological experiments and point to the cellular components particularly affected under simulated microgravity. This computational study may help us better understand the mechanism of the mechanosensitivity changes in osteocytes under simulated microgravity, and further explore the mechanism of bone loss in space flight.

  14. Credibility Assessment of Deterministic Computational Models and Simulations for Space Biomedical Research and Operations

    NASA Technical Reports Server (NTRS)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Myers, Jerry

    2015-01-01

    Human missions beyond low Earth orbit to destinations such as Mars and asteroids will expose astronauts to novel operational conditions that may pose health risks that are currently not well understood and perhaps unanticipated. In addition, there are limited clinical and research data to inform the development and implementation of health risk countermeasures for these missions. Consequently, NASA's Digital Astronaut Project (DAP) is working to develop and implement computational models and simulations (M&S) to help predict and assess spaceflight health and performance risks, and to enhance countermeasure development. To accomplish these goals effectively, the DAP evaluates its models and simulations via a rigorous verification, validation and credibility assessment process to ensure that the computational tools are sufficiently reliable both to inform research intended to mitigate potential risks and to guide countermeasure development. In doing so, DAP works closely with end-users, such as space life science researchers, to establish appropriate M&S credibility thresholds. We will present and demonstrate the process the DAP uses to vet computational M&S for space biomedical analysis using real M&S examples. We will also provide recommendations on how the larger space biomedical community can employ these concepts to enhance the credibility of their M&S codes.

  15. Slat Cove Unsteadiness Effect of 3D Flow Structures

    NASA Technical Reports Server (NTRS)

    Choudhari, Meelan M.; Khorrami, Mehdi R.

    2006-01-01

    Previous studies have indicated that 2D, time accurate computations based on a pseudo-laminar zonal model of the slat cove region (within the framework of the Reynolds-Averaged Navier-Stokes equations) are inadequate for predicting the full unsteady dynamics of the slat cove flow field. Even though such computations could capture the large-scale, unsteady vorticity structures in the slat cove region without requiring any external forcing, the simulated vortices were excessively strong and the recirculation zone was unduly energetic in comparison with the PIV measurements for a generic high-lift configuration. To resolve this discrepancy and to help enable physics based predictions of slat aeroacoustics, the present paper is focused on 3D simulations of the slat cove flow over a computational domain of limited spanwise extent. Maintaining the pseudo-laminar approach, current results indicate that accounting for the three-dimensionality of flow fluctuations leads to considerable improvement in the accuracy of the unsteady, nearfield solution. Analysis of simulation data points to the likely significance of turbulent fluctuations near the reattachment region toward the generation of broadband slat noise. The computed acoustic characteristics (in terms of the frequency spectrum and spatial distribution) within short distances from the slat resemble the previously reported, subscale measurements of slat noise.

  16. DIATOM (Data Initialization and Modification) Library Version 7.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, David A.; Schmitt, Robert G.; Hensinger, David M.

    DIATOM is a library that provides numerical simulation software with a computational geometry front end that can be used to build up complex problem geometries from collections of simpler shapes. The library provides a parser which allows application-independent geometry descriptions to be embedded in simulation software input decks. Descriptions take the form of collections of primitive shapes and/or CAD input files and material properties that can be used to describe complex spatial and temporal distributions of numerical quantities (often called “database variables” or “fields”) to help define starting conditions for numerical simulations. The capability is designed to be general purpose, robust and computationally efficient. By using a combination of computational geometry and recursive divide-and-conquer approximation techniques, a wide range of primitive shapes is supported to arbitrary degrees of fidelity, controllable through user input and limited only by machine resources. Through the use of call-back functions, numerical simulation software can request the value of a field at any time or location in the problem domain. Typically, this is used only for defining initial conditions, but the capability is not limited to that use. The most recent version of DIATOM provides the ability to import the solution field from one numerical solution as input for another.
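
    The call-back design can be sketched generically: the host simulation queries a geometry object that composes primitive shapes in input-deck order and returns field values at arbitrary points. The following Python toy is an analogue of that design, not DIATOM's actual C++ API.

        import numpy as np

        class Sphere:
            def __init__(self, center, radius, fields):
                self.center = np.asarray(center, dtype=float)
                self.radius, self.fields = radius, fields

            def contains(self, point):
                return np.linalg.norm(np.asarray(point) - self.center) <= self.radius

        class Geometry:
            """Shapes listed later override earlier ones, mimicking the way
            complex geometries are built up from simpler shapes."""

            def __init__(self, shapes, background):
                self.shapes, self.background = shapes, background

            def field_at(self, name, point, time=0.0):
                # `time` accepted for parity with time-dependent fields
                # (unused in this toy).
                value = self.background.get(name)
                for shape in self.shapes:
                    if shape.contains(point):
                        value = shape.fields.get(name, value)
                return value

        geom = Geometry(
            [Sphere((0, 0, 0), 1.0, {"density": 8.9, "material": "copper"})],
            background={"density": 1.2e-3, "material": "air"},
        )
        # Host-code call-back: query any location when initializing a cell.
        print(geom.field_at("density", (0.5, 0.0, 0.0)))  # inside sphere -> 8.9
        print(geom.field_at("density", (2.0, 0.0, 0.0)))  # background -> 0.0012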

  17. RotCFD Analysis of the AH-56 Cheyenne Hub Drag

    NASA Technical Reports Server (NTRS)

    Solis, Eduardo; Bass, Tal A.; Keith, Matthew D.; Oppenheim, Rebecca T.; Runyon, Bryan T.; Veras-Alba, Belen

    2016-01-01

    In 2016, the U.S. Army Aviation Development Directorate (ADD) conducted tests in the U.S. Army 7- by 10-Foot Wind Tunnel at NASA Ames Research Center of a nonrotating 2/5th-scale AH-56 rotor hub. The objective of the tests was to determine how removing the mechanical control gyro affected the drag. Data for the lift, drag, and pitching moment were recorded for the 4-bladed rotor hub in various hardware configurations, azimuth angles, and angles of attack. Numerical simulations of a selection of the configurations and orientations were then performed, and the results were compared with the test data. To generate the simulation results, the hardware configurations were modeled using Creo and Rhinoceros 5, three-dimensional surface-modeling computer-aided design (CAD) programs. The CAD model was imported into Rotorcraft Computational Fluid Dynamics (RotCFD), a computational fluid dynamics (CFD) tool used for analyzing rotor flow fields. RotCFD simulation results were compared with the experimental results for three hardware configurations at two azimuth angles, two angles of attack, and with and without wind tunnel walls. The results help validate RotCFD as a tool for analyzing low-drag rotor hub designs for advanced high-speed rotorcraft concepts. Future work will involve simulating additional hub geometries to reduce drag or to tailor designs to other desired performance levels.

  18. Researcher's guide to the NASA Ames Flight Simulator for Advanced Aircraft (FSAA)

    NASA Technical Reports Server (NTRS)

    Sinacori, J. B.; Stapleford, R. L.; Jewell, W. F.; Lehman, J. M.

    1977-01-01

    Performance, limitations, supporting software, and current checkout and operating procedures are presented for the flight simulator, in terms useful to the researcher who intends to use it. Suggestions to help the researcher prepare the experimental plan are also given. The FSAA's central computer, cockpit, and visual and motion systems are addressed individually, but their interaction is considered as well. Data required, available options, user responsibilities, and occupancy procedures are given in a form that facilitates the initial communication required with the NASA operations group.

  19. Editorial: Challenges for the usability of AR and VR for clinical neurosurgical procedures.

    PubMed

    de Ribaupierre, Sandrine; Eagleson, Roy

    2017-10-01

    There are a number of challenges that must be faced when trying to develop AR- and VR-based neurosurgical simulators, surgical navigation platforms, and "smart OR" systems. Simulating an operating room environment and surgical tasks in augmented and virtual reality is a challenge many are attempting to solve, in order to train surgeons or help them operate. What are some of the needs of the surgeon, and what are the challenges encountered (human-computer interface, perception, workflow, etc.)? We discuss these tradeoffs and conclude with critical remarks.

  20. Computationally efficient methods for modelling laser wakefield acceleration in the blowout regime

    NASA Astrophysics Data System (ADS)

    Cowan, B. M.; Kalmykov, S. Y.; Beck, A.; Davoine, X.; Bunkers, K.; Lifschitz, A. F.; Lefebvre, E.; Bruhwiler, D. L.; Shadwick, B. A.; Umstadter, D. P.

    2012-08-01

    Electron self-injection and acceleration until dephasing in the blowout regime is studied for a set of initial conditions typical of recent experiments with 100-terawatt-class lasers. Two different approaches to computationally efficient, fully explicit, 3D particle-in-cell modelling are examined. First, the Cartesian code vorpal (Nieter, C. and Cary, J. R. 2004 VORPAL: a versatile plasma simulation code. J. Comput. Phys. 196, 538) using a perfect-dispersion electromagnetic solver precisely describes the laser pulse and bubble dynamics, taking advantage of coarser resolution in the propagation direction, with a proportionally larger time step. Using third-order splines for macroparticles helps suppress the sampling noise while keeping the usage of computational resources modest. The second way to reduce the simulation load is using reduced-geometry codes. In our case, the quasi-cylindrical code calder-circ (Lifschitz, A. F. et al. 2009 Particle-in-cell modelling of laser-plasma interaction using Fourier decomposition. J. Comput. Phys. 228(5), 1803-1814) uses decomposition of fields and currents into a set of poloidal modes, while the macroparticles move in the Cartesian 3D space. Cylindrical symmetry of the interaction allows using just two modes, reducing the computational load to roughly that of a planar Cartesian simulation while preserving the 3D nature of the interaction. This significant economy of resources allows using fine resolution in the direction of propagation and a small time step, making numerical dispersion vanishingly small, together with a large number of particles per cell, enabling good particle statistics. Quantitative agreement of two simulations indicates that these are free of numerical artefacts. Both approaches thus retrieve the physically correct evolution of the plasma bubble, recovering the intrinsic connection of electron self-injection to the nonlinear optical evolution of the driver.
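
    The poloidal-mode decomposition that makes quasi-cylindrical codes cheap can be illustrated with a Fourier transform in the azimuthal angle: the field is stored as a handful of modes F_m(r) while the particles stay in 3D Cartesian space. A toy numpy illustration of why two modes suffice for a nearly axisymmetric field (not calder-circ's implementation):

        import numpy as np

        nr, ntheta = 64, 128
        r = np.linspace(0.0, 1.0, nr)
        theta = np.linspace(0.0, 2 * np.pi, ntheta, endpoint=False)
        R, T = np.meshgrid(r, theta, indexing="ij")

        # Nearly axisymmetric field: dominant m=0 plus a weak m=1 asymmetry.
        field = np.exp(-R**2 / 0.1) * (1.0 + 0.05 * np.cos(T))

        modes = np.fft.fft(field, axis=1) / ntheta   # F_m(r) for all m
        power = np.abs(modes).mean(axis=0)
        print("relative mode amplitudes m=0..3:", power[:4] / power[0])
        # Modes m >= 2 are negligible here, which is why retaining just two
        # poloidal modes preserves the 3D physics at near-2D cost.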

  1. The winding road to being a code monkey

    NASA Astrophysics Data System (ADS)

    Sarahan, Michael

    2017-09-01

    I am now a software engineer at a company that provides data analytics services, and helps support the open source data science community. I have been a computer nerd for a very long time, but it was my CEU experience at Texas A&M with Sherry Yennello (2003-2005) that helped me put my nerd skills to productive use. My project then was simulation of pulse shape discrimination electronics, and it was an excellent introduction to core computational concerns, such as digitization: when you see a line on the screen, that's not really how the computer sees it. I wandered in graduate school through a chemistry program into using electron microscopes. My programming interest got me into image and signal processing, which led naturally to jobs in analyzing data, and also in acquiring data. Throughout, it was always difficult just to make software work. I got pretty good at making it work. That's what I do for a living now - package software so that it is easy for other people to do great science with.

  2. Direct numerical simulation of reactor two-phase flows enabled by high-performance computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Jun; Cambareri, Joseph J.; Brown, Cameron S.

    Nuclear reactor two-phase flows remain a great engineering challenge: the high-resolution two-phase flow databases that can inform practical model development are still sparse, owing to the extreme reactor operating conditions and measurement difficulties. With the rapid growth of computing power, direct numerical simulation (DNS) is enjoying renewed interest for investigating the related flow problems. Combining DNS with an interface tracking method provides a unique opportunity to study two-phase flows from first-principles calculations. More importantly, state-of-the-art high-performance computing (HPC) facilities are helping unlock this potential. This paper reviews recent research progress in two-phase flow DNS related to reactor applications. Progress in large-scale bubbly flow DNS has focused not only on the sheer size of the simulations in terms of resolved Reynolds number, but also on the associated advanced modeling and analysis techniques. Specifically, the current areas of active research include modeling of sub-cooled boiling and bubble coalescence, as well as an advanced post-processing toolkit for bubbly flow simulations in reactor geometries. A novel bubble tracking method has been developed to track the evolution of bubbles in two-phase bubbly flow. Spectral analysis of the DNS database in different geometries has also been performed to investigate the modulation of the energy spectrum slope due to bubble-induced turbulence. In addition, single- and two-phase analysis results are presented for turbulent flows within pressurized water reactor (PWR) core geometries. These simulations are possible only on the world's leading HPC platforms, and they are enabling more complex turbulence model development and validation for use in 3D multiphase computational fluid dynamics (M-CFD) codes.

  3. PREFACE: New trends in Computer Simulations in Physics and not only in physics

    NASA Astrophysics Data System (ADS)

    Shchur, Lev N.; Krashakov, Serge A.

    2016-02-01

    In this volume we have collected papers based on the presentations given at the International Conference on Computer Simulations in Physics and beyond (CSP2015), held in Moscow, September 6-10, 2015. We hope that this volume will be helpful and scientifically interesting for readers. The Conference was organized for the first time through the joint efforts of the Moscow Institute for Electronics and Mathematics (MIEM) of the National Research University Higher School of Economics, the Landau Institute for Theoretical Physics, and the Science Center in Chernogolovka. The name of the Conference emphasizes the multidisciplinary nature of computational physics, whose methods are applied to a broad range of current research in science and society. The choice of venue was motivated by the multidisciplinary character of the MIEM, a formerly independent university which has recently become part of the National Research University Higher School of Economics. The Conference on Computer Simulations in Physics and beyond (CSP) is planned to be organized every two years. This year's Conference featured 99 presentations, including 21 plenary and invited talks, ranging from the analysis of Irish myths with recent methods of statistical physics to computing with the novel quantum computers D-Wave and D-Wave 2. This volume covers various areas of computational physics and emerging subjects within the computational physics community. Each section was preceded by invited talks presenting the latest algorithms and methods in computational physics, as well as new scientific results. Both parallel and poster sessions paid special attention to numerical methods, applications and results. For all the abstracts presented at the conference, please follow the link http://csp2015.ac.ru/files/book5x.pdf

  4. Tangible Landscape: Cognitively Grasping the Flow of Water

    NASA Astrophysics Data System (ADS)

    Harmon, B. A.; Petrasova, A.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.

    2016-06-01

    Complex spatial forms like topography can be challenging to understand, much less intentionally shape, given the heavy cognitive load of visualizing and manipulating 3D form. Spatiotemporal processes like the flow of water over a landscape are even more challenging to understand and intentionally direct as they are dependent upon their context and require the simulation of forces like gravity and momentum. This cognitive work can be offloaded onto computers through 3D geospatial modeling, analysis, and simulation. Interacting with computers, however, can also be challenging, often requiring training and highly abstract thinking. Tangible computing - an emerging paradigm of human-computer interaction in which data is physically manifested so that users can feel it and directly manipulate it - aims to offload this added cognitive work onto the body. We have designed Tangible Landscape, a tangible interface powered by an open source geographic information system (GRASS GIS), so that users can naturally shape topography and interact with simulated processes with their hands in order to make observations, generate and test hypotheses, and make inferences about scientific phenomena in a rapid, iterative process. Conceptually Tangible Landscape couples a malleable physical model with a digital model of a landscape through a continuous cycle of 3D scanning, geospatial modeling, and projection. We ran a flow modeling experiment to test whether tangible interfaces like this can effectively enhance spatial performance by offloading cognitive processes onto computers and our bodies. We used hydrological simulations and statistics to quantitatively assess spatial performance. We found that Tangible Landscape enhanced 3D spatial performance and helped users understand water flow.

  5. Using CamiTK for rapid prototyping of interactive computer assisted medical intervention applications.

    PubMed

    Promayon, Emmanuel; Fouard, Céline; Bailet, Mathieu; Deram, Aurélien; Fiard, Gaëlle; Hungr, Nikolai; Luboz, Vincent; Payan, Yohan; Sarrazin, Johan; Saubat, Nicolas; Selmi, Sonia Yuki; Voros, Sandrine; Cinquin, Philippe; Troccaz, Jocelyne

    2013-01-01

    Computer Assisted Medical Intervention (CAMI hereafter) is a complex multi-disciplinary field. CAMI research requires the collaboration of experts in fields as diverse as medicine, computer science, mathematics, instrumentation, signal processing, mechanics, modeling, automatics, optics, etc. CamiTK is a modular framework that helps researchers and clinicians collaborate in order to prototype CAMI applications by regrouping the knowledge and expertise from each discipline. It is an open-source, cross-platform, generic and modular tool written in C++ which can handle medical images, surgical navigation, biomedical simulations and robot control. This paper presents the Computer Assisted Medical Intervention ToolKit (CamiTK) and how it is used in various applications in our research team.

  6. Multiscale systems biology of trauma-induced coagulopathy.

    PubMed

    Tsiklidis, Evan; Sims, Carrie; Sinno, Talid; Diamond, Scott L

    2018-07-01

    Trauma with hypovolemic shock is an extreme pathological state that challenges the body to maintain blood pressure and oxygenation in the face of hemorrhagic blood loss. In conjunction with surgical actions and transfusion therapy, survival requires the patient's blood to maintain hemostasis to stop bleeding. The physics of the problem are multiscale: (a) the systemic circulation sets the global blood pressure in response to blood loss and resuscitation therapy, (b) local tissue perfusion is altered by localized vasoregulatory mechanisms and bleeding, and (c) altered blood and vessel biology resulting from the trauma as well as local hemodynamics control the assembly of clotting components at the site of injury. Building upon ongoing modeling efforts to simulate arterial or venous thrombosis in a diseased vasculature, computer simulation of trauma-induced coagulopathy is an emerging approach to understand patient risk and predict response. Despite uncertainties in quantifying the patient's dynamic injury burden, multiscale systems biology may help link blood biochemistry at the molecular level to multiorgan responses in the bleeding patient. As an important goal of systems modeling, establishing early metrics of a patient's high-dimensional trajectory may help guide transfusion therapy or warn of subsequent later stage bleeding or thrombotic risks. This article is categorized under: Analytical and Computational Methods > Computational Methods Biological Mechanisms > Regulatory Biology Models of Systems Properties and Processes > Mechanistic Models. © 2018 Wiley Periodicals, Inc.

  7. Implementation of interconnect simulation tools in spice

    NASA Technical Reports Server (NTRS)

    Satsangi, H.; Schutt-Aine, J. E.

    1993-01-01

    Accurate computer simulation of high-speed digital computer circuits and communication circuits requires a multimode approach that simulates both the devices and the interconnects between them. Classical (lumped-parameter) circuit analysis algorithms are needed for the circuit devices and the network formed by the interconnected devices; the interconnects, however, have to be modeled as transmission lines, which requires electromagnetic field analysis. One approach to writing a multimode simulator is to take an existing software package that performs either lumped-parameter analysis or field analysis and add the missing type of analysis to it. In this work, a traditionally lumped-parameter simulator, SPICE, is modified so that it performs lossy transmission line analysis using a different modeling approach. Modifying SPICE3E2, or any other large software package, is not a trivial task: an understanding of the programming conventions used, of the simulation software, and of the simulation algorithms is required. This thesis was written to clarify the procedure for installing a device into SPICE3E2. The installation of three devices is documented; the installations of the first two provide a foundation for the installation of the third, the lossy line. The details of the discussion are specific to SPICE, but the concepts will be helpful when performing installations into other circuit analysis packages.
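
    For context, the frequency-domain behavior that such a lossy-line device model must capture follows from the telegrapher's equations: with per-unit-length parameters R, L, G, C, the propagation constant is gamma = sqrt((R + jwL)(G + jwC)) and the characteristic impedance is Z0 = sqrt((R + jwL)/(G + jwC)). A small Python check with hypothetical interconnect parameters (illustrative physics, not the SPICE3E2 installation itself):

        import numpy as np

        def lossy_line(freq, R, L, G, C, length):
            """Telegrapher's-equation quantities for a uniform lossy line."""
            w = 2 * np.pi * freq
            series, shunt = R + 1j * w * L, G + 1j * w * C
            gamma = np.sqrt(series * shunt)      # propagation constant (1/m)
            z0 = np.sqrt(series / shunt)         # characteristic impedance (ohm)
            att_db = 20 * np.log10(np.e) * gamma.real * length
            return z0, att_db

        # Hypothetical PCB trace: 5 ohm/m, 350 nH/m, ~0 S/m, 140 pF/m, 0.3 m long.
        z0, att = lossy_line(1e9, 5.0, 350e-9, 1e-12, 140e-12, 0.3)
        print(f"Z0 = {z0:.1f} ohm, attenuation at 1 GHz = {att:.2f} dB")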

  8. Parallel simulation of tsunami inundation on a large-scale supercomputer

    NASA Astrophysics Data System (ADS)

    Oishi, Y.; Imamura, F.; Sugawara, D.

    2013-12-01

    Accurate prediction of tsunami inundation is important for disaster mitigation. One approach is to approximate the tsunami wave source through an instant inversion analysis using real-time observation data (e.g., Tsushima et al., 2009) and then use the resulting wave source in an instant tsunami inundation simulation. A bottleneck of this approach, however, is the large computational cost of the non-linear inundation simulation, and the computational power of recent massively parallel supercomputers is helpful for enabling faster-than-real-time execution. Parallel computers have become approximately 1000 times faster in 10 years (www.top500.org), so very fast parallel computers are expected to become more and more prevalent in the near future. It is therefore important to investigate how to conduct a tsunami simulation efficiently on parallel computers. In this study, we target very fast tsunami inundation simulations on the K computer, currently the fastest Japanese supercomputer, which has a theoretical peak performance of 11.2 PFLOPS. One computing node of the K computer consists of one CPU with 8 cores that share memory, and the nodes are connected through a high-performance torus-mesh network. The K computer is designed for distributed-memory parallel computation, so we have developed a parallel tsunami model based on the TUNAMI-N2 model of Tohoku University, which uses a leap-frog finite difference method. A grid nesting scheme is employed to apply high-resolution grids only in the coastal regions. To balance the computational load, CPUs are first allocated to each nested layer in proportion to the number of grid points of that layer; the CPUs allocated to a layer then perform a 1-D domain decomposition of it. In the parallel computation, three types of communication are necessary: (1) communication with adjacent neighbours for the finite difference calculation, (2) communication between adjacent layers for the calculations connecting the layers, and (3) global communication to obtain the time step that satisfies the CFL condition over the whole domain. A preliminary test on the K computer showed a parallel efficiency on 1024 cores of 57% relative to 64 cores. We estimate that the parallel efficiency will be considerably improved by applying a 2-D domain decomposition instead of the present 1-D decomposition in future work. The parallel tsunami model was applied to the 2011 Great Tohoku tsunami. The coarsest layer covers a 758 km × 1155 km region with a 405 m grid spacing; a nesting of five layers was used with a resolution ratio of 1/3 between nested layers. The finest layer has 5 m resolution and covers most of the coastal region of Sendai city. To complete 2 hours of simulation time, the serial (non-parallel) computation took approximately 4 days on a workstation; on 1024 cores of the K computer the same simulation took 45 minutes, more than twice as fast as real time. This presentation discusses the updated parallel computational performance and the efficient use of the K computer, considering the characteristics of the tsunami inundation simulation model in relation to the characteristics and capabilities of the K computer.
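
    Of the three communication types listed, the global CFL reduction (type 3) is the simplest to illustrate: each rank computes the largest stable time step for its own subdomain and a global minimum is taken so that every rank advances with the same step. A minimal mpi4py sketch with stand-in data (illustrative, not the authors' code):

        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        g, dx = 9.81, 405.0     # gravity; grid spacing of the coarsest layer (m)

        # Each rank owns a slab of the depth field (random stand-in data here).
        rng = np.random.default_rng(comm.Get_rank())
        depth = rng.uniform(1.0, 8000.0, size=(512, 512))   # water depth (m)

        # Local CFL limit for a shallow-water leap-frog scheme.
        dt_local = dx / np.sqrt(g * depth.max())

        # Communication type (3): global reduction so all ranks share one step.
        dt_global = comm.allreduce(dt_local, op=MPI.MIN)
        if comm.Get_rank() == 0:
            print(f"global time step: {dt_global:.3f} s")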

  9. Fast simulation of the NICER instrument

    NASA Astrophysics Data System (ADS)

    Doty, John P.; Wampler-Doty, Matthew P.; Prigozhin, Gregory Y.; Okajima, Takashi; Arzoumanian, Zaven; Gendreau, Keith

    2016-07-01

    The NICER mission uses a complicated physical system to collect information from objects that are, by x-ray timing science standards, rather faint. To get the most out of the data we will need a rigorous understanding of all instrumental effects. We are in the process of constructing a very fast, high-fidelity simulator that will help us to assess instrument performance, support simulation-based data reduction, and improve our estimates of measurement error. We will combine and extend existing optics, detector, and electronics simulations, and we will employ the Compute Unified Device Architecture (CUDA) to parallelize these calculations. The price of suitable CUDA-compatible multi-giga-op cores is about $0.20/core, so this approach will be very cost-effective.

  10. [Preparation of simulated craniocerebral models via three-dimensional printing technique].

    PubMed

    Lan, Q; Chen, A L; Zhang, T; Zhu, Q; Xu, T

    2016-08-09

    Three-dimensional (3D) printing was used to prepare simulated craniocerebral models, which were applied to preoperative planning and surgical simulation. The image data were collected from a PACS system. Image data of the skull bone, brain tissue and tumors, cerebral arteries and aneurysms, and functional regions and related neural tracts of the brain were extracted from thin-slice computed tomography (CT, slice thickness 0.5 mm), magnetic resonance imaging (MRI, slice thickness 1 mm), computed tomography angiography (CTA), and functional magnetic resonance imaging (fMRI) data, respectively. MIMICS software was used to reconstruct colored virtual models by identifying and differentiating tissues according to their gray scales. The colored virtual models were then submitted to a 3D printer, which produced life-sized craniocerebral models for surgical planning and surgical simulation. 3D-printed craniocerebral models allowed neurosurgeons to perform complex procedures in specific clinical cases through detailed surgical planning. They offered great convenience for evaluating the size of the spatial fissure of the sellar region before surgery, which helped to optimize surgical approach planning. These 3D models also provided detailed information about the location of aneurysms and their parent arteries, which helped surgeons to choose appropriate aneurysm clips and to perform surgical simulation. The models further gave clear indications of the depth and extent of tumors and their relationship to eloquent cortical areas and adjacent neural tracts, helping to avoid surgical damage to important neural structures. As a novel and promising technique, the application of 3D-printed craniocerebral models can improve surgical planning by converting virtual visualization into real life-sized models. It also contributes to the study of functional anatomy.

  11. Infrastructure for Training and Partnerships: California Water and Coastal Ocean Resources

    NASA Technical Reports Server (NTRS)

    Siegel, David A.; Dozier, Jeffrey; Gautier, Catherine; Davis, Frank; Dickey, Tommy; Dunne, Thomas; Frew, James; Keller, Arturo; MacIntyre, Sally; Melack, John

    2000-01-01

    The purpose of this project was to advance the existing ICESS/Bren School computing infrastructure to allow scientists, students, and research trainees the opportunity to interact with environmental data and simulations in near-real time. Improvements made with the funding from this project have helped to strengthen the research efforts within both units, fostered graduate research training, and helped fortify partnerships with government and industry. With this funding, we were able to expand our computational environment, in which computer resources, software, and data sets are shared by ICESS/Bren School faculty researchers in all areas of Earth system science. All of the graduate and undergraduate students associated with the Donald Bren School of Environmental Science and Management and the Institute for Computational Earth System Science have benefited from the infrastructure upgrades accomplished by this project. Additionally, the upgrades fostered a significant number of research projects (attached is a list of the projects that benefited from the upgrades). As originally proposed, funding for this project provided the following infrastructure upgrades: 1) a modern file management system capable of interoperating UNIX and NT file systems that can scale to 6.7 TB, 2) a Qualstar 40-slot tape library with two AIT tape drives and Legato Networker backup/archive software, 3) previously unavailable import/export capability for data sets on Zip, Jaz, DAT, 8mm, CD, and DLT media, in addition to a 622 Mb/s Internet 2 connection, 4) network switches capable of 100 Mbps to 128 desktop workstations, 5) the Portable Batch System (PBS) computational task scheduler, and 6) two Compaq/Digital Alpha XP1000 compute servers, each with 1.5 GB of RAM, along with an SGI Origin 2000 (purchased partially with funds from this project and partially from other sources) to be used for very large computations, as required for simulation of mesoscale meteorology or climate.

  12. Using Simulation to Examine the Effect of Physician Heterogeneity on the Operational Efficiency of an Overcrowded Hospital Emergency Department

    NASA Astrophysics Data System (ADS)

    Kuo, Y.-H.; Leung, J. M. Y.; Graham, C. A.

    2015-05-01

    In this paper, we present a case study of modelling and analyzing the patient flow of a hospital emergency department in Hong Kong. The emergency department is facing the challenge of overcrowding, and patients there usually experience long waiting times. Our project team was asked by a senior consultant of the emergency department to analyze the patient flow and provide a decision support tool to help improve operations. We adopt a simulation approach to mimic the department's daily operations. With the simulation model, we conduct a computational study to examine the effect of physician heterogeneity on emergency department performance. We found that physician heterogeneity has a great impact on operational efficiency and thus should be considered when developing simulation models. Our computational results show that, with the same average service rate among the physicians, variation in the individual rates can improve the overcrowding situation. This suggests that emergency departments may consider having some efficient physicians to speed up the overall service rate in return for more time for patients who need extra medical care.
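
    The effect being examined can be reproduced in a toy discrete-event model: compare a pool of identical physicians with a pool that has the same mean service rate but heterogeneous speeds. A minimal Python sketch using the simpy library, with invented arrival and service rates rather than the Hong Kong data:

        import numpy as np
        import simpy

        def run_ed(service_rates, arrival_rate=1.4, horizon=10_000, seed=0):
            """Toy ED: Poisson arrivals, exponential service; physician i
            works at rate service_rates[i] (patients per unit time)."""
            rng = np.random.default_rng(seed)
            env = simpy.Environment()
            doctors = simpy.Store(env)
            for rate in service_rates:          # pool of idle physicians
                doctors.put(rate)
            waits = []

            def patient(arrived):
                rate = yield doctors.get()      # wait for any free physician
                waits.append(env.now - arrived)
                yield env.timeout(rng.exponential(1.0 / rate))
                yield doctors.put(rate)         # physician becomes free again

            def arrivals():
                while True:
                    yield env.timeout(rng.exponential(1.0 / arrival_rate))
                    env.process(patient(env.now))

            env.process(arrivals())
            env.run(until=horizon)
            return np.mean(waits)

        print("homogeneous   mean wait:", run_ed([0.5, 0.5, 0.5]))
        print("heterogeneous mean wait:", run_ed([0.8, 0.5, 0.2]))  # same mean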

  13. Enabling Co-Design of Multi-Layer Exascale Storage Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carothers, Christopher

    Growing demands for computing power in applications such as energy production, climate analysis, computational chemistry, and bioinformatics have propelled computing systems toward the exascale: systems with 10^18 floating-point operations per second. These systems, to be designed and constructed over the next decade, will create unprecedented challenges in component counts, power consumption, resource limitations, and system complexity. Data storage and access are an increasingly important and complex component in extreme-scale computing systems, and significant design work is needed to develop successful storage hardware and software architectures at exascale. Co-design of these systems will be necessary to find the best possible design points for exascale systems. The goal of this work has been to enable the exploration and co-design of exascale storage systems by providing a detailed, accurate, and highly parallel simulation of exascale storage and the surrounding environment. Specifically, this simulation has (1) portrayed realistic application checkpointing and analysis workloads, (2) captured the complexity, scale, and multilayer nature of exascale storage hardware and software, and (3) executed in a timeframe that enables "what if" exploration of design concepts. We developed models of the major hardware and software components in an exascale storage system, as well as the application I/O workloads that drive them. We used our simulation system to investigate critical questions in reliability and concurrency at exascale, helping guide the design of future exascale hardware and software architectures. Additionally, we provided this system to interested vendors and researchers so that others can explore the design space. We validated the capabilities of our simulation environment by configuring the simulation to represent the Argonne Leadership Computing Facility Blue Gene/Q system and comparing simulation results for application I/O patterns to the results of executions of these I/O kernels on the actual system.

  14. Assessing Mission Impact of Cyberattacks: Report of the NATO IST-128 Workshop

    DTIC Science & Technology

    2015-12-01

    …simulation) perspective. This would be natural, considering that the cybersecurity problem is highly adversarial in nature. Because it involves intelligent… be formulated as a partial-information game; artificial intelligence techniques might help here. Yet another style of problem formulation that… computational information processing for weapons, intelligence, communication, and logistics systems continues to increase the vulnerability of…

  15. Fuels planning: science synthesis and integration; forest structure and fire hazard fact sheet 03: visualizing forest structure and fuels

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    The software described in this fact sheet provides managers with tools for visualizing forest and fuels information. Computer-based landscape simulations can help visualize stand and landscape conditions and the effects of different management treatments and fuel changes over time. These visualizations can assist forest planning by considering a range of management...

  16. A Computer Simulation to Help in Teaching Induction Phenomena

    ERIC Educational Resources Information Center

    Mihas, Pavlos

    2003-01-01

    The motion of a magnet through a coil is analysed through a model of magnetic monopoles. The magnetic flux of a monopole passing through a loop is explained and also its rate of change. By a superposition of voltages produced by the monopoles on the coils the shape of the voltage versus time graph is explained. Also examined is the interaction of…
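
    The monopole superposition described in the abstract is straightforward to reproduce numerically: model the magnet as a pair of opposite monopoles, use the closed-form flux of a monopole through a circular loop (from the subtended solid angle), and differentiate as the magnet falls. A short Python sketch with assumed geometry, in arbitrary units (not the article's own program):

        import numpy as np

        def monopole_flux(z, R):
            """Fraction of a unit monopole's flux threading a circular loop
            of radius R when the pole sits at axial distance z."""
            return 0.5 * (1.0 - z / np.hypot(z, R))

        t = np.linspace(0.0, 0.4, 2000)
        z0, g, half_len, R = 0.15, 9.81, 0.01, 0.02
        z_mag = z0 - 0.5 * g * t**2      # magnet in free fall through the loop

        # Superpose the fluxes of the two poles, then V = -dPhi/dt.
        phi = monopole_flux(z_mag - half_len, R) - monopole_flux(z_mag + half_len, R)
        v = -np.gradient(phi, t)
        print("peak voltages (approach, exit):", v.min(), v.max())
        # The two lobes have opposite sign and unequal height because the
        # magnet accelerates as it falls, which is the shape the article explains.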

  17. Indirect carbon reduction by residential vegetation and planting strategies in Chicago, U.S.A

    Treesearch

    H.K. Jo; E.G. McPherson

    2001-01-01

    Concern about climate change has evoked interest in the potential for urban vegetation to help reduce the levels of atmospheric carbon. This study applied computer simulations to try to quantify the modifying effects of existing vegetation on the indirect reduction of atmospheric carbon for two residential neighborhoods in north-west Chicago. The effects of shading,...

  18. Accounting for receptor flexibility and enhanced sampling methods in computer-aided drug design.

    PubMed

    Sinko, William; Lindert, Steffen; McCammon, J Andrew

    2013-01-01

    Protein flexibility plays a major role in biomolecular recognition. In many cases, it is not obvious how molecular structure will change upon association with other molecules. In proteins, these changes can be major, with large deviations in overall backbone structure, or they can be more subtle, as in a side-chain rotation. Either way, the algorithms that predict the favorability of biomolecular association require relatively accurate predictions of the bound structure to give an accurate assessment of the energy involved. Here, we review a number of techniques that have been proposed to accommodate receptor flexibility in the simulation of small molecules binding to protein receptors. We investigate modifications to standard rigid-receptor docking algorithms and also explore enhanced sampling techniques, as well as the combination of free energy calculations with enhanced sampling. Understanding and allowing for receptor flexibility are helping to make computer simulations of ligand-protein binding more accurate. These developments may improve the efficiency of drug discovery and development, and efficiency will be essential as we begin to see personalized medicine tailored to individual patients, with specific drugs needed for each patient's genetic makeup. © 2012 John Wiley & Sons A/S.

  19. iBIOMES Lite: Summarizing Biomolecular Simulation Data in Limited Settings

    PubMed Central

    2015-01-01

    As the amount of data generated by biomolecular simulations dramatically increases, new tools need to be developed to help manage this data at the individual investigator or small research group level. In this paper, we introduce iBIOMES Lite, a lightweight tool for biomolecular simulation data indexing and summarization. The main goal of iBIOMES Lite is to provide a simple interface to summarize computational experiments in a setting where the user might have limited privileges and limited access to IT resources. A command-line interface allows the user to summarize, publish, and search local simulation data sets. Published data sets are accessible via static hypertext markup language (HTML) pages that summarize the simulation protocols and also display data analysis graphically. The publication process is customized via extensible markup language (XML) descriptors while the HTML summary template is customized through extensible stylesheet language (XSL). iBIOMES Lite was tested on different platforms and at several national computing centers using various data sets generated through classical and quantum molecular dynamics, quantum chemistry, and QM/MM. The associated parsers currently support AMBER, GROMACS, Gaussian, and NWChem data set publication. The code is available at https://github.com/jcvthibault/ibiomes. PMID:24830957

  20. Shock simulations of a single-site coarse-grain RDX model using the dissipative particle dynamics method with reactivity

    NASA Astrophysics Data System (ADS)

    Sellers, Michael S.; Lísal, Martin; Schweigert, Igor; Larentzos, James P.; Brennan, John K.

    2017-01-01

    In discrete particle simulations, when an atomistic model is coarse-grained, a tradeoff is made: a boost in computational speed for a reduction in accuracy. The Dissipative Particle Dynamics (DPD) methods help to recover lost accuracy of the viscous and thermal properties, while giving back a relatively small amount of computational speed. Since its initial development for polymers, one of the most notable extensions of DPD has been the introduction of chemical reactivity, called DPD-RX. In 2007, Maillet, Soulard, and Stoltz introduced implicit chemical reactivity in DPD through the concept of particle reactors and simulated the decomposition of liquid nitromethane. We present an extended and generalized version of the DPD-RX method, and have applied it to solid hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX). Demonstration simulations of reacting RDX are performed under shock conditions using a recently developed single-site coarse-grain model and a reduced RDX decomposition mechanism. A description of the methods used to simulate RDX and its transition to hot product gases within DPD-RX is presented. Additionally, we discuss several examples of the effect of shock speed and microstructure on the corresponding material chemistry.
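    The pairwise force at the core of standard DPD has conservative, dissipative, and random contributions tied together by the fluctuation-dissipation relation. The following is a minimal sketch of that force in the common Groot-Warren form, with illustrative parameter values that are not those of the RDX model in the paper.

    ```python
    # Minimal sketch of a standard (Groot-Warren) DPD pair force between two
    # particles; parameter values are illustrative, not the paper's RDX model.
    import numpy as np

    def dpd_pair_force(ri, rj, vi, vj, a=25.0, gamma=4.5, kT=1.0, rc=1.0,
                       dt=0.01, rng=np.random.default_rng(0)):
        rij = ri - rj
        r = np.linalg.norm(rij)
        if r >= rc:
            return np.zeros(3)                # no interaction beyond cutoff
        e = rij / r                           # unit vector along the pair
        w = 1.0 - r / rc                      # weight function w_R(r); w_D = w_R**2
        sigma = np.sqrt(2.0 * gamma * kT)     # fluctuation-dissipation relation
        theta = rng.standard_normal()         # zero-mean, unit-variance noise
        f_c = a * w * e                                  # soft conservative repulsion
        f_d = -gamma * w**2 * np.dot(e, vi - vj) * e     # dissipative (friction) term
        f_r = sigma * w * theta / np.sqrt(dt) * e        # random (thermostat) term
        return f_c + f_d + f_r

    print(dpd_pair_force(np.array([0.0, 0.0, 0.0]), np.array([0.5, 0.0, 0.0]),
                         np.zeros(3), np.zeros(3)))
    ```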

  1. Modeling and simulation of magnetic resonance imaging based on intermolecular multiple quantum coherences

    NASA Astrophysics Data System (ADS)

    Cai, Congbo; Dong, Jiyang; Cai, Shuhui; Cheng, En; Chen, Zhong

    2006-11-01

Intermolecular multiple quantum coherences (iMQCs) have many potential applications since they can provide interaction information between different molecules within the range of the dipolar correlation distance, and can provide new contrast in magnetic resonance imaging (MRI). Because of the non-localized property of the dipolar field and the non-linear property of the Bloch equations incorporating the dipolar field term, the evolution behavior of iMQCs is difficult to deduce strictly in many cases. In such cases, simulation studies are very important. Simulation results can not only give a guide for optimizing experimental conditions, but also help analyze unexpected experimental results. Based on our product operator matrix and the K-space method for dipolar field calculation, MRI simulation software was constructed, running on the Windows operating system. The non-linear Bloch equations are solved by a fifth-order Cash-Karp Runge-Kutta formalism. Computational time can be efficiently reduced by separating the effects of chemical shifts and the strong gradient field. Using this software, simulations of different kinds of complex MRI sequences can be done conveniently and quickly on general personal computers. Some examples are given and the results discussed.
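    The integration step at the heart of such a simulator can be sketched as follows: the Bloch equations with relaxation solved by an adaptive Runge-Kutta method. Note the assumptions: the paper uses a fifth-order Cash-Karp scheme, whereas SciPy's "RK45" is a Dormand-Prince pair used here as a stand-in, and the dipolar-field term that makes the full problem non-linear is omitted for brevity.

    ```python
    # Sketch: Bloch equations with relaxation, integrated by an adaptive
    # Runge-Kutta solver. SciPy's RK45 (Dormand-Prince) stands in for the
    # paper's Cash-Karp scheme; the non-linear dipolar-field term is omitted.
    import numpy as np
    from scipy.integrate import solve_ivp

    gamma = 2.675e8                    # proton gyromagnetic ratio, rad/s/T
    B = np.array([0.0, 0.0, 1e-6])     # effective field in the rotating frame, T
    T1, T2, M0 = 1.0, 0.1, 1.0         # relaxation times (s), equilibrium Mz

    def bloch(t, M):
        dM = gamma * np.cross(M, B)    # precession about the effective field
        dM[0] -= M[0] / T2             # transverse relaxation
        dM[1] -= M[1] / T2
        dM[2] -= (M[2] - M0) / T1      # longitudinal relaxation
        return dM

    sol = solve_ivp(bloch, (0.0, 0.5), [M0, 0.0, 0.0], method="RK45", rtol=1e-8)
    print(sol.y[:, -1])                # magnetization after 0.5 s
    ```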

  2. WE-C-217BCD-08: Rapid Monte Carlo Simulations of DQE(f) of Scintillator-Based Detectors.

    PubMed

    Star-Lack, J; Abel, E; Constantin, D; Fahrig, R; Sun, M

    2012-06-01

Monte Carlo simulations of DQE(f) can greatly aid in the design of scintillator-based detectors by helping optimize key parameters including scintillator material and thickness, pixel size, surface finish, and septa reflectivity. However, the additional optical transport significantly increases simulation times, necessitating a large number of parallel processors to adequately explore the parameter space. To address this limitation, we have optimized the DQE(f) algorithm, reducing simulation times per design iteration to 10 minutes on a single CPU. DQE(f) is proportional to the ratio MTF(f)^2/NPS(f). The LSF-MTF simulation uses a slanted line source and is rapidly performed with relatively few gammas launched. However, the conventional NPS simulation for standard radiation exposure levels requires the acquisition of multiple flood fields (nRun), each requiring billions of input gamma photons (nGamma), many of which will scintillate, thereby producing thousands of optical photons (nOpt) per deposited MeV. The resulting execution time is proportional to the product nRun x nGamma x nOpt. In this investigation, we revisit the theoretical derivation of DQE(f), and reveal significant computation time savings through the optimization of nRun, nGamma, and nOpt. Using GEANT4, we determine optimal values for these three variables for a GOS scintillator-amorphous silicon portal imager. Both isotropic and Mie optical scattering processes were modeled. Simulation results were validated against the literature. We found that, depending on the radiative and optical attenuation properties of the scintillator, the NPS can be accurately computed using values for nGamma below 1000 and values for nOpt below 500/MeV. nRun should remain above 200. Using these parameters, typical computation times for a complete NPS ranged from 2 to 10 minutes on a single CPU. The number of launched particles and corresponding execution times for a DQE simulation can be dramatically reduced, allowing for accurate computation with modest computer hardware. NIH R01 CA138426. Several authors work for Varian Medical Systems. © 2012 American Association of Physicists in Medicine.
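    The central relation, DQE(f) proportional to MTF(f)^2/NPS(f), is easy to illustrate numerically once MTF and NPS estimates are in hand. In the toy sketch below, the MTF curve, the NPS curve, and the fluence normalization are all schematic placeholders, not values from the paper.

    ```python
    # Toy sketch of DQE(f) ~ MTF(f)^2 / NPS(f): given an MTF curve and a
    # normalized noise power spectrum, form the frequency-dependent DQE.
    # All curves and the fluence q are illustrative placeholders.
    import numpy as np

    f = np.linspace(0.0, 2.0, 50)     # spatial frequency, cycles/mm
    mtf = np.sinc(f / 2.0)            # illustrative MTF curve
    nps = 0.8 + 0.4 * f               # illustrative normalized NPS
    q = 1.0                           # incident quanta per unit area (schematic)

    dqe = mtf**2 / (q * nps)          # DQE(f) up to a normalization constant
    print(f"DQE(0) = {dqe[0]:.3f}, DQE(1/mm) = {np.interp(1.0, f, dqe):.3f}")
    ```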

  3. Lattice Boltzmann computation of creeping fluid flow in roll-coating applications

    NASA Astrophysics Data System (ADS)

    Rajan, Isac; Kesana, Balashanker; Perumal, D. Arumuga

    2018-04-01

The Lattice Boltzmann Method (LBM) has advanced as a class of Computational Fluid Dynamics (CFD) methods used to solve complex fluid systems and heat transfer problems. It has increasingly attracted the interest of researchers in computational physics for solving challenging problems of industrial and academic importance. In the current study, LBM is applied to simulate the creeping fluid flow phenomena commonly encountered in manufacturing technologies. In particular, we apply this method to simulate the fluid flow phenomena associated with the "meniscus roll coating" application. This prevalent industrial problem, encountered in polymer processing and thin film coating applications, is modelled as a standard lid-driven cavity problem to which creeping flow analysis is applied. This incompressible viscous flow problem is studied at various speed ratios (the ratio of upper to lower lid speed) in two different configurations of lid movement: parallel and anti-parallel wall motion. The flow exhibits interesting patterns which will help in the design of roll coaters.
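    The collide-and-stream kernel that such an LBM solver is built around can be written very compactly. The sketch below is a minimal D2Q9 BGK kernel on a fully periodic grid; the wall bounce-back and moving-lid boundary conditions needed for the actual driven-cavity problem are omitted for brevity, and the parameters are illustrative, not the authors' setup.

    ```python
    # Minimal D2Q9 lattice Boltzmann (BGK) collide-and-stream kernel on a
    # periodic grid; cavity wall/lid boundary conditions omitted for brevity.
    import numpy as np

    c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
    w = np.array([4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36])
    nx, ny, tau = 64, 64, 0.8                 # grid size and BGK relaxation time

    def equilibrium(rho, ux, uy):
        cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
        usq = 1.5 * (ux * ux + uy * uy)
        return rho * w[:, None, None] * (1.0 + cu + 0.5 * cu * cu - usq)

    f = equilibrium(np.ones((nx, ny)), np.zeros((nx, ny)), np.zeros((nx, ny)))
    for step in range(100):
        rho = f.sum(axis=0)                                # density moment
        ux = (c[:, 0, None, None] * f).sum(axis=0) / rho   # velocity moments
        uy = (c[:, 1, None, None] * f).sum(axis=0) / rho
        f += (equilibrium(rho, ux, uy) - f) / tau          # BGK collision
        for i in range(9):                                 # streaming step
            f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    print(rho.mean())   # mass is conserved (stays at 1.0)
    ```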

  4. Computer Simulations of the Tumor Vasculature: Applications to Interstitial Fluid Flow, Drug Delivery, and Oxygen Supply.

    PubMed

    Welter, Michael; Rieger, Heiko

    2016-01-01

Tumor vasculature, the blood vessel network supplying a growing tumor with nutrients such as oxygen or glucose, is in many respects different from the hierarchically organized arterio-venous blood vessel network in normal tissues. Angiogenesis (the formation of new blood vessels), vessel cooption (the integration of existing blood vessels into the tumor vasculature), and vessel regression remodel the healthy vascular network into a tumor-specific vasculature. Integrative models, based on detailed experimental data and physical laws, implement, in silico, the complex interplay of molecular pathways, cell proliferation, migration, and death, tissue microenvironment, mechanical and hydrodynamic forces, and the fine structure of the host tissue vasculature. With the help of computer simulations, high-precision information about blood flow patterns, interstitial fluid flow, drug distribution, and oxygen and nutrient distribution can be obtained, and a plethora of therapeutic protocols can be tested before clinical trials. This chapter provides an overview of the current status of computer simulations of vascular remodeling during tumor growth, including interstitial fluid flow, drug delivery, and oxygen supply within the tumor. The model predictions are compared with experimental and clinical data, and a number of longstanding physiological paradigms about tumor vasculature and intratumoral solute transport are critically scrutinized.

  5. Dynamic VMs placement for energy efficiency by PSO in cloud computing

    NASA Astrophysics Data System (ADS)

    Dashti, Seyed Ebrahim; Rahmani, Amir Masoud

    2016-03-01

Recently, cloud computing has been growing fast, helping to realise other high technologies. In this paper, we propose a hierarchical architecture to satisfy both providers' and consumers' requirements in these technologies. We design a new service in the PaaS layer for scheduling consumer tasks. From the providers' perspective, incompatibility between the specification of physical machines and user requests in the cloud leads to problems such as the energy-performance trade-off and large power consumption, so that profits decrease. To guarantee the quality of service of users' tasks and to improve energy efficiency, we propose a modified Particle Swarm Optimisation to reallocate migrated virtual machines from the overloaded host. We also dynamically consolidate the under-loaded host, which provides power saving. Simulation results in CloudSim demonstrate that when the simulation conditions are close to a real environment, our method is able to save as much as 14% more energy, and the number of migrations and the simulation time are significantly reduced compared with previous works.
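    The velocity/position update at the heart of any PSO variant, including modified ones like the paper's, follows the canonical form below. This is a generic sketch on a toy cost function; the cost, constants, and swarm size are illustrative and unrelated to the paper's VM-placement objective.

    ```python
    # Canonical particle swarm optimisation on a toy cost function; the cost,
    # constants, and swarm size are illustrative, not the paper's VM model.
    import numpy as np

    rng = np.random.default_rng(1)
    n, dim, w, c1, c2 = 20, 4, 0.7, 1.5, 1.5    # swarm size and PSO constants
    cost = lambda x: np.sum(x**2, axis=-1)      # toy "energy" to minimize

    x = rng.uniform(-5, 5, (n, dim))            # particle positions
    v = np.zeros((n, dim))                      # particle velocities
    pbest, pbest_val = x.copy(), cost(x)        # personal bests
    g = pbest[np.argmin(pbest_val)]             # global best

    for it in range(200):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        val = cost(x)
        better = val < pbest_val                # update personal bests
        pbest[better], pbest_val[better] = x[better], val[better]
        g = pbest[np.argmin(pbest_val)]         # update global best
    print(f"best cost after 200 iterations: {cost(g):.2e}")
    ```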

  6. Effective height of chimney for biomass cook stove simulated by computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Faisal; Setiawan, A.; Wusnah; Khairil; Luthfi

    2018-02-01

This paper presents the results of numerical modelling of the temperature distribution and flow pattern in a biomass cooking stove using CFD simulation. The biomass stove has been designed to suit the household cooking process. The stove consists of two pots. The first is the main pot, located on top of the combustion chamber, where the heat from the combustion process is directly received. The second pot absorbs the heat from the exhaust gas. A chimney installed at the end of the stove releases the exhaust gas to the ambient air. During the tests, the height of the chimney was varied to find the highest temperatures at both pots. Results showed that the chimney height giving the highest pot temperatures is 1.65 m. This chimney height was validated by developing a computational fluid dynamics model. Experimental and simulation results show good agreement and help in fine-tuning the design of the biomass cooking stove.

  7. Using a Computer Simulation to Improve Psychological Readiness for Job Interviewing in Unemployed Individuals of Pre-Retirement Age

    PubMed Central

    Aysina, Rimma M.; Efremova, Galina I.; Maksimenko, Zhanna A.; Nikiforov, Mikhail V.

    2017-01-01

Unemployed individuals of pre-retirement age face significant challenges in finding a new job. This may be partly due to their lack of psychological readiness to go through a job interview. We view psychological readiness as one of the psychological attitude components. It is an active conscious readiness to interact with a certain aspect of reality, based on previously acquired experience. It includes a person's special competence to manage their activities and cope with anxiety. We created Job Interview Simulation Training (JIST) - a computer-based simulator, which allowed unemployed job seekers to practice interviewing repeatedly in a stress-free environment. We hypothesized that completion of JIST would be related to an increase in pre-retirement job seekers' psychological readiness for job interviewing in real life. Participants were randomized into control (n = 18) and experimental (n = 21) conditions. Both groups completed pre- and post-intervention job interview role-plays and self-reporting forms of psychological readiness for job interviewing. JIST consisted of 5 sessions of a simulated job interview, and the experimental group found it easy to use and navigate as well as helpful for preparing for interviewing. After finishing the JIST sessions, the experimental group showed a significant decrease in heart rate during the post-intervention role-play and a significant increase in their self-rated psychological readiness, whereas the control group showed no changes in these variables. Future research may help clarify whether JIST is related to an increase in re-employment of pre-retirement job seekers. PMID:28580025

  9. An application for multi-person task synchronization

    NASA Technical Reports Server (NTRS)

    Brown, Robert L.; Doyle, Dee

    1990-01-01

    Computer applications are studied that will enable a group of people to synchronize their actions when following a predefined task sequence. It is assumed that the people involved only have computer workstations available to them for communication. Hence, the approach is to study how the computer can be used to help a group remain synchronized. A series of applications were designed and developed that can be used as vehicles for experimentation. An example of how this technique can be used for a remote coaching capability is explained in a report describing an experiment that simulated a Life Sciences experiment on-board Space Station Freedom, with a ground based principal investigator providing the expertise by coaching the on-orbit mission specialist.

  10. Computational neurobiology is a useful tool in translational neurology: the example of ataxia

    PubMed Central

    Brown, Sherry-Ann; McCullough, Louise D.; Loew, Leslie M.

    2014-01-01

Hereditary ataxia, or motor incoordination, affects approximately 150,000 Americans and hundreds of thousands of individuals worldwide, with onset from as early as mid-childhood. Affected individuals exhibit dysarthria, dysmetria, action tremor, and dysdiadochokinesia. In this review, we consider an array of computational studies derived from experimental observations relevant to human neuropathology. A survey of related studies illustrates the impact of integrating clinical evidence with data from mouse models and computational simulations. Results from these studies may help explain findings in mice and, after extensive laboratory study, may ultimately be translated to ataxic individuals. This inquiry lays a foundation for using computation to understand the neurobiochemical and electrophysiological pathophysiology of spinocerebellar ataxias and may contribute to the development of therapeutics. The interdisciplinary analysis suggests that computational neurobiology can be an important tool for translational neurology. PMID:25653585

  11. Radio Emission during the interaction of two Interplanetary Coronal Mass Ejections

    NASA Astrophysics Data System (ADS)

    Lara, Alejandro; Niembro, Tatiana; González, Ricardo

    2016-07-01

We show that some sporadic radio emission observed by the WIND/WAVES experiment in the decametric/kilometric bands is due to the interaction of two interplanetary Coronal Mass Ejections (ICMEs). We have performed hydrodynamic simulations of the evolution of two consecutive Coronal Mass Ejections in the interplanetary medium. With these simulations it is possible to follow the density evolution of the merged structure and, therefore, compute the frequency limits of the possible plasma emission. We studied four well-documented ICME interaction events and found radio emission at the times and frequencies predicted by the simulations. This emission may help to anticipate the complexity of the merged region before it reaches 1 AU.
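    The frequency window of the expected plasma emission follows from the local electron plasma frequency, for which a standard approximation is f_pe[kHz] ≈ 8.98 sqrt(n_e[cm^-3]), with emission near f_pe (fundamental) and 2 f_pe (harmonic). The densities in the sketch below are illustrative values for a compressed merged region, not the paper's data.

    ```python
    # Electron plasma frequency f_pe[kHz] ~= 8.98 * sqrt(n_e [cm^-3]); emission
    # occurs near f_pe and 2*f_pe. The densities below are illustrative only.
    import math

    def plasma_frequency_khz(n_e_cm3: float) -> float:
        """Electron plasma frequency in kHz for a density given in cm^-3."""
        return 8.98 * math.sqrt(n_e_cm3)

    for n_e in (5.0, 20.0, 100.0):    # electron densities, cm^-3
        f_pe = plasma_frequency_khz(n_e)
        print(f"n_e = {n_e:6.1f} cm^-3 -> f_pe = {f_pe:6.1f} kHz, "
              f"2f_pe = {2 * f_pe:6.1f} kHz")
    ```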

  12. Soft Computing Techniques for the Protein Folding Problem on High Performance Computing Architectures.

    PubMed

    Llanes, Antonio; Muñoz, Andrés; Bueno-Crespo, Andrés; García-Valverde, Teresa; Sánchez, Antonia; Arcas-Túnez, Francisco; Pérez-Sánchez, Horacio; Cecilia, José M

    2016-01-01

The protein-folding problem has been extensively studied during the last fifty years. Understanding the dynamics of the global shape of a protein and its influence on biological function can help us to discover new and more effective drugs for diseases of pharmacological relevance. Different computational approaches have been developed in order to foresee the three-dimensional arrangement of the atoms of proteins from their sequences. However, the computational complexity of this problem makes the search for new models, novel algorithmic strategies, and hardware platforms that provide solutions in a reasonable time frame mandatory. In this review we present past and current trends in protein folding simulations from both perspectives: hardware and software. Of particular interest are both the use of inexact solutions to this computationally hard problem and the hardware platforms that have been used to run this kind of soft computing technique.

  13. A hybrid genetic-simulated annealing algorithm for the location-inventory-routing problem considering returns under e-supply chain environment.

    PubMed

    Li, Yanhui; Guo, Hao; Wang, Lin; Fu, Jing

    2013-01-01

Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of logistics systems for e-business. Meanwhile, the return ratio in Internet sales is significantly higher than in traditional business. Much of the returned merchandise has no quality defects and can reenter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing problem model with no-quality-defect returns. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results on numerical examples show that HGSAA outperforms a GA in computing time, solution quality, and computing stability. The proposed model is very useful for helping managers make the right decisions in an e-supply chain environment.
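    The simulated-annealing half of a hybrid GA-SA typically applies the Metropolis criterion, accepting a worse neighbour with probability exp(-delta/T) under a cooling schedule. The sketch below shows that acceptance step on a one-dimensional toy objective; the cost function and schedule are illustrative, not the paper's location-inventory-routing model.

    ```python
    # Sketch of the Metropolis acceptance step used by simulated annealing;
    # the toy objective and cooling schedule are illustrative only.
    import math
    import random

    random.seed(7)
    cost = lambda x: (x - 3.0) ** 2 + math.sin(5 * x)   # toy objective

    x, T, alpha = 0.0, 10.0, 0.95                       # state, temperature, cooling
    for step in range(500):
        cand = x + random.uniform(-0.5, 0.5)            # neighbour move
        delta = cost(cand) - cost(x)
        # always accept improvements; accept worse moves with prob exp(-delta/T)
        if delta < 0 or random.random() < math.exp(-delta / T):
            x = cand
        T *= alpha                                      # geometric cooling
    print(f"x = {x:.3f}, cost = {cost(x):.3f}")
    ```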

  14. A Numerical Analysis of Heat Transfer and Effectiveness on Film Cooled Turbine Blade Tip Models

    NASA Technical Reports Server (NTRS)

    Ameri, A. A.; Rigby, D. L.

    1999-01-01

    A computational study has been performed to predict the distribution of convective heat transfer coefficient on a simulated blade tip with cooling holes. The purpose of the examination was to assess the ability of a three-dimensional Reynolds-averaged Navier-Stokes solver to predict the rate of tip heat transfer and the distribution of cooling effectiveness. To this end, the simulation of tip clearance flow with blowing of Kim and Metzger was used. The agreement of the computed effectiveness with the data was quite good. The agreement with the heat transfer coefficient was not as good but improved away from the cooling holes. Numerical flow visualization showed that the uniformity of wetting of the surface by the film cooling jet is helped by the reverse flow due to edge separation of the main flow.

  15. Using Model Replication to Improve the Reliability of Agent-Based Models

    NASA Astrophysics Data System (ADS)

    Zhong, Wei; Kim, Yushim

The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the artificial society and simulation community due to the challenges of model verification and validation. Illustrating the replication in NetLogo, by a different author, of an ABM representing fraudulent behavior in a public service delivery system originally developed in the Java-based MASON toolkit, this paper exemplifies how model replication exercises provide unique opportunities for the model verification and validation process. At the same time, it helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.

  16. Computational ecology as an emerging science

    PubMed Central

    Petrovskii, Sergei; Petrovskaya, Natalia

    2012-01-01

    It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from those that normally exist in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have mathematically accurate and unambiguous description, and available field data are often very noisy, and hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336

  17. Simulating the X-Ray Image Contrast to Set-Up Techniques with Desired Flaw Detectability

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2015-01-01

The paper provides simulation data extending previous work by the author on a model for estimating the detectability of crack-like flaws in radiography. The methodology is being developed to help in the implementation of NASA Special x-ray radiography qualification, but is generically applicable to radiography. The paper describes a method for characterizing X-ray detector resolution for crack detection. The applicability of ASTM E 2737 resolution requirements to the model is also discussed. The paper describes a model for simulating the detector resolution. A computer calculator application, discussed here, also performs predicted contrast and signal-to-noise ratio calculations. Results of various simulation runs calculating the x-ray flaw size parameter and image contrast for varying input parameters, such as crack depth, crack width, part thickness, x-ray angle, part-to-detector distance, part-to-source distance, source size, and detector sensitivity and resolution, are given as 3D surfaces. These results demonstrate the effect of the input parameters on the flaw size parameter and the simulated image contrast of the crack. These simulations demonstrate the utility of the flaw size parameter model in setting up x-ray techniques that provide the desired flaw detectability in radiography. The method is applicable to film radiography, computed radiography, and digital radiography.

  18. Monte Carlo simulations in X-ray imaging

    NASA Astrophysics Data System (ADS)

    Giersch, Jürgen; Durst, Jürgen

    2008-06-01

Monte Carlo simulations have become crucial tools in many fields of X-ray imaging. They help to understand the influence of physical effects such as absorption, scattering, and fluorescence of photons in different detector materials on image quality parameters. They allow studying new imaging concepts like photon counting, energy weighting, or material reconstruction. Additionally, they can be applied in the field of nuclear medicine to define virtual setups for studying new geometries or image reconstruction algorithms. Furthermore, an implementation of the propagation physics of electrons and photons allows studying the behavior of (novel) X-ray generation concepts. This versatility of Monte Carlo simulations is illustrated with some examples performed with the Monte Carlo simulation tool ROSI. An overview of the structure of ROSI is given as an example of a modern, well-proven, object-oriented, parallel-computing Monte Carlo simulation for X-ray imaging.

  19. Operating system for a real-time multiprocessor propulsion system simulator. User's manual

    NASA Technical Reports Server (NTRS)

    Cole, G. L.

    1985-01-01

    The NASA Lewis Research Center is developing and evaluating experimental hardware and software systems to help meet future needs for real-time, high-fidelity simulations of air-breathing propulsion systems. Specifically, the real-time multiprocessor simulator project focuses on the use of multiple microprocessors to achieve the required computing speed and accuracy at relatively low cost. Operating systems for such hardware configurations are generally not available. A real time multiprocessor operating system (RTMPOS) that supports a variety of multiprocessor configurations was developed at Lewis. With some modification, RTMPOS can also support various microprocessors. RTMPOS, by means of menus and prompts, provides the user with a versatile, user-friendly environment for interactively loading, running, and obtaining results from a multiprocessor-based simulator. The menu functions are described and an example simulation session is included to demonstrate the steps required to go from the simulation loading phase to the execution phase.

  20. 2011 Computation Directorate Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L

    2012-04-11

From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling for missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence. Specifically, ASCI/ASC accelerated the development of simulation capabilities necessary to ensure confidence in the nuclear stockpile, far exceeding what might have been achieved in the absence of a focused initiative. While stockpile stewardship research pushed LLNL scientists to develop new computer codes, better simulation methods, and improved visualization technologies, this work also stimulated the exploration of HPC applications beyond the standard sponsor base. As LLNL advances to a petascale platform and pursues exascale computing (1,000 times faster than Sequoia), ASC will be paramount to achieving predictive simulation and uncertainty quantification. Predictive simulation and quantifying the uncertainty of numerical predictions where little-to-no data exists demands exascale computing and represents an expanding area of scientific research important not only to nuclear weapons, but to nuclear attribution, nuclear reactor design, and understanding global climate issues, among other fields. Aside from these lofty goals and challenges, computing at LLNL is anything but 'business as usual.' International competition in supercomputing is nothing new, but the HPC community is now operating in an expanded, more aggressive climate of global competitiveness. More countries understand how science and technology research and development are inextricably linked to economic prosperity, and they are aggressively pursuing ways to integrate HPC technologies into their native industrial and consumer products.
In the interest of the nation's economic security and the science and technology that underpins it, LLNL is expanding its portfolio and forging new collaborations. We must ensure that HPC remains an asymmetric engine of innovation for the Laboratory and for the U.S. and, in doing so, protect our research and development dynamism and the prosperity it makes possible. One untapped area of opportunity LLNL is pursuing is to help U.S. industry understand how supercomputing can benefit their business. Industrial investment in HPC applications has historically been limited by the prohibitive cost of entry, the inaccessibility of software to run the powerful systems, and the years it takes to grow the expertise to develop codes and run them in an optimal way. LLNL is helping industry better compete in the global market place by providing access to some of the world's most powerful computing systems, the tools to run them, and the experts who are adept at using them. Our scientists are collaborating side by side with industrial partners to develop solutions to some of industry's toughest problems. The goal of the Livermore Valley Open Campus High Performance Computing Innovation Center is to allow American industry the opportunity to harness the power of supercomputing by leveraging the scientific and computational expertise at LLNL in order to gain a competitive advantage in the global economy.

  1. Program Helps Simulate Neural Networks

    NASA Technical Reports Server (NTRS)

    Villarreal, James; Mcintire, Gary

    1993-01-01

    Neural Network Environment on Transputer System (NNETS) computer program provides users high degree of flexibility in creating and manipulating wide variety of neural-network topologies at processing speeds not found in conventional computing environments. Supports back-propagation and back-propagation-related algorithms. Back-propagation algorithm used is implementation of Rumelhart's generalized delta rule. NNETS developed on INMOS Transputer(R). Predefines back-propagation network, Jordan network, and reinforcement network to assist users in learning and defining own networks. Also enables users to configure other neural-network paradigms from NNETS basic architecture. Small portion of software written in OCCAM(R) language.
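    The generalized delta rule that NNETS implements can be sketched compactly: forward-propagate through sigmoid layers, back-propagate the error deltas, and update each weight in proportion to its input and its layer's delta. The NumPy sketch below trains a one-hidden-layer network on XOR; it is a plain illustration of the rule, not the NNETS/Transputer implementation itself, and the network size and learning rate are arbitrary choices.

    ```python
    # Minimal sketch of Rumelhart's generalized delta rule (back-propagation)
    # for a one-hidden-layer sigmoid network learning XOR; illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)    # XOR targets

    W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)     # input -> hidden
    W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)     # hidden -> output
    eta = 1.0                                          # learning rate

    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for epoch in range(10000):
        h = sigmoid(X @ W1 + b1)                       # forward pass, hidden layer
        out = sigmoid(h @ W2 + b2)                     # forward pass, output layer
        d_out = (out - y) * out * (1 - out)            # output delta (squared error)
        d_h = (d_out @ W2.T) * h * (1 - h)             # back-propagated hidden delta
        W2 -= eta * h.T @ d_out; b2 -= eta * d_out.sum(axis=0)
        W1 -= eta * X.T @ d_h;   b1 -= eta * d_h.sum(axis=0)
    print(out.round(2).ravel())                        # converges toward [0 1 1 0]
    ```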

  2. A mathematical model and computational framework for three-dimensional chondrocyte cell growth in a porous tissue scaffold placed inside a bi-directional flow perfusion bioreactor.

    PubMed

    Shakhawath Hossain, Md; Bergstrom, D J; Chen, X B

    2015-12-01

The in vitro chondrocyte cell culture for cartilage tissue regeneration in a perfusion bioreactor is a complex process. Mathematical modeling and computational simulation can provide important insights into the culture process, which would be helpful for selecting culture conditions to improve the quality of the developed tissue constructs. However, simulation of the cell culture process is a challenging task due to the complicated interaction between the cells and the local fluid flow and nutrient transport inside complex porous scaffolds. In this study, a mathematical model and computational framework have been developed to simulate three-dimensional (3D) cell growth in a porous scaffold placed inside a bi-directional flow perfusion bioreactor. The model was developed by taking into account the two-way coupling between the cell growth and the local flow field and associated glucose concentration, and then used to perform a resolved-scale simulation based on the lattice Boltzmann method (LBM). The simulation predicts the local shear stress, glucose concentration, and 3D cell growth inside the porous scaffold for a period of 30 days of cell culture. The predicted cell growth rate was in good overall agreement with the experimental results available in the literature. This study demonstrates that the bi-directional flow perfusion culture system can enhance the homogeneity of the cell growth inside the scaffold. The model and computational framework developed are capable of providing significant insight into the culture process, offering a powerful tool for the design and optimization of the cell culture process. © 2015 Wiley Periodicals, Inc.

  3. A low noise discrete velocity method for the Boltzmann equation with quantized rotational and vibrational energy

    NASA Astrophysics Data System (ADS)

    Clarke, Peter; Varghese, Philip; Goldstein, David

    2018-01-01

    A discrete velocity method is developed for gas mixtures of diatomic molecules with both rotational and vibrational energy states. A full quantized model is described, and rotation-translation and vibration-translation energy exchanges are simulated using a Larsen-Borgnakke exchange model. Elastic and inelastic molecular interactions are modeled during every simulated collision to help produce smooth internal energy distributions. The method is verified by comparing simulations of homogeneous relaxation by our discrete velocity method to numerical solutions of the Jeans and Landau-Teller equations, and to direct simulation Monte Carlo. We compute the structure of a 1D shock using this method, and determine how the rotational energy distribution varies with spatial location in the shock and with position in velocity space.
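    One of the verification targets mentioned above, the Landau-Teller equation, describes vibrational relaxation toward equilibrium as dE_v/dt = (E_v^eq - E_v)/tau_v. The sketch below integrates it with a simple explicit Euler step and compares against the analytic exponential solution; the relaxation time and energies are illustrative values, not the paper's gas model.

    ```python
    # Sketch of the Landau-Teller relaxation law dE_v/dt = (E_eq - E_v)/tau_v,
    # integrated by explicit Euler and checked against the exact exponential;
    # all parameter values are illustrative.
    import numpy as np

    tau_v = 2.0e-6        # vibrational relaxation time, s (illustrative)
    E_eq = 1.0            # equilibrium vibrational energy (arbitrary units)
    E0, dt = 0.1, 1.0e-8  # initial energy and time step

    E, ts, Es = E0, [0.0], [E0]
    for n in range(1, 1001):
        E += dt * (E_eq - E) / tau_v          # explicit Euler update
        ts.append(n * dt); Es.append(E)

    # analytic solution: E(t) = E_eq + (E0 - E_eq) * exp(-t / tau_v)
    exact = E_eq + (E0 - E_eq) * np.exp(-np.array(ts) / tau_v)
    print(f"numerical E(t_end) = {Es[-1]:.6f}, exact = {exact[-1]:.6f}")
    ```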

  4. Challenges in Computational Social Modeling and Simulation for National Security Decision Making

    DTIC Science & Technology

    2011-06-01

This study is grounded within a system-activity theory, a logico-philosophical model of interdisciplinary research [13, 14], the concepts of social...often a difficult challenge. Ironically, social science research methods, such as ethnography, may be tremendously helpful in designing these...social sciences. Moreover, CSS projects draw on knowledge and methods from other fields of study, including graph theory, information visualization

  5. Experimental Validation of a Closed Brayton Cycle System Transient Simulation

    NASA Technical Reports Server (NTRS)

    Johnson, Paul K.; Hervol, David S.

    2006-01-01

    The Brayton Power Conversion Unit (BPCU) located at NASA Glenn Research Center (GRC) in Cleveland, Ohio was used to validate the results of a computational code known as Closed Cycle System Simulation (CCSS). Conversion system thermal transient behavior was the focus of this validation. The BPCU was operated at various steady state points and then subjected to transient changes involving shaft rotational speed and thermal energy input. These conditions were then duplicated in CCSS. Validation of the CCSS BPCU model provides confidence in developing future Brayton power system performance predictions, and helps to guide high power Brayton technology development.

  6. Electronic prototyping

    NASA Technical Reports Server (NTRS)

    Hopcroft, J.

    1987-01-01

    The potential benefits of automation in space are significant. The science base needed to support this automation not only will help control costs and reduce lead-time in the earth-based design and construction of space stations, but also will advance the nation's capability for computer design, simulation, testing, and debugging of sophisticated objects electronically. Progress in automation will require the ability to electronically represent, reason about, and manipulate objects. Discussed here is the development of representations, languages, editors, and model-driven simulation systems to support electronic prototyping. In particular, it identifies areas where basic research is needed before further progress can be made.

  7. Computational simulation of extravehicular activity dynamics during a satellite capture attempt.

    PubMed

    Schaffner, G; Newman, D J; Robinson, S K

    2000-01-01

    A more quantitative approach to the analysis of astronaut extravehicular activity (EVA) tasks is needed because of their increasing complexity, particularly in preparation for the on-orbit assembly of the International Space Station. Existing useful EVA computer analyses produce either high-resolution three-dimensional computer images based on anthropometric representations or empirically derived predictions of astronaut strength based on lean body mass and the position and velocity of body joints but do not provide multibody dynamic analysis of EVA tasks. Our physics-based methodology helps fill the current gap in quantitative analysis of astronaut EVA by providing a multisegment human model and solving the equations of motion in a high-fidelity simulation of the system dynamics. The simulation work described here improves on the realism of previous efforts by including three-dimensional astronaut motion, incorporating joint stops to account for the physiological limits of range of motion, and incorporating use of constraint forces to model interaction with objects. To demonstrate the utility of this approach, the simulation is modeled on an actual EVA task, namely, the attempted capture of a spinning Intelsat VI satellite during STS-49 in May 1992. Repeated capture attempts by an EVA crewmember were unsuccessful because the capture bar could not be held in contact with the satellite long enough for the capture latches to fire and successfully retrieve the satellite.

  8. Computational modeling of radiofrequency ablation: evaluation on ex vivo data using ultrasound monitoring

    NASA Astrophysics Data System (ADS)

    Audigier, Chloé; Kim, Younsu; Dillow, Austin; Boctor, Emad M.

    2017-03-01

Radiofrequency ablation (RFA) is the most widely used minimally invasive ablative therapy for liver cancer, but it is challenged by a lack of patient-specific monitoring. Inter-patient tissue variability and the presence of blood vessels make the prediction of the RFA outcome difficult. A monitoring tool that could be personalized for a given patient during the intervention would be helpful for achieving complete tumor ablation. However, clinicians do not have access to such a tool, which results in incomplete treatment and a large number of recurrences. Computational models can simulate the phenomena and mechanisms governing this therapy. The temperature evolution as well as the resulting ablation can be modeled. When combined with intraoperative measurements, computational modeling becomes an accurate and powerful tool to gain quantitative understanding and to enable improvements in ongoing clinical settings. This paper shows how computational models of RFA can be evaluated using intra-operative measurements. First, simulations are used to demonstrate the feasibility of the method, which is then evaluated on two ex vivo datasets. RFA is simulated on a simplified geometry to generate realistic longitudinal temperature maps and the resulting necrosis. Computed temperatures are compared with the temperature evolution recorded using thermometers, and with temperatures monitored by ultrasound (US) in a 2D plane containing the ablation tip. Two ablations are performed on two cadaveric bovine livers, and we achieve an average error of 2.2 °C between the computed and thermistor temperatures, and average errors of 1.4 °C and 2.7 °C between the temperatures computed and monitored by US during the ablation at two different time points (t = 240 s and t = 900 s).
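    Temperature models for RFA are commonly built on the Pennes bioheat equation, rho*c*dT/dt = k*laplacian(T) - w_b*c_b*(T - T_body) + Q. The sketch below solves a 1-D explicit finite-difference version with a Gaussian heat source standing in for the RF applicator; the tissue constants and source strength are generic illustrative values, not the paper's patient-specific model.

    ```python
    # Sketch of the kind of model behind RFA temperature prediction: a 1-D
    # explicit finite-difference Pennes bioheat equation with a Gaussian RF
    # source near the tip; all parameters are generic illustrative values.
    import numpy as np

    k, rho, cp = 0.5, 1060.0, 3600.0   # tissue conductivity, density, heat capacity
    w_b, c_b = 0.5, 3600.0             # blood perfusion rate and heat capacity
    nx, dx, dt = 101, 1e-3, 0.05       # grid and time step (dt < dx**2*rho*cp/(2k))
    x = np.linspace(0.0, 0.1, nx)
    T = np.full(nx, 37.0)              # start at body temperature, deg C
    Q = 6e5 * np.exp(-((x - 0.05) / 0.005) ** 2)   # RF heating near the tip, W/m^3

    alpha = k / (rho * cp)
    for step in range(int(240 / dt)):  # simulate 240 s of ablation
        lap = np.zeros(nx)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
        T += dt * (alpha * lap - w_b * c_b * (T - 37.0) / (rho * cp)
                   + Q / (rho * cp))
        T[0] = T[-1] = 37.0            # far-field boundaries held at body temp
    print(f"peak temperature after 240 s: {T.max():.1f} deg C")
    ```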

  9. Transitional hemodynamics in intracranial aneurysms - Comparative velocity investigations with high resolution lattice Boltzmann simulations, normal resolution ANSYS simulations, and MR imaging.

    PubMed

    Jain, Kartik; Jiang, Jingfeng; Strother, Charles; Mardal, Kent-André

    2016-11-01

Blood flow in intracranial aneurysms has, until recently, been considered to be disturbed but still laminar. Recent high resolution computational studies have demonstrated, in some situations, however, that the flow may exhibit high frequency fluctuations that resemble weakly turbulent or transitional flow. Due to the numerous assumptions required for simplification in computational fluid dynamics (CFD) studies, the occurrence of these events, in vivo, remains unsettled. The detection of these fluctuations in aneurysmal blood flow, i.e., hemodynamics, by CFD poses additional challenges, as such phenomena cannot be captured in clinical data acquisition with magnetic resonance (MR) due to inadequate temporal and spatial resolutions. The authors' purpose was to address this issue by comparing results from highly resolved simulations, conventional resolution laminar simulations, and MR measurements, identify the differences, and identify their causes. Two aneurysms in the basilar artery, one with disturbed yet laminar flow and the other with transitional flow, were chosen. One set of highly resolved direct numerical simulations using the lattice Boltzmann method (LBM) and another with adequate resolutions under the laminar flow assumption were conducted using the commercially available ANSYS Fluent solver. The velocity fields obtained from the simulation results were qualitatively and statistically compared against each other and against the MR acquisition. Results from LBM, ANSYS Fluent, and MR agree well qualitatively and quantitatively for the aneurysm with laminar flow, in which fluctuations were below 80 Hz. The comparisons for the second aneurysm, with high fluctuations of roughly 600 Hz, showed marked differences between LBM, ANSYS Fluent, and MR imaging. After ensemble averaging and down-sampling to coarser space and time scales, these differences became minimal. A combination of MR-derived data and CFD can be helpful in estimating the hemodynamic environment of intracranial aneurysms. Adequately resolved CFD would suffice for a gross assessment of hemodynamics, potentially in a clinical setting, while highly resolved CFD could be helpful for a detailed, retrospective understanding of the physiological mechanisms.

  10. The effect of force feedback on student reasoning about gravity, mass, force and motion

    NASA Astrophysics Data System (ADS)

    Bussell, Linda

    The purpose of this study was to examine whether force feedback within a computer simulation had an effect on reasoning by fifth grade students about gravity, mass, force, and motion, concepts which can be difficult for learners to grasp. Few studies have been done on cognitive learning and haptic feedback, particularly with young learners, but there is an extensive base of literature on children's conceptions of science and a number of studies focus specifically on children's conceptions of force and motion. This case study used a computer-based paddleball simulation with guided inquiry as the primary stimulus. Within the simulation, the learner could adjust the mass of the ball and the gravitational force. The experimental group used the simulation with visual and force feedback; the control group used the simulation with visual feedback but without force feedback. The proposition was that there would be differences in reasoning between the experimental and control groups, with force feedback being helpful with concepts that are more obvious when felt. Participants were 34 fifth-grade students from three schools. Students completed a modal (visual, auditory, and haptic) learning preference assessment and a pretest. The sessions, including participant experimentation and interviews, were audio recorded and observed. The interviews were followed by a written posttest. These data were analyzed to determine whether there were differences based on treatment, learning style, demographics, prior gaming experience, force feedback experience, or prior knowledge. Work with the simulation, regardless of group, was found to increase students' understanding of key concepts. The experimental group appeared to benefit from the supplementary help that force feedback provided. Those in the experimental group scored higher on the posttest than those in the control group. The greatest difference between mean group scores was on a question concerning the effects of increased gravitational force.

  11. Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review

    PubMed Central

    Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.

    2009-01-01

Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based methods, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques that are not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508

  12. Demo of three ways to use a computer to assist in lab

    NASA Technical Reports Server (NTRS)

    Neville, J. P.

    1990-01-01

    The objective is to help the slow learner and students with a language problem, or to challenge the advanced student. Technology has advanced to the point where images generated on a computer can easily be recorded on a VCR and used as a video tutorial. This transfer can be as simple as pointing a video camera at the screen and recording the image. For more clarity and professional results, a board may be inserted into a computer which will convert the signals directly to the TV standard. Using a computer program that generates movies one can animate various principles which would normally be impossible to show or would require time-lapse photography. For example, you might show the change in shape of grains as a piece of metal is cold worked and then show the recrystallization and grain growth as heat is applied. More imaginative titles and graphics are also possible using this technique. Remedial help may also be offered via computer to those who find a specific concept difficult. A printout of specific data, details of the theory or equipment set-up can be offered. Programs are now available that will help as well as test the student in specific areas so that a Keller type approach can be used with each student to insure each knows the subject before going on to the next topic. A computer can serve as an information source and contain the microstructures, physical data and availability of each material tested in the lab. With this source present unknowns can be evaluated and various tests simulated to create a simple or complex case study lab assignment.

  13. Three-Dimensional Liver Surgery Simulation: Computer-Assisted Surgical Planning with Three-Dimensional Simulation Software and Three-Dimensional Printing.

    PubMed

    Oshiro, Yukio; Ohkohchi, Nobuhiro

    2017-06-01

To perform accurate hepatectomy without injury, it is necessary to understand the anatomical relationship among the branches of Glisson's sheath, the hepatic veins, and the tumor. In Japan, three-dimensional (3D) preoperative simulation for liver surgery is becoming increasingly common, and liver 3D modeling and 3D hepatectomy simulation by 3D analysis software for liver surgery have been covered by universal healthcare insurance since 2012. Herein, we review the history of virtual hepatectomy using computer-assisted surgery (CAS) and our research to date, and we discuss the future prospects of CAS. We have used the SYNAPSE VINCENT medical imaging system (Fujifilm Medical, Tokyo, Japan) for 3D visualization and virtual resection of the liver since 2010. We developed a novel fusion imaging technique combining 3D computed tomography (CT) with magnetic resonance imaging (MRI). The fusion image enables us to easily visualize anatomic relationships among the hepatic arteries, portal veins, bile duct, and tumor in the hepatic hilum. In 2013, we developed an original software application, called Liversim, which enables real-time deformation of the liver using physical simulation, and a randomized controlled trial has recently been conducted to evaluate the use of Liversim and SYNAPSE VINCENT for preoperative simulation and planning. Furthermore, we developed a novel hollow 3D-printed liver model whose surface is covered with frames. This model is useful for safe liver resection, has better visibility, and its production cost is reduced to one-third of that of a previous model. Preoperative simulation and navigation with CAS in liver resection are expected to help in planning and conducting surgery, as well as in surgical education. Thus, a novel CAS system will contribute not only to the performance of reliable hepatectomy but also to surgical education.

  14. Hybrid neuro-heuristic methodology for simulation and control of dynamic systems over time interval.

    PubMed

    Woźniak, Marcin; Połap, Dawid

    2017-09-01

Simulation and positioning are very important aspects of computer-aided engineering, and both can be handled by traditional methods or by intelligent techniques. The difference between them lies in the way they process information. In the first case, to simulate an object in a particular state of action, we need to run the entire process to read the values of its parameters. This is not very convenient for objects whose simulation takes a long time, i.e., when the mathematical calculations are complicated. In the second case, an intelligent solution can support simulation efficiently by simulating the object only in the situations that are necessary for the development process. We present research results on an intelligent simulation and control model of an electric-drive vehicle. In the dedicated simulation method based on intelligent computation, an evolutionary strategy simulates the states of the dynamic model, and a dedicated neural-network system is introduced to control the co-working modules over the motion time interval. The experimental results show the implemented solution in a situation where a vehicle transports goods over an area with many obstacles, which provokes sudden changes in stability that may destroy the load. The neural network controller prevents the load from destruction by adjusting characteristics such as pressure, acceleration, and stiffness voltage to absorb adverse changes in the ground. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. The impact of home computer use on children's activities and development.

    PubMed

    Subrahmanyam, K; Kraut, R E; Greenfield, P M; Gross, E F

    2000-01-01

    The increasing amount of time children are spending on computers at home and school has raised questions about how the use of computer technology may make a difference in their lives--from helping with homework to causing depression to encouraging violent behavior. This article provides an overview of the limited research on the effects of home computer use on children's physical, cognitive, and social development. Initial research suggests, for example, that access to computers increases the total amount of time children spend in front of a television or computer screen at the expense of other activities, thereby putting them at risk for obesity. At the same time, cognitive research suggests that playing computer games can be an important building block to computer literacy because it enhances children's ability to read and visualize images in three-dimensional space and track multiple images simultaneously. The limited evidence available also indicates that home computer use is linked to slightly better academic performance. The research findings are more mixed, however, regarding the effects on children's social development. Although little evidence indicates that the moderate use of computers to play games has a negative impact on children's friendships and family relationships, recent survey data show that increased use of the Internet may be linked to increases in loneliness and depression. Of most concern are the findings that playing violent computer games may increase aggressiveness and desensitize a child to suffering, and that the use of computers may blur a child's ability to distinguish real life from simulation. The authors conclude that more systematic research is needed in these areas to help parents and policymakers maximize the positive effects and to minimize the negative effects of home computers in children's lives.

  16. Development of space simulation / net-laboratory system

    NASA Astrophysics Data System (ADS)

    Usui, H.; Matsumoto, H.; Ogino, T.; Fujimoto, M.; Omura, Y.; Okada, M.; Ueda, H. O.; Murata, T.; Kamide, Y.; Shinagawa, H.; Watanabe, S.; Machida, S.; Hada, T.

    A research project for the development of a space simulation / net-laboratory system was approved by the Japan Science and Technology Corporation (JST) in the category Research and Development for Applying Advanced Computational Science and Technology (ACT-JST) in 2000. This research project, which runs for three years, is a collaboration with an astrophysical simulation group as well as other space simulation groups that use MHD and hybrid models. In this project, we are developing a prototype of a unique simulation system that enables us to perform simulation runs by providing or selecting plasma parameters through a Web-based interface on the Internet. We are also developing an on-line database system for space simulation from which we will be able to search and extract various kinds of information, such as simulation methods and programs, manuals, and typical simulation results in graphic or ASCII format. This unique system will help simulation beginners start simulation studies without much difficulty or effort, and contribute to the promotion of simulation studies in the STP field. In this presentation, we report the overview and current status of the project.

  17. An automated and reproducible workflow for running and analyzing neural simulations using Lancet and IPython Notebook

    PubMed Central

    Stevens, Jean-Luc R.; Elver, Marco; Bednar, James A.

    2013-01-01

    Lancet is a new, simulator-independent Python utility for succinctly specifying, launching, and collating results from large batches of interrelated computationally demanding program runs. This paper demonstrates how to combine Lancet with IPython Notebook to provide a flexible, lightweight, and agile workflow for fully reproducible scientific research. This informal and pragmatic approach uses IPython Notebook to capture the steps in a scientific computation as it is gradually automated and made ready for publication, without mandating the use of any separate application that can constrain scientific exploration and innovation. The resulting notebook concisely records each step involved in even very complex computational processes that led to a particular figure or numerical result, allowing the complete chain of events to be replicated automatically. Lancet was originally designed to help solve problems in computational neuroscience, such as analyzing the sensitivity of a complex simulation to various parameters, or collecting the results from multiple runs with different random starting points. However, because it is never possible to know in advance what tools might be required in future tasks, Lancet has been designed to be completely general, supporting any type of program as long as it can be launched as a process and can return output in the form of files. For instance, Lancet is also heavily used by one of the authors in a separate research group for launching batches of microprocessor simulations. This general design will allow Lancet to continue supporting a given research project even as the underlying approaches and tools change. PMID:24416014
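
    Lancet's actual API is not shown in the abstract. As a generic illustration of the launch-and-collate pattern it automates (one process per parameter combination, results gathered back from output files), consider the hedged sketch below; the script name 'simulate.py' and its flags are invented placeholders.

    ```python
    import itertools
    import pathlib
    import subprocess

    # Hypothetical parameter sweep in the spirit of Lancet's launch/collate
    # workflow (Lancet's real API differs; see its documentation).
    params = {"rate": [0.1, 0.5], "seed": [1, 2, 3]}
    outdir = pathlib.Path("runs")
    outdir.mkdir(exist_ok=True)

    for rate, seed in itertools.product(params["rate"], params["seed"]):
        out = outdir / f"rate{rate}_seed{seed}.txt"
        # Any program launchable as a process qualifies; 'simulate.py' is a
        # placeholder for the simulation being batched.
        subprocess.run(["python", "simulate.py", f"--rate={rate}",
                        f"--seed={seed}", f"--out={out}"], check=True)

    # Collation step: gather results back for analysis in a notebook.
    results = {p.name: p.read_text() for p in outdir.glob("*.txt")}
    ```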

  18. Estimation of in-situ bioremediation system cost using a hybrid Extreme Learning Machine (ELM)-particle swarm optimization approach

    NASA Astrophysics Data System (ADS)

    Yadav, Basant; Ch, Sudheer; Mathur, Shashi; Adamowski, Jan

    2016-12-01

    In-situ bioremediation is the most common groundwater remediation procedure for treating organically contaminated sites. A simulation-optimization approach, which embeds a simulation model of groundwater flow and transport processes within an optimization program, can help engineers design a remediation system that best satisfies management objectives as well as regulatory constraints. In-situ bioremediation is a highly complex, non-linear process, and modelling such a system requires significant computational effort. Soft computing techniques have a flexible mathematical structure that can generalize complex nonlinear processes. In in-situ bioremediation management, a physically based model performs the simulation, and the simulated data are used by the optimization model to minimize the remediation cost. Repeatedly calling the simulator to check the constraints is extremely tedious and time consuming, so a simulator that reduces this computational burden is needed. This study presents a simulation-optimization approach for an accurate and cost-effective in-situ bioremediation system design for groundwater contaminated with BTEX (Benzene, Toluene, Ethylbenzene, and Xylenes) compounds. The Extreme Learning Machine (ELM) is used as a proxy simulator to replace BIOPLUME III. ELM was selected after a comparative analysis with Artificial Neural Network (ANN) and Support Vector Machine (SVM) models, both of which were used successfully in previous studies of in-situ bioremediation system design. Further, a single-objective optimization problem is solved by a coupled Extreme Learning Machine (ELM)-Particle Swarm Optimization (PSO) technique to minimize the cost of the in-situ bioremediation system design. The results indicate that ELM is a faster and more accurate proxy simulator than ANN and SVM, and the total cost obtained by the ELM-PSO approach is minimized while satisfying all the regulatory constraints of the contaminated site.
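
    The abstract does not give the ELM equations. For orientation, the core of an ELM is a hidden layer with fixed random weights followed by a closed-form least-squares output layer; the sketch below shows that structure under stated assumptions, with the training pairs acting as hypothetical stand-ins for (remediation design -> simulated contaminant response) samples that BIOPLUME III would otherwise produce.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def elm_fit(X, y, n_hidden=50):
        """Extreme Learning Machine: random hidden layer, least-squares output."""
        W = rng.normal(size=(X.shape[1], n_hidden))   # fixed random input weights
        b = rng.normal(size=n_hidden)
        H = np.tanh(X @ W + b)                        # hidden-layer activations
        beta = np.linalg.pinv(H) @ y                  # closed-form output weights
        return W, b, beta

    def elm_predict(model, X):
        W, b, beta = model
        return np.tanh(X @ W + b) @ beta

    # Hypothetical stand-in for (design variables -> simulated response) data.
    X = rng.uniform(0, 1, size=(200, 3))
    y = np.sin(X.sum(axis=1)) + 0.01 * rng.normal(size=200)
    model = elm_fit(X, y)
    print(np.abs(elm_predict(model, X) - y).mean())   # mean training error
    ```

    Inside the PSO loop, each candidate remediation design would then be scored by elm_predict instead of a full BIOPLUME III run, which is the source of the reported speed-up.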

  19. The nature of undergraduates' conceptual understanding of oxygen transport and utilization in humans: Can cardiopulmonary simulation software enhance learning of propositional knowledge and/or diagnose alternative conceptions in novices and intermediates?

    NASA Astrophysics Data System (ADS)

    Wissing, Dennis Robert

    The purpose of this research was to explore undergraduates' conceptual development regarding oxygen transport and utilization, as a component of cardiopulmonary physiology and advanced respiratory care courses in an allied health program. This exploration focused on students' development of knowledge and the presence of alternative conceptions prior to, during, and after completing cardiopulmonary physiology and advanced respiratory care courses. Using the simulation program SimBioSys™ (Samsel, 1994), student-participants completed a series of laboratory exercises focusing on cardiopulmonary disease states. This study examined data gathered from: (1) a novice group receiving the simulation program prior to instruction, (2) a novice group that experienced the simulation program following course completion in cardiopulmonary physiology, and (3) an intermediate group that experienced the simulation program following completion of formal education in respiratory care. This research was based on the theory of Human Constructivism as described by Mintzes, Wandersee, and Novak (1997). Data-gathering techniques were based on theories supported by Novak (1984), Wandersee (1997), and Chi (1997). Data were generated by exams, interviews, verbal analysis (Chi, 1997), and concept mapping. Results suggest that simulation may be an effective instructional method for assessing conceptual development and diagnosing alternative conceptions in undergraduates enrolled in a cardiopulmonary science program. Use of simulation in conjunction with clinical interviews and concept mapping may assist in verifying gaps in learning and conceptual knowledge. This study found only limited evidence to support the use of computer simulation prior to lecture to augment learning. However, it demonstrated that students' pre-lecture experience with the computer simulation helped the instructor assess what each learner knew so he or she could be taught accordingly. In addition, use of computer simulation after formal instruction was shown to be useful in aiding students identified by the instructor as needing remediation.

  20. Implementation and performance of FDPS: a framework for developing parallel particle simulation codes

    NASA Astrophysics Data System (ADS)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-08-01

    We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N²) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10⁷) to 300 ms (N = 10⁹). These are currently limited by the time for the calculation of the domain decomposition and communication necessary for the interaction calculation. We discuss how we can overcome these bottlenecks.
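
    The "simple, sequential and unoptimized program of O(N²) calculation cost" that FDPS takes as its starting point is direct pairwise force summation. FDPS itself is a C++ template framework, so the following Python sketch is only a conceptual analogue of that baseline; the particle count and softening length are arbitrary.

    ```python
    import numpy as np

    def direct_accelerations(pos, mass, eps=1e-3):
        """Direct O(N^2) gravitational accelerations (G = 1), softened by eps."""
        acc = np.zeros_like(pos)
        n = len(pos)
        for i in range(n):
            d = pos - pos[i]                       # vectors to all particles
            r2 = (d ** 2).sum(axis=1) + eps ** 2   # softened squared distances
            r2[i] = np.inf                         # skip self-interaction
            acc[i] = (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
        return acc

    rng = np.random.default_rng(1)
    pos = rng.normal(size=(256, 3))
    mass = np.full(256, 1.0 / 256)
    print(direct_accelerations(pos, mass)[0])
    ```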

  1. Teaching Tip: Development of Veterinary Anesthesia Simulations for Pre-Clinical Training: Design, Implementation, and Evaluation Based on Student Perspectives.

    PubMed

    Jones, Jana L; Rinehart, Jim; Spiegel, Jacqueline Jordan; Englar, Ryane E; Sidaway, Brian K; Rowles, Joie

    2018-01-01

    Anesthesia simulations have been used in pre-clinical medical training for decades to help learners gain confidence and expertise in an operating room environment without danger to a live patient. The authors describe a veterinary anesthesia simulation environment (VASE) with anesthesia scenarios developed to provide a re-creation of a veterinarian's task environment while performing anesthesia. The VASE uses advanced computer technology with simulator inputs provided from standard monitoring equipment in common use during veterinary anesthesia and a commercial canine training mannequin that allows intubation, ventilation, and venous access. The simulation outputs are determined by a script that outlines routine anesthesia scenarios and describes the consequences of students' hands-on actions and interventions during preestablished anesthetic tasks and critical incidents. Patients' monitored physiologic parameters may be changed according to predetermined learner events and students' interventions to provide immediate learner feedback and clinical realism. A total of 96 students from the pre-clinical anesthesia course participated in the simulations and the pre- and post-simulation surveys evaluating students' perspectives. Results of the surveys and comparisons of overall categorical cumulative responses in the pre- and post-simulation surveys indicated improvement in learners' perceived preparedness and confidence as a result of the simulated anesthesia experience, with significant improvement in the strongly agree, moderately agree, and agree categories (p<.05 at a 95% CI). These results suggest that anesthesia simulations in the VASE may complement traditional teaching methods through experiential learning and may help foster classroom-to-clinic transference of knowledge and skills without harm to an animal.

  2. Computational modeling of heterogeneity and function of CD4+ T cells

    PubMed Central

    Carbo, Adria; Hontecillas, Raquel; Andrew, Tricity; Eden, Kristin; Mei, Yongguo; Hoops, Stefan; Bassaganya-Riera, Josep

    2014-01-01

    The immune system is composed of many different cell types and hundreds of intersecting molecular pathways and signals. This large biological complexity requires coordination between distinct pro-inflammatory and regulatory cell subsets to respond to infection while maintaining tissue homeostasis. CD4+ T cells play a central role in orchestrating immune responses and in maintaining a balance between pro- and anti-inflammatory responses. This tight balance between regulatory and effector reactions depends on the ability of CD4+ T cells to modulate distinct pathways within large molecular networks, since dysregulated CD4+ T cell responses may result in chronic inflammatory and autoimmune diseases. The CD4+ T cell differentiation process comprises an intricate interplay between cytokines, their receptors, adaptor molecules, signaling cascades and transcription factors that help delineate cell fate and function. Computational modeling can help to describe, simulate, analyze, and predict some of the behaviors in this complicated differentiation network. This review provides a comprehensive overview of existing computational immunology methods as well as novel strategies used to model immune responses with a particular focus on CD4+ T cell differentiation. PMID:25364738

  3. Computational Enzymology and Organophosphorus Degrading Enzymes: Promising Approaches Toward Remediation Technologies of Warfare Agents and Pesticides.

    PubMed

    Ramalho, Teodorico C; de Castro, Alexandre A; Silva, Daniela R; Silva, Maria Cristina; Franca, Tanos C C; Bennion, Brian J; Kuca, Kamil

    2016-01-01

    The re-emergence of chemical weapons as a global threat in the hands of terrorist groups, together with an increasing number of pesticide intoxications and environmental contaminations worldwide, has drawn the scientific community's attention to the need for improved technologies for the detoxification of organophosphorus (OP) compounds. A compelling strategy is bioremediation by enzymes that are able to hydrolyze these molecules into harmless chemical species. Several enzymes have been studied and engineered for this purpose; however, their mechanisms of action are not well understood. Theoretical investigations may help elucidate important aspects of these mechanisms and aid the development of more efficient bioremediators. In this review, we point out the major contributions of computational methodologies applied to enzyme-based detoxification of OPs. Furthermore, we highlight the use of PTE, PON, DFP, and BuChE as enzymes in the OP detoxification process, and how computational tools such as molecular docking, molecular dynamics simulations, and combined quantum mechanics/molecular mechanics have contributed, and will continue to contribute, to this very important area of research.

  4. Overhead Crane Computer Model

    NASA Astrophysics Data System (ADS)

    Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.

    2018-03-01

    This paper describes a computer model of an overhead crane system. The modelled overhead crane consists of hoisting, trolley and crane mechanisms as well as a two-axis payload system. Using the differential equations of motion for these mechanisms, derived from the Lagrange equation of the second kind, an overhead crane computer model can be built. The computer model was implemented using Matlab software. Transients of coordinate, linear speed and motor torque were simulated for the trolley and crane mechanism systems. In addition, transients of payload sway about the vertical axis were obtained. The paper also presents a trajectory of the trolley mechanism operating simultaneously with the crane mechanism, as well as a two-axis payload trajectory. The designed computer model of an overhead crane is a useful means of studying positioning control and anti-sway control systems.
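
    The paper's model is built in Matlab from the Lagrange equations. As a rough illustration of the payload-sway dynamics such a model produces, the sketch below integrates the standard pendulum-on-accelerating-trolley equation, θ̈ = -(g sin θ + a(t) cos θ)/L, with a small damping term; the cable length, damping, and acceleration profile are hypothetical values, not the paper's parameters.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Trolley-pendulum analogue of payload sway under trolley acceleration a(t).
    g, L, damping = 9.81, 5.0, 0.05      # hypothetical cable length and damping

    def trolley_accel(t):
        return 0.5 if t < 2.0 else 0.0   # accelerate for 2 s, then coast

    def rhs(t, y):
        theta, omega = y
        # Swing equation for a pendulum whose suspension point accelerates.
        return [omega,
                -(g * np.sin(theta) + trolley_accel(t) * np.cos(theta)) / L
                - damping * omega]

    sol = solve_ivp(rhs, (0, 20), [0.0, 0.0], max_step=0.01)
    print("peak sway angle (rad):", np.abs(sol.y[0]).max())
    ```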

  5. 20170312 - Computer Simulation of Developmental ...

    EPA Pesticide Factsheets

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, for both embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and the toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures within native microphysiological environments yield mechanistic understanding of the developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of

  6. Computer Simulation of Developmental Processes and ...

    EPA Pesticide Factsheets

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, for both embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and the toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures within native microphysiological environments yield mechanistic understanding of the developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of

  7. Protein Dynamics from NMR and Computer Simulation

    NASA Astrophysics Data System (ADS)

    Wu, Qiong; Kravchenko, Olga; Kemple, Marvin; Likic, Vladimir; Klimtchuk, Elena; Prendergast, Franklyn

    2002-03-01

    Proteins exhibit internal motions from the millisecond to sub-nanosecond time scale. The challenge is to relate these internal motions to biological function. A strategy to address this aim is to apply a combination of several techniques including high-resolution NMR, computer simulation of molecular dynamics (MD), molecular graphics, and finally molecular biology, the latter to generate appropriate samples. Two difficulties that arise are: (1) the time scale which is most directly biologically relevant (ms to μs) is not readily accessible by these techniques and (2) the techniques focus on local and not collective motions. We will outline methods using ¹³C-NMR to help alleviate the second problem, as applied to intestinal fatty acid binding protein, a relatively small intracellular protein believed to be involved in fatty acid transport and metabolism. This work is supported in part by PHS Grant GM34847 (FGP) and by a fellowship from the American Heart Association (QW).

  8. Protocol for concomitant temporomandibular joint custom-fitted total joint reconstruction and orthognathic surgery utilizing computer-assisted surgical simulation.

    PubMed

    Movahed, Reza; Teschke, Marcus; Wolford, Larry M

    2013-12-01

    Clinicians who address temporomandibular joint (TMJ) pathology and dentofacial deformities surgically can perform the surgery in 1 stage or 2 separate stages. The 2-stage approach requires the patient to undergo 2 separate operations and anesthesia, significantly prolonging the overall treatment. However, performing concomitant TMJ and orthognathic surgery (CTOS) in these cases requires careful treatment planning and surgical proficiency in the 2 surgical areas. This article presents a new treatment protocol for the application of computer-assisted surgical simulation in CTOS cases requiring reconstruction with patient-fitted total joint prostheses. The traditional and new CTOS protocols are described and compared. The new CTOS protocol helps decrease the preoperative workup time and increase the accuracy of model surgery. Copyright © 2013 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  9. A Hybrid Genetic-Simulated Annealing Algorithm for the Location-Inventory-Routing Problem Considering Returns under E-Supply Chain Environment

    PubMed Central

    Guo, Hao; Fu, Jing

    2013-01-01

    Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of logistics systems for e-business. Moreover, the return ratio in Internet sales is significantly higher than in traditional business, and much of the returned merchandise has no quality defects and can re-enter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing problem model for returns without quality defects. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results on numerical examples show that HGSAA outperforms a GA in computing time, solution quality, and computing stability. The proposed model is very useful for helping managers make the right decisions in an e-supply chain environment. PMID:24489489
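
    The abstract names the hybrid but not its mechanics. In the usual GA-SA hybridization, each offspring produced by crossover and mutation passes through a simulated-annealing Metropolis acceptance test under a cooling temperature; the sketch below shows that pattern on a toy continuous objective standing in for the location-inventory-routing cost (all rates and sizes are made up).

    ```python
    import math
    import random

    random.seed(0)

    def cost(x):
        # Toy stand-in for the location-inventory-routing cost function.
        return sum((xi - 0.3) ** 2 for xi in x)

    def hybrid_ga_sa(pop_size=20, dims=5, gens=200, t0=1.0, alpha=0.98):
        pop = [[random.random() for _ in range(dims)] for _ in range(pop_size)]
        temp = t0
        for _ in range(gens):
            nxt = []
            for parent in pop:
                mate = random.choice(pop)
                # Uniform crossover plus Gaussian mutation (the GA part).
                child = [random.choice(genes) + random.gauss(0, 0.05)
                         for genes in zip(parent, mate)]
                # Metropolis acceptance (the SA part): worse children survive
                # with a probability that shrinks as the temperature cools.
                d = cost(child) - cost(parent)
                keep = d < 0 or random.random() < math.exp(-d / temp)
                nxt.append(child if keep else parent)
            pop, temp = nxt, temp * alpha
        return min(pop, key=cost)

    print(cost(hybrid_ga_sa()))
    ```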

  10. Training of perceptual-cognitive skills in offside decision making.

    PubMed

    Catteeuw, Peter; Gilis, Bart; Jaspers, Arne; Wagemans, Johan; Helsen, Werner

    2010-12-01

    This study investigates the effect of two off-field training formats on improving offside decision making: one group trained with video simulations and another with computer animations. Feedback after every offside situation allowed assistant referees to compensate for the consequences of the flash-lag effect and to improve their decision-making accuracy. First, response accuracy improved and flag errors decreased for both training groups, implying that training interventions with feedback taught assistant referees to deal better with the flash-lag effect. Second, the results demonstrated no effect of format, although assistant referees rated video simulations higher than computer animations for fidelity. This implies that a cognitive correction to a perceptual effect can be learned even when the format does not correspond closely to the original perceptual situation. Off-field offside decision-making training should therefore be considered part of training, as it considerably helps assistant referees gain experience and improve overall decision-making performance.

  11. 2-D and 3-D mixing flow analyses of a scramjet-afterbody configuration

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay; Eleshaky, Mohamed E.; Engelund, Walter C.

    1989-01-01

    A cold simulant gas study of propulsion/airframe integration for a hypersonic vehicle powered by a scramjet engine is presented. The specific heat ratio of the hot exhaust gases is matched by utilizing a cold mixture of argon and Freon-12. Solutions are obtained for a hypersonic corner flow and a supersonic rectangular flow in order to provide the upstream boundary conditions. The computational test examples also provide a comparison of this flow with that of air as the expanding supersonic jet, where the specific heats are assumed to be constant. It is shown that the three-dimensional computational fluid dynamics capabilities developed for these types of flow may be used to augment conventional wind tunnel studies of scramjet afterbody flows with cold simulant exhaust gases, which in turn can help in the design of a scramjet internal-external nozzle.

  12. The role of simulation in neurosurgery.

    PubMed

    Rehder, Roberta; Abd-El-Barr, Muhammad; Hooten, Kristopher; Weinstock, Peter; Madsen, Joseph R; Cohen, Alan R

    2016-01-01

    In an era of residency duty-hour restrictions, there has been a recent effort to implement simulation-based training methods in neurosurgery teaching institutions. Several surgical simulators have been developed, ranging from physical models to sophisticated virtual reality systems. To date, there is a paucity of information describing the clinical benefits of existing simulators and the assessment strategies needed to implement them into neurosurgical curricula. Here, we present a systematic review of current simulation models and discuss the state of the art and future directions for simulation in neurosurgery. The study is a retrospective literature review. Multiple simulators have been developed for neurosurgical training, including those for minimally invasive procedures, vascular, skull base, pediatric, tumor resection, functional neurosurgery, and spine surgery; the pros and cons of existing systems are reviewed. Advances in imaging and computer technology have led to the development of different simulation models to complement traditional surgical training, and sophisticated virtual reality (VR) simulators with haptic feedback and impressive imaging technology have provided novel options for training in neurosurgery. Breakthrough training simulation using 3D printing technology holds promise for future simulation practice, providing high-fidelity patient-specific models to complement residency surgical learning.

  13. Desktop microsimulation: a tool to improve efficiency in the medical office practice.

    PubMed

    Montgomery, James B; Linville, Beth A; Slonim, Anthony D

    2013-01-01

    Because the economic crisis in the United States continues to have an impact on healthcare organizations, industry leaders must optimize their decision making. Discrete-event computer simulation is a quality tool with a demonstrated track record of improving the precision of analysis for process redesign. However, the use of simulation to consolidate practices and design efficiencies into an unfinished medical office building was a unique task. A discrete-event computer simulation package was used to model the operations and forecast future results for four orthopedic surgery practices. The scenarios were created to allow an evaluation of the impact of process change on the output variables of exam room utilization, patient queue size, and staff utilization. The model helped with decisions regarding space allocation and efficient exam room use by demonstrating the impact of process changes in patient queues at check-in/out, x-ray, and cast room locations when compared to the status quo model. The analysis impacted decisions on facility layout, patient flow, and staff functions in this newly consolidated practice. Simulation was found to be a useful tool for process redesign and decision making even prior to building occupancy. © 2011 National Association for Healthcare Quality.
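
    The engine behind such discrete-event packages is a time-ordered event queue advanced event by event. A stripped-down, hypothetical single-exam-room version of this kind of clinic model might look like the sketch below; all arrival and service rates are invented for illustration.

    ```python
    import heapq
    import random

    random.seed(1)

    def clinic(n_patients=100, mean_interarrival=10.0, mean_exam=9.0):
        """Minimal discrete-event model: one exam room, FIFO discipline."""
        events, t = [], 0.0
        for _ in range(n_patients):                    # pre-schedule arrivals
            t += random.expovariate(1 / mean_interarrival)
            heapq.heappush(events, (t, "arrival"))
        room_free_at, waits = 0.0, []
        while events:
            now, _ = heapq.heappop(events)             # advance to next event
            start = max(now, room_free_at)             # wait if room occupied
            waits.append(start - now)
            room_free_at = start + random.expovariate(1 / mean_exam)
        return sum(waits) / len(waits)

    print("mean wait (min):", round(clinic(), 2))
    ```

    A production model adds more resources (check-in/out, x-ray, cast room, staff) and tracks utilization per resource, but the queue-and-clock core is the same.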

  14. Multipolar electrostatics.

    PubMed

    Cardamone, Salvatore; Hughes, Timothy J; Popelier, Paul L A

    2014-06-14

    Atomistic simulation of chemical systems is currently limited by the elementary description of electrostatics that atomic point-charges offer. Unfortunately, a model of one point-charge for each atom fails to capture the anisotropic nature of electronic features such as lone pairs or π-systems. Higher order electrostatic terms, such as those offered by a multipole moment expansion, naturally recover these important electronic features. The question remains as to why such a description has not yet been widely adopted by popular molecular mechanics force fields. There are two widely-held misconceptions about the more rigorous formalism of multipolar electrostatics: (1) Accuracy: the implementation of multipole moments, compared to point-charges, offers little to no advantage in terms of an accurate representation of a system's energetics, structure and dynamics. (2) Efficiency: atomistic simulation using multipole moments is computationally prohibitive compared to simulation using point-charges. Whilst the second of these may have found some basis when computational power was a limiting factor, the first has no theoretical grounding. In the current work, we disprove the two statements above and systematically demonstrate that multipole moments are not discredited by either. We hope that this perspective will help in catalysing the transition to more realistic electrostatic modelling, to be adopted by popular molecular simulation software.
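
    As a concrete illustration of the accuracy point, the hedged sketch below compares the exact potential of an idealized "lone pair" (two close opposite charges) with its single-site point-dipole representation; a single net point charge at the same site would predict zero potential everywhere, since the net charge vanishes. Geometry and magnitudes are arbitrary values in reduced units.

    ```python
    import numpy as np

    # Two-point-charge 'lone pair' versus its single-site point-dipole model.
    q, d = 0.5, 0.1                        # charge magnitude, separation
    plus = np.array([0.0, 0.0, d / 2])
    minus = np.array([0.0, 0.0, -d / 2])
    mu = q * d                             # equivalent dipole moment (along z)

    def phi_charges(r):
        # Exact potential of the charge pair (Coulomb constant set to 1).
        return q / np.linalg.norm(r - plus) - q / np.linalg.norm(r - minus)

    def phi_dipole(r):
        # Point-dipole potential: phi = mu * cos(theta) / r^2.
        return mu * r[2] / np.linalg.norm(r) ** 3

    for dist in (0.5, 1.0, 2.0):
        r = np.array([0.0, 0.0, dist])
        print(dist, phi_charges(r), phi_dipole(r))
    ```

    The two columns converge as the observation distance grows, which is exactly the anisotropic information a monopole-only model discards.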

  15. BHR equations re-derived with immiscible particle effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwarzkopf, John Dennis; Horwitz, Jeremy A.

    2015-05-01

    Compressible and variable density turbulent flows with dispersed phase effects are found in many applications ranging from combustion to cloud formation. These types of flows are among the most challenging to simulate. While the exact equations governing a system of particles and fluid are known, computational resources limit the scale and detail that can be simulated in this type of problem. Therefore, a common method is to simulate averaged versions of the flow equations, which still capture the salient physics and are relatively less computationally expensive. Besnard developed such a model for variable density miscible turbulence, where ensemble-averaging was applied to the flow equations to yield a set of filtered equations. Besnard further derived transport equations for the Reynolds stresses, the turbulent mass flux, and the density-specific volume covariance, to help close the filtered momentum and continuity equations. We re-derive the exact BHR closure equations, which include integral terms owing to immiscible effects. Physical interpretations of the additional terms are proposed along with simple models. The goal of this work is to extend the BHR model to allow for the simulation of turbulent flows where an immiscible dispersed phase is non-trivially coupled with the carrier phase.
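
    For orientation, the two closure variables named above are conventionally defined as follows. This is a hedged restatement of standard BHR notation (sign conventions vary between papers), not the paper's new immiscible terms.

    ```latex
    % Standard BHR second-moment variables. Overbars denote ensemble means,
    % primes fluctuations, and v = 1/\rho is the specific volume.
    a_i = \frac{\overline{\rho' u_i'}}{\bar{\rho}}
          \quad \text{(turbulent mass flux)},
    \qquad
    b = -\,\overline{\rho' v'}
          \quad \text{(density--specific-volume covariance, } b \ge 0\text{)}.
    ```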

  16. Optimally analyzing and implementing of bolt fittings in steel structure based on ANSYS

    NASA Astrophysics Data System (ADS)

    Han, Na; Song, Shuangyang; Cui, Yan; Wu, Yongchun

    2018-03-01

    Owing to its excellent performance, the ANSYS simulation software has become an outstanding member of the computer-aided engineering (CAE) family; it is committed to innovation in engineering simulation that helps users shorten the design process. First, a typical procedure for implementing CAE was designed, and a framework for structural numerical analysis based on ANSYS technology was proposed. Then, an optimal analysis of bolt fittings in a beam-column joint of a steel structure was implemented in ANSYS, displaying contour plots of the XY shear stress, the YZ shear stress, and the Y component of stress. Finally, the ANSYS simulation results were compared with results measured in experiments. The results of the ANSYS simulation and analysis are reliable, efficient and optimal. In this process, a numerical model for simulating and analyzing structural performance was developed for engineering practice.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, Benjamin S.; Hamilton, Steven P.; Jarrett, Michael G.

    This report describes the performance improvements made to the VERA Core Simulator (VERA-CS) during FY2016. The development of the VERA Core Simulator has focused on the capability needed to deplete physical reactors and help solve various problems; this capability required the accurate simulation of many operating cycles of a nuclear power plant. The first section of this report introduces two test problems used to assess the run-time performance of VERA-CS using a source dated February 2016. The next section provides a brief overview of the major modifications made to decrease the computational cost. Following the descriptions of the major improvements, the run-time for each improvement is shown. Conclusions on the work are presented, and further follow-on performance improvements are suggested.

  18. Collaborative voxel-based surgical virtual environments.

    PubMed

    Acosta, Eric; Muniz, Gilbert; Armonda, Rocco; Bowyer, Mark; Liu, Alan

    2008-01-01

    Virtual Reality-based surgical simulators can utilize Collaborative Virtual Environments (C-VEs) to provide team-based training. To support real-time interactions, C-VEs are typically replicated on each user's local computer and a synchronization method helps keep all local copies consistent. This approach does not work well for voxel-based C-VEs since large and frequent volumetric updates make synchronization difficult. This paper describes a method that allows multiple users to interact within a voxel-based C-VE for a craniotomy simulator being developed. Our C-VE method requires smaller update sizes and provides faster synchronization update rates than volumetric-based methods. Additionally, we address network bandwidth/latency issues to simulate networked haptic and bone drilling tool interactions with a voxel-based skull C-VE.
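
    The abstract does not spell out the update encoding, but one plausible reading of "smaller update sizes" is that only the indices and new values of voxels changed by a tool interaction are transmitted, rather than the whole volume. A hedged sketch of that delta scheme:

    ```python
    import numpy as np

    def voxel_delta(before, after):
        """Encode only the voxels a tool interaction changed."""
        idx = np.flatnonzero(before != after)          # linear indices of changes
        return idx.astype(np.int32), after.flat[idx]   # compact update message

    def apply_delta(volume, delta):
        idx, vals = delta
        volume.flat[idx] = vals                        # replay on a remote copy

    local = np.zeros((64, 64, 64), dtype=np.uint8)
    remote = local.copy()
    drilled = local.copy()
    drilled[30:34, 30:34, 30:34] = 1                   # simulated drilling edit
    apply_delta(remote, voxel_delta(local, drilled))
    print(np.array_equal(remote, drilled))             # True: copies in sync
    ```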

  19. Single-cell-based computer simulation of the oxygen-dependent tumour response to irradiation

    NASA Astrophysics Data System (ADS)

    Harting, Christine; Peschke, Peter; Borkenstein, Klaus; Karger, Christian P.

    2007-08-01

    Optimization of treatment plans in radiotherapy requires the knowledge of tumour control probability (TCP) and normal tissue complication probability (NTCP). Mathematical models may help to obtain quantitative estimates of TCP and NTCP. A single-cell-based computer simulation model is presented, which simulates tumour growth and radiation response on the basis of the response of the constituting cells. The model contains oxic, hypoxic and necrotic tumour cells as well as capillary cells which are considered as sources of a radial oxygen profile. Survival of tumour cells is calculated by the linear quadratic model including the modified response due to the local oxygen concentration. The model additionally includes cell proliferation, hypoxia-induced angiogenesis, apoptosis and resorption of inactivated tumour cells. By selecting different degrees of angiogenesis, the model allows the simulation of oxic as well as hypoxic tumours having distinctly different oxygen distributions. The simulation model showed that poorly oxygenated tumours exhibit an increased radiation tolerance. Inter-tumoural variation of radiosensitivity flattens the dose response curve. This effect is enhanced by proliferation between fractions. Intra-tumoural radiosensitivity variation does not play a significant role. The model may contribute to the mechanistic understanding of the influence of biological tumour parameters on TCP. It can in principle be validated in radiation experiments with experimental tumours.
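
    The oxygen-dependent survival such models compute typically takes the linear-quadratic form with the dose scaled by an oxygen enhancement ratio (OER). The sketch below shows one common formulation; the OER curve shape and all parameter values (alpha, beta, oer_max, k) are illustrative assumptions, not the paper's calibration.

    ```python
    import numpy as np

    def oer(pO2, oer_max=3.0, k=3.0):
        """Alper/Howard-Flanders-style oxygen enhancement (pO2, k in mmHg)."""
        return (oer_max * pO2 + k) / (pO2 + k)

    def surviving_fraction(dose, pO2, alpha=0.3, beta=0.03, oer_max=3.0):
        """Linear-quadratic survival; hypoxia reduces the effective dose."""
        d_eff = dose * oer(pO2, oer_max) / oer_max   # oxic cells: d_eff ~ dose
        return np.exp(-alpha * d_eff - beta * d_eff ** 2)

    for p in (0.5, 5.0, 40.0):                # near-anoxic, hypoxic, well-oxic
        print(p, surviving_fraction(2.0, p))  # survival after a 2 Gy fraction
    ```

    Hypoxic cells survive more per fraction, which is the mechanism behind the increased radiation tolerance of poorly oxygenated tumours reported above.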

  20. Shock Simulations of Single-Site Coarse-Grain RDX using the Dissipative Particle Dynamics Method with Reactivity

    NASA Astrophysics Data System (ADS)

    Sellers, Michael; Lisal, Martin; Schweigert, Igor; Larentzos, James; Brennan, John

    2015-06-01

    In discrete particle simulations, when an atomistic model is coarse-grained, a trade-off is made: a boost in computational speed for a reduction in accuracy. Dissipative Particle Dynamics (DPD) methods help recover accuracy in viscous and thermal properties while giving back a small amount of computational speed. One of the most notable extensions of DPD has been the introduction of chemical reactivity, called DPD-RX. Today, pairing the current evolution of DPD-RX with a coarse-grained potential and its chemical decomposition reactions allows the shock behavior of energetic materials to be simulated on timescales faster than an atomistic counterpart. In 2007, Maillet et al. introduced implicit chemical reactivity in DPD through the concept of particle reactors and simulated the decomposition of liquid nitromethane. We have recently extended the DPD-RX method and applied it to solid hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) under shock conditions, using a recently developed single-site coarse-grain model and a reduced RDX decomposition mechanism. A description of the methods used to simulate RDX and its transition to hot product gases within DPD-RX will be presented. Additionally, examples of the effect of microstructure on shock behavior will be shown. Approved for public release. Distribution is unlimited.

  1. Application of the finite-element method and the eigenmode expansion method to investigate the periodic and spectral characteristic of discrete phase-shift fiber Bragg grating

    NASA Astrophysics Data System (ADS)

    He, Yue-Jing; Hung, Wei-Chih; Syu, Cheng-Jyun

    2017-12-01

    The finite-element method (FEM) and eigenmode expansion method (EEM) were adopted to analyze the guided modes and spectra of phase-shifted fiber Bragg gratings at five phase-shift values (0, π/4, π/2, 3π/4, and π). Previous studies of optical fiber gratings relied heavily on conventional coupled-mode theory, which involves abstruse physics and complex computation and is therefore challenging for users. Here, a numerical simulation method was therefore paired with a simple, rigorous design procedure to help beginners and users overcome the difficulty of entering the field, and graphical simulation results were presented. To reduce the difference between the simulated and actual contexts, a perfectly matched layer and a perfectly reflecting boundary were added to the FEM and the EEM. For meshing in the FEM, the object meshing method and the boundary meshing method proposed in this study were used to effectively enhance computational accuracy and substantially reduce the time required for simulation. In summary, users can use the simulation results in this study to easily and rapidly design optical fiber communication systems and optical sensors with the desired spectral characteristics.
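
    The FEM/EEM machinery itself is too large to sketch here, but the spectra it produces can be cross-checked against the standard coupled-mode transfer-matrix method for a phase-shifted grating, a different and well-established technique. In the sketch below, the grating parameters (coupling coefficient, length, effective index, Bragg wavelength) are arbitrary illustrative values.

    ```python
    import numpy as np

    # Transfer-matrix reflectance of a grating with a central phase shift.
    lam_B, n_eff, kappa, L = 1550e-9, 1.447, 300.0, 0.01  # Bragg wl, index, 1/m, m

    def section(delta, kappa, L):
        # Coupled-mode transfer matrix of a uniform grating section.
        g = np.sqrt(complex(kappa ** 2 - delta ** 2))
        return np.array([[np.cosh(g*L) - 1j*(delta/g)*np.sinh(g*L),
                          -1j*(kappa/g)*np.sinh(g*L)],
                         [1j*(kappa/g)*np.sinh(g*L),
                          np.cosh(g*L) + 1j*(delta/g)*np.sinh(g*L)]])

    def reflectance(lam, phase_shift):
        delta = 2*np.pi*n_eff*(1/lam - 1/lam_B)        # detuning from Bragg
        shift = np.array([[np.exp(-1j*phase_shift/2), 0],
                          [0, np.exp(1j*phase_shift/2)]])
        T = section(delta, kappa, L/2) @ shift @ section(delta, kappa, L/2)
        return abs(T[1, 0] / T[0, 0]) ** 2

    for phi in (0.0, np.pi/2, np.pi):
        print(f"phase shift {phi:.2f} rad: "
              f"R at Bragg wavelength = {reflectance(lam_B, phi):.3f}")
    ```

    A π shift opens a narrow transmission resonance at the Bragg wavelength (R drops to zero there), which is the qualitative feature the FEM/EEM spectra should reproduce.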

  2. Simulation and animation of sensor-driven robots.

    PubMed

    Chen, C; Trivedi, M M; Bidlack, C R

    1994-10-01

    Most simulation and animation systems used in robotics simulate the robot and its environment without simulating sensors, and therefore have difficulty handling robots that use sensory feedback in their operation. In this paper, a new design of an environment for the simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system helps users visualize the motion and reaction of the sensor-driven robot under their control program. As a result, the efficiency of software development is increased, the reliability of the software and the operational safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for simulating robot sensing; this paper describes a system designed to overcome that deficiency.

  3. Fluid management technology: Liquid slosh dynamics and control

    NASA Technical Reports Server (NTRS)

    Dodge, Franklin T.; Green, Steven T.; Kana, Daniel D.

    1991-01-01

    Flight experiments were defined for the Cryogenic On-Orbit Liquid Depot Storage, Acquisition and Transfer Satellite (COLD-SAT) test bed satellite and the Shuttle middeck to help establish the influence of the gravitational environment on liquid slosh dynamics and control. Several analytical and experimental studies were also conducted to support the experiments and to help understand the anticipated results. Both FLOW-3D and NASA-VOF3D computer codes were utilized to simulate low Bond number, small amplitude sloshing, for which the motions are dominated by surface forces; it was found that neither code provided a satisfactory simulation. Thus, a new analysis of low Bond number sloshing was formulated, using an integral minimization technique that will allow the assumptions made about surface physics phenomena to be modified easily when better knowledge becomes available from flight experiments. Several examples were computed by the innovative use of a finite-element structural code. An existing spherical-pendulum analogy of nonlinear, rotary sloshing was also modified for easier use and extended to low-gravity conditions. Laboratory experiments were conducted to determine the requirements for liquid-vapor interface sensors as a method of resolving liquid surface motions in flight experiments. The feasibility of measuring the small slosh forces anticipated in flight experiments was also investigated.

  4. Fluid management technology: Liquid slosh dynamics and control

    NASA Astrophysics Data System (ADS)

    Dodge, Franklin T.; Green, Steven T.; Kana, Daniel D.

    1991-11-01

    Flight experiments were defined for the Cryogenic On-Orbit Liquid Depot Storage, Acquisition and Transfer Satellite (COLD-SAT) test bed satellite and the Shuttle middeck to help establish the influence of the gravitational environment on liquid slosh dynamics and control. Several analytical and experimental studies were also conducted to support the experiments and to help understand the anticipated results. Both FLOW-3D and NASA-VOF3D computer codes were utilized to simulate low Bond number, small amplitude sloshing, for which the motions are dominated by surface forces; it was found that neither code provided a satisfactory simulation. Thus, a new analysis of low Bond number sloshing was formulated, using an integral minimization technique that will allow the assumptions made about surface physics phenomena to be modified easily when better knowledge becomes available from flight experiments. Several examples were computed by the innovative use of a finite-element structural code. An existing spherical-pendulum analogy of nonlinear, rotary sloshing was also modified for easier use and extended to low-gravity conditions. Laboratory experiments were conducted to determine the requirements for liquid-vapor interface sensors as a method of resolving liquid surface motions in flight experiments. The feasibility of measuring the small slosh forces anticipated in flight experiments was also investigated.

  5. Intravenous catheter training system: computer-based education versus traditional learning methods.

    PubMed

    Engum, Scott A; Jeffries, Pamela; Fisher, Lisa

    2003-07-01

    Virtual reality simulators allow trainees to practice techniques without consequences, reduce the potential risk associated with training, minimize animal use, and help to develop standards and optimize procedures. Current intravenous (IV) catheter placement training methods utilize plastic arms; however, their lack of variability can diminish the educational stimulus for the student. This study compares the effectiveness of an interactive, multimedia, virtual reality computer IV catheter simulator with a traditional laboratory experience in teaching IV venipuncture skills to both nursing and medical students. A randomized, pretest-posttest experimental design was employed. A total of 163 participants, 70 baccalaureate nursing students and 93 third-year medical students beginning their fundamental skills training, were recruited. The students ranged in age from 20 to 55 years (mean 25). Fifty-eight percent were female and 68% perceived themselves as having average computer skills (25% declaring excellence). The methods of IV catheter education compared were a traditional method of instruction involving a scripted self-study module with a 10-minute videotape, instructor demonstration, and hands-on experience using plastic mannequin arms, and an interactive multimedia, commercially made computer catheter simulator program utilizing virtual reality (CathSim). The pretest scores were similar between the computer and the traditional laboratory groups. There was a significant improvement in cognitive gains, student satisfaction, and documentation of the procedure in the traditional laboratory group compared with the computer catheter simulator group, while both groups were similar in their ability to demonstrate the skill correctly. Conclusions: this evaluation and assessment was an initial effort to assess new teaching methodologies related to intravenous catheter placement and their effects on student learning outcomes and behaviors. Technology alone is not a solution for stand-alone IV catheter placement education, and a traditional learning method was preferred by students. The combination of these two methods of education may further enhance the trainee's satisfaction and skill acquisition level.

  6. Reproducibility in Computational Neuroscience Models and Simulations

    PubMed Central

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators; (2) shared computational resources; (3) declarative model descriptors, ontologies and standardized annotations; and (4) model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  7. AQMAN; linear and quadratic programming matrix generator using two-dimensional ground-water flow simulation for aquifer management modeling

    USGS Publications Warehouse

    Lefkoff, L.J.; Gorelick, S.M.

    1987-01-01

    AQMAN is a FORTRAN-77 computer program that helps solve a variety of aquifer management problems involving the control of groundwater hydraulics. It is intended for use with any standard mathematical programming package that uses Mathematical Programming System input format. The program creates the input files to be used by the optimization program; these files contain all the hydrologic information and management objectives needed to solve the management problem. Used in conjunction with a mathematical programming code, the program identifies the pumping or recharge strategy that achieves a user's management objective while maintaining groundwater hydraulic conditions within desired limits. The objective may be linear or quadratic, and may involve the minimization of pumping and recharge rates or of variable pumping costs. The problem may contain constraints on groundwater heads, gradients, and velocities for a complex, transient hydrologic system. Linear superposition of solutions to the transient, two-dimensional groundwater flow equation is used by the program in conjunction with the response matrix optimization method. A unit stress is applied at each decision well, and transient responses at all control locations are computed using a modified version of the U.S. Geological Survey two-dimensional aquifer simulation model. The program also computes discounted cost coefficients for the objective function and accounts for transient aquifer conditions. (Author's abstract)
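
    The response-matrix method reduces the management model to a small mathematical program: simulate a unit stress at each well once, record the head responses at the control points, and let linear superposition supply the constraint coefficients. A hedged miniature follows, with a made-up two-well, two-control-point response matrix posed as a linear program; all numbers are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # R[i, j] = head decline at control point i per unit pumping at well j,
    # as would be obtained by applying a unit stress in the flow simulator.
    R = np.array([[0.8, 0.3],
                  [0.2, 0.9]])
    max_drawdown = np.array([5.0, 4.0])   # head-decline limits at control points
    demand = 8.0                          # total supply target

    # Minimize total pumping cost subject to meeting demand and keeping
    # superposed drawdowns within limits.
    res = linprog(c=[1.0, 1.2],                    # per-unit pumping costs
                  A_ub=R, b_ub=max_drawdown,       # drawdown constraints
                  A_eq=[[1.0, 1.0]], b_eq=[demand],
                  bounds=[(0, None), (0, None)])
    print(res.x, res.fun)                          # optimal rates and cost
    ```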

  8. Topographica: Building and Analyzing Map-Level Simulations from Python, C/C++, MATLAB, NEST, or NEURON Components

    PubMed Central

    Bednar, James A.

    2008-01-01

    Many neural regions are arranged into two-dimensional topographic maps, such as the retinotopic maps in mammalian visual cortex. Computational simulations have led to valuable insights about how cortical topography develops and functions, but further progress has been hindered by the lack of appropriate tools. It has been particularly difficult to bridge across levels of detail, because simulators are typically geared to a specific level, while interfacing between simulators has been a major technical challenge. In this paper, we show that the Python-based Topographica simulator makes it straightforward to build systems that cross levels of analysis, as well as providing a common framework for evaluating and comparing models implemented in other simulators. These results rely on the general-purpose abstractions around which Topographica is designed, along with the Python interfaces becoming available for many simulators. In particular, we present a detailed, general-purpose example of how to wrap an external spiking PyNN/NEST simulation as a Topographica component using only a dozen lines of Python code, making it possible to use any of the extensive input presentation, analysis, and plotting tools of Topographica. Additional examples show how to interface easily with models in other types of simulators. Researchers simulating topographic maps externally should consider using Topographica's analysis tools (such as preference map, receptive field, or tuning curve measurement) to compare results consistently, and for connecting models at different levels. This seamless interoperability will help neuroscientists and computational scientists to work together to understand how neurons in topographic maps organize and operate. PMID:19352443

  9. Retinal Image Simulation of Subjective Refraction Techniques.

    PubMed

    Perches, Sara; Collados, M Victoria; Ares, Jorge

    2016-01-01

    Refraction techniques make it possible to determine the sphero-cylindrical lens prescription that achieves the best possible visual quality. Among these techniques, subjective refraction (i.e., refraction guided by the patient's responses) is the most commonly used approach. In this context, this paper's main goal is to present simulation software that implements, in a virtual manner, various subjective-refraction techniques--including the Jackson Cross-Cylinder test (JCC)--all relying on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed on multifocal-contact-lens wearers. The results reveal the software's usefulness for simulating the retinal image quality that a particular visual compensation provides. Moreover, it can help provide deeper insight into, and improve, existing refraction techniques, and it can be used for simulated training.
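
    Computer-generated retinal images of this kind typically come from Fourier optics: a pupil function carries the sphero-cylindrical wavefront error, and the squared magnitude of its Fourier transform gives the point-spread function (PSF) with which the test scene is blurred. A minimal sketch restricted to pure defocus follows, in normalized units; the grid size and aberration magnitudes are arbitrary choices, not the paper's model.

    ```python
    import numpy as np

    # Fourier-optics PSF for a defocused eye (illustrative, normalized units).
    N = 256
    x = np.linspace(-1, 1, N)
    X, Y = np.meshgrid(x, x)
    pupil = (X ** 2 + Y ** 2) <= 1.0             # unit-radius pupil aperture

    def psf(defocus_waves):
        # Zernike defocus Z(2,0) ~ 2r^2 - 1 across the pupil, in waves.
        W = defocus_waves * (2 * (X ** 2 + Y ** 2) - 1)
        P = pupil * np.exp(2j * np.pi * W)        # generalized pupil function
        field = np.fft.fftshift(np.fft.fft2(P))   # far-field amplitude
        h = np.abs(field) ** 2
        return h / h.sum()

    for d in (0.0, 0.25, 1.0):                    # sharp -> progressively blurred
        print(d, psf(d).max())                    # peak (Strehl-like) drops
    ```

    A fuller simulator would add astigmatic Zernike terms at the cylinder axis and convolve optotype images with the resulting PSF before presenting them to the observer.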

  10. A new strategic neurosurgical planning tool for brainstem cavernous malformations using interactive computer graphics with multimodal fusion images.

    PubMed

    Kin, Taichi; Nakatomi, Hirofumi; Shojima, Masaaki; Tanaka, Minoru; Ino, Kenji; Mori, Harushi; Kunimatsu, Akira; Oyama, Hiroshi; Saito, Nobuhito

    2012-07-01

    In this study, the authors used preoperative simulation employing 3D computer graphics (interactive computer graphics) to fuse all imaging data for brainstem cavernous malformations. The authors evaluated whether interactive computer graphics or 2D imaging correlated better with the actual operative field, particularly in identifying a developmental venous anomaly (DVA). The study population consisted of 10 patients scheduled for surgical treatment of brainstem cavernous malformations. Data from preoperative imaging (MRI, CT, and 3D rotational angiography) were automatically fused using a normalized mutual information method, and then reconstructed by a hybrid method combining surface rendering and volume rendering methods. With surface rendering, multimodality and multithreshold techniques for 1 tissue were applied. The completed interactive computer graphics were used for simulation of surgical approaches and assumed surgical fields. Preoperative diagnostic rates for a DVA associated with brainstem cavernous malformation were compared between conventional 2D imaging and interactive computer graphics employing receiver operating characteristic (ROC) analysis. The time required for reconstruction of 3D images was 3-6 hours for interactive computer graphics. Observation in interactive mode required approximately 15 minutes. Detailed anatomical information for operative procedures, from the craniotomy to microsurgical operations, could be visualized and simulated three-dimensionally as 1 computer graphic using interactive computer graphics. Virtual surgical views were consistent with actual operative views. This technique was very useful for examining various surgical approaches. Mean (±SEM) area under the ROC curve for rate of DVA diagnosis was significantly better for interactive computer graphics (1.000±0.000) than for 2D imaging (0.766±0.091; p<0.001, Mann-Whitney U-test). The authors report a new method for automatic registration of preoperative imaging data from CT, MRI, and 3D rotational angiography for reconstruction into 1 computer graphic. The diagnostic rate of DVA associated with brainstem cavernous malformation was significantly better using interactive computer graphics than with 2D images. Interactive computer graphics was also useful in helping to plan the surgical access corridor.
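
    The normalized mutual information criterion used for the automatic fusion reduces to joint-histogram entropies. Below is a hedged sketch of the Studholme-style form NMI(A, B) = (H(A) + H(B)) / H(A, B), which the registration step plausibly resembles; the bin count and test images are arbitrary.

    ```python
    import numpy as np

    def normalized_mutual_information(a, b, bins=32):
        """NMI(a, b) = (H(a) + H(b)) / H(a, b), from a joint histogram."""
        joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)

        def entropy(p):
            p = p[p > 0]
            return -(p * np.log(p)).sum()

        return (entropy(px) + entropy(py)) / entropy(pxy.ravel())

    rng = np.random.default_rng(0)
    img = rng.random((128, 128))
    print(normalized_mutual_information(img, img))        # aligned: maximal (2.0)
    print(normalized_mutual_information(img, rng.random((128, 128))))  # lower
    ```

    A registration loop maximizes this quantity over rigid or affine transform parameters applied to one of the modalities.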

  11. Exergy analysis of helium liquefaction systems based on modified Claude cycle with two-expanders

    NASA Astrophysics Data System (ADS)

    Thomas, Rijo Jacob; Ghosh, Parthasarathi; Chowdhury, Kanchan

    2011-06-01

    Large-scale helium liquefaction systems, being energy-intensive, demand judicious selection of process parameters. An effective tool for the design and analysis of thermodynamic cycles for these systems is exergy analysis, which is used here to study the behavior of a helium liquefaction system based on a modified Claude cycle. Parametric evaluation using the process simulator Aspen HYSYS® helps identify the effects of the cycle pressure ratio and expander flow fraction on the exergetic efficiency of the liquefaction cycle. The study computes the distribution of losses at the different refrigeration stages of the cycle and helps in selecting optimum cycle pressures, operating temperature levels of the expanders, and mass flow rates through them. Results from the analysis may help in evolving guidelines for designing appropriate thermodynamic cycles for practical helium liquefaction systems.
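
    The bookkeeping behind such an analysis rests on the specific flow exergy and an exergetic efficiency for the liquefier. In standard notation (dead state at T_0, p_0), and hedged as the generic textbook form rather than the paper's exact definitions:

    ```latex
    % Specific flow exergy relative to the dead state (T_0, p_0):
    e = (h - h_0) - T_0\,(s - s_0)
    % Exergetic efficiency of a liquefier driven by input work \dot{W}_{in}:
    \eta_{ex} = \frac{\dot{m}_{liq}\,\bigl(e_{liquid} - e_{feed}\bigr)}{\dot{W}_{in}}
    ```

    The stage-by-stage loss distribution then follows from an exergy balance on each component: the loss in a stage is the exergy entering it minus the exergy leaving it.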

  12. A hybrid method combining the surface integral equation method and ray tracing for the numerical simulation of high frequency diffraction involved in ultrasonic NDT

    NASA Astrophysics Data System (ADS)

    Bonnet, M.; Collino, F.; Demaldent, E.; Imperiale, A.; Pesudo, L.

    2018-05-01

    Ultrasonic Non-Destructive Testing (US NDT) has become widely used in various fields of application to probe media. By exploiting surface measurements of the echoes of ultrasonic incident waves after their propagation through the medium, it makes it possible to detect potential defects (cracks and inhomogeneities) and to characterize the medium. The understanding and interpretation of these experimental measurements is performed with the help of numerical modeling and simulation. However, classical numerical methods can become computationally very expensive when simulating wave propagation in the high-frequency regime. Asymptotic techniques, on the other hand, are better suited to modeling high-frequency scattering over large distances, but do not allow accurate simulation of complex diffraction phenomena. Thus, neither numerical nor asymptotic methods can individually solve high-frequency diffraction problems in large media, such as those involved in US NDT inspections, both quickly and accurately; their advantages and limitations are, however, complementary. Here we propose a hybrid strategy coupling the surface integral equation method and the ray tracing method to simulate high-frequency diffraction under speed and accuracy constraints. The strategy is general and applicable to diffraction phenomena in acoustic or elastodynamic media. We provide its implementation and investigate its performance for the 2D acoustic diffraction problem. The main features of this hybrid method are described, and results of 2D computational experiments are discussed.

  13. Tools and procedures for visualization of proteins and other biomolecules.

    PubMed

    Pan, Lurong; Aller, Stephen G

    2015-04-01

    Proteins, peptides, and nucleic acids are biomolecules that drive biological processes in living organisms. An enormous amount of structural data for a large number of these biomolecules has been described with atomic precision in the form of structural "snapshots" that are freely available in public repositories. These snapshots can help explain how the biomolecules function, the nature of interactions between multi-molecular complexes, and even how small-molecule drugs can modulate the biomolecules for clinical benefits. Furthermore, these structural snapshots serve as inputs for sophisticated computer simulations that turn the biomolecules into moving, "breathing" molecular machines for understanding their dynamic properties in real-time computer simulations. In order for the researcher to take advantage of such a wealth of structural data, it is necessary to gain competency in the use of computer molecular visualization tools for exploring the structures and visualizing three-dimensional spatial representations. Here, we present protocols for using two common visualization tools--the Web-based Jmol and the stand-alone PyMOL package--as well as a few examples of other popular tools. Copyright © 2015 John Wiley & Sons, Inc.
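
    As a taste of the stand-alone route, the short script below drives PyMOL from Python to fetch a structure and render a cartoon view. It assumes a local PyMOL installation whose Python API is importable; the PDB entry 1ubq (ubiquitin) is only an example.

        # Requires a PyMOL installation with its Python API on the path.
        from pymol import cmd

        cmd.fetch("1ubq")                  # download ubiquitin from the PDB
        cmd.hide("everything")
        cmd.show("cartoon")                # ribbon/cartoon representation
        cmd.spectrum("count", "rainbow")   # color from N- to C-terminus
        cmd.png("ubiquitin.png", width=1200, height=900, ray=1)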

  14. Resilient workflows for computational mechanics platforms

    NASA Astrophysics Data System (ADS)

    Nguyên, Toàn; Trifan, Laurentiu; Désidéri, Jean-Antoine

    2010-06-01

    Workflow management systems have recently been the focus of much interest, and of much research and deployment, for scientific applications worldwide [26, 27]. Their ability to abstract applications by wrapping application codes has also underscored the usefulness of such systems for multidiscipline applications [23, 24]. When complex applications need to provide seamless interfaces hiding the technicalities of the computing infrastructures, their high-level modeling, monitoring, and execution functionalities help give production teams seamless and effective facilities [25, 31, 33]. Software integration infrastructures based on programming paradigms such as Python, Matlab, and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes [22, 24]. In addition, high-performance computing based on multi-core, multi-cluster infrastructures opens new opportunities for more accurate, more extensive, and more robust multidiscipline simulations in the decades to come [28]. This supports the goal of full flight-dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight tests and certification of aircraft in the future [23, 24, 29].

  15. Big Data Processing for a Central Texas Groundwater Case Study

    NASA Astrophysics Data System (ADS)

    Cantu, A.; Rivera, O.; Martínez, A.; Lewis, D. H.; Gentle, J. N., Jr.; Fuentes, G.; Pierce, S. A.

    2016-12-01

    As computational methods improve, scientists are able to expand the level and scale of experimental simulation and testing completed for case studies. This study presents a comparative analysis of multiple models for the Barton Springs segment of the Edwards aquifer. Several numerical simulations using state-mandated MODFLOW models, run on Stampede, a High Performance Computing system housed at the Texas Advanced Computing Center, were performed for multiple-scenario testing. One goal of this multidisciplinary project is to visualize and compare the output data of the groundwater model using the statistical programming language R, to find revealing data patterns produced by different pumping scenarios. Presenting the data in a usable post-processing format is covered in this paper. Visualizing the data and creating workflows applicable to the management of the data are tasks performed after data extraction. The resulting analyses provide an example of how supercomputing can be used to accelerate evaluation of scientific uncertainty and geological knowledge in relation to policy and management decisions. Understanding the aquifer's behavior helps policy makers avoid negative impacts on endangered species and environmental services, and aids in maximizing the aquifer yield.

  16. Ant system: optimization by a colony of cooperating agents.

    PubMed

    Dorigo, M; Maniezzo, V; Colorni, A

    1996-01-01

    An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call the ant system (AS). We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation, and the use of a constructive greedy heuristic. Positive feedback accounts for rapid discovery of good solutions, distributed computation avoids premature convergence, and the greedy heuristic helps find acceptable solutions in the early stages of the search process. We apply the proposed methodology to the classical traveling salesman problem (TSP) and report simulation results. We also discuss parameter selection and the early setups of the model, and compare it with tabu search and simulated annealing on the TSP. To demonstrate the robustness of the approach, we show how the ant system can be applied to other optimization problems such as the asymmetric traveling salesman problem, quadratic assignment, and job-shop scheduling. Finally, we discuss the salient characteristics of AS: global data structure revision, distributed communication, and probabilistic transitions.
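
    A compact sketch of the basic Ant System loop for the TSP is given below: each ant builds a tour using pheromone-weighted probabilities, pheromone evaporates, and ants deposit pheromone in proportion to tour quality. Parameter values (alpha, beta, rho, q) follow common defaults rather than the exact settings of the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def ant_system_tsp(dist, n_ants=20, n_iters=100,
                           alpha=1.0, beta=5.0, rho=0.5, q=100.0):
            """Basic Ant System for the symmetric TSP (a didactic sketch)."""
            n = len(dist)
            eta = 1.0 / (dist + np.eye(n))     # heuristic visibility, avoid /0
            tau = np.ones((n, n))              # initial pheromone
            best_tour, best_len = None, np.inf
            for _ in range(n_iters):
                tours = []
                for _ in range(n_ants):
                    tour = [rng.integers(n)]
                    while len(tour) < n:
                        i = tour[-1]
                        mask = np.ones(n, bool)
                        mask[tour] = False     # forbid already-visited cities
                        w = (tau[i] ** alpha) * (eta[i] ** beta) * mask
                        tour.append(rng.choice(n, p=w / w.sum()))
                    tours.append(tour)
                tau *= (1.0 - rho)             # pheromone evaporation
                for tour in tours:
                    length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
                    if length < best_len:
                        best_tour, best_len = tour, length
                    for k in range(n):         # deposit proportional to quality
                        i, j = tour[k], tour[(k + 1) % n]
                        tau[i, j] += q / length
                        tau[j, i] += q / length
            return best_tour, best_len

        # Usage on random city coordinates:
        pts = rng.random((15, 2))
        dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
        tour, length = ant_system_tsp(dist)
        print(length)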

  17. Impacts of Fluid Dynamics Simulation in Study of Nasal Airflow Physiology and Pathophysiology in Realistic Human Three-Dimensional Nose Models

    PubMed Central

    Lee, Heow Peuh; Gordon, Bruce R.

    2012-01-01

    During the past decades, numerous computational fluid dynamics (CFD) studies have simulated human nasal models constructed from CT or MRI images. Compared with rhinomanometry and acoustic rhinometry, which provide quantitative information only on nasal airflow, resistance, and cross-sectional areas, CFD enables additional measurements of the airflow passing through the nasal cavity that help visualize the physiologic impact of alterations in intranasal structures. It therefore becomes possible to quantitatively measure, and visually appreciate, the airflow pattern (laminar or turbulent), velocity, pressure, wall shear stress, particle deposition, and temperature changes at different flow rates in different parts of the nasal cavity. The effects of both existing anatomical factors and post-operative changes can be assessed. With recent improvements in CFD technology and computing power, there is a promising future for CFD to become a useful tool in planning, predicting, and evaluating outcomes of nasal surgery. This review discusses the possibilities and potential impacts, as well as the technical limitations, of using CFD simulation to better understand nasal airflow physiology. PMID:23205221

  18. A Prototyping Effort for the Integrated Spacecraft Analysis System

    NASA Technical Reports Server (NTRS)

    Wong, Raymond; Tung, Yu-Wen; Maldague, Pierre

    2011-01-01

    Computer modeling and simulation have become essential techniques for predicting and validating spacecraft performance. However, most computer models examine only spacecraft subsystems, and the independent nature of the models creates integration problems, which limits the possibility of simulating a spacecraft as an integrated unit despite a desire for this type of analysis. A new project called Integrated Spacecraft Analysis was proposed to serve as a framework for an integrated simulation environment. The project is still in its infancy, but a software prototype can help future developers assess design issues. The prototype explores a service-oriented design paradigm that, in principle, allows programs written in different languages to communicate with one another. It includes a uniform interface to the SPICE libraries such that different in-house tools like APGEN or SEQGEN can exchange information with it without much change. Service orientation may result in a slower system compared with a single application, and more research is needed on the different available technologies, but a service-oriented approach could increase long-term maintainability and extensibility.

  19. High-Fidelity Computational Aerodynamics of Multi-Rotor Unmanned Aerial Vehicles

    NASA Technical Reports Server (NTRS)

    Ventura Diaz, Patricia; Yoon, Seokkwan

    2018-01-01

    High-fidelity Computational Fluid Dynamics (CFD) simulations have been carried out for several multi-rotor Unmanned Aerial Vehicles (UAVs). Three vehicles have been studied: the classic quadcopter DJI Phantom 3; an unconventional quadcopter specialized for forward flight, the SUI Endurance; and an innovative concept for Urban Air Mobility (UAM), the Elytron 4S UAV. The three-dimensional unsteady Navier-Stokes equations are solved on overset grids using high-order accurate schemes, dual-time stepping, and a hybrid turbulence model. The DJI Phantom 3 is simulated with different rotors and with both a simplified airframe and the real airframe including landing gear and a camera. The effects of weather are studied for the DJI Phantom 3 quadcopter in hover. The original SUI Endurance design is compared in forward flight to a new configuration conceived by the authors, the hybrid configuration, which gives a large improvement in forward thrust. The Elytron 4S UAV is simulated in helicopter mode and in airplane mode. Understanding the complex flows in multi-rotor vehicles will help design quieter, safer, and more efficient future drones and UAM vehicles.

  20. Post-processing interstitialcy diffusion from molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Bhardwaj, U.; Bukkuru, S.; Warrier, M.

    2016-01-01

    An algorithm to rigorously trace the interstitialcy diffusion trajectory in crystals is developed. The algorithm incorporates unsupervised learning and graph optimization, which obviate the need to input extra domain-specific information depending on the crystal or the temperature of the simulation. The algorithm is implemented in a flexible framework as a post-processor to molecular dynamics (MD) simulations. We describe in detail the reduction of interstitialcy diffusion to the known computational problems of unsupervised clustering and graph optimization. We also discuss the steps, computational efficiency, and key components of the algorithm. Using the algorithm, thermal interstitialcy diffusion from low to near-melting-point temperatures is studied. We encapsulate the algorithms in a modular framework with functionality to calculate diffusion coefficients, migration energies, and other trajectory properties. The study validates the algorithm by establishing the conformity of output parameters with experimental values and provides detailed insights into the interstitialcy diffusion mechanism. Together with supporting visualizations and analysis, the algorithm gives convincing detail and a new approach to quantifying diffusion jumps, jump lengths, and times between jumps, and to identifying interstitials among lattice atoms.
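
    Two of the quantities such a framework reports can be illustrated with standard formulas: the diffusion coefficient from the Einstein relation and the migration energy from an Arrhenius fit. The sketch below uses made-up numbers; real inputs would come from the MD post-processor.

        import numpy as np

        def diffusion_coefficient(msd, dt):
            """Einstein relation in 3D: MSD(t) ~ 6 D t, so D = slope / 6."""
            t = np.arange(len(msd)) * dt
            slope = np.polyfit(t, msd, 1)[0]
            return slope / 6.0

        def migration_energy(temps, diffusivities):
            """Arrhenius fit: D = D0 exp(-Em / (kB T)); Em from ln D vs 1/T."""
            kB = 8.617333e-5                   # Boltzmann constant in eV/K
            slope, _ = np.polyfit(1.0 / np.asarray(temps),
                                  np.log(diffusivities), 1)
            return -slope * kB                 # migration energy in eV

        # Illustrative values only, not results from the paper:
        temps = [600, 800, 1000, 1200]         # K
        Ds = [1e-11, 8e-11, 3e-10, 9e-10]      # m^2/s, made-up
        print(f"Em = {migration_energy(temps, Ds):.2f} eV")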

  1. Post-processing interstitialcy diffusion from molecular dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhardwaj, U., E-mail: haptork@gmail.com; Bukkuru, S.; Warrier, M.

    2016-01-15

    An algorithm to rigorously trace the interstitialcy diffusion trajectory in crystals is developed. The algorithm incorporates unsupervised learning and graph optimization, which obviate the need to input extra domain-specific information depending on the crystal or the temperature of the simulation. The algorithm is implemented in a flexible framework as a post-processor to molecular dynamics (MD) simulations. We describe in detail the reduction of interstitialcy diffusion to the known computational problems of unsupervised clustering and graph optimization. We also discuss the steps, computational efficiency, and key components of the algorithm. Using the algorithm, thermal interstitialcy diffusion from low to near-melting-point temperatures is studied. We encapsulate the algorithms in a modular framework with functionality to calculate diffusion coefficients, migration energies, and other trajectory properties. The study validates the algorithm by establishing the conformity of output parameters with experimental values and provides detailed insights into the interstitialcy diffusion mechanism. Together with supporting visualizations and analysis, the algorithm gives convincing detail and a new approach to quantifying diffusion jumps, jump lengths, and times between jumps, and to identifying interstitials among lattice atoms.

  2. A qualitatively validated mathematical-computational model of the immune response to the yellow fever vaccine.

    PubMed

    Bonin, Carla R B; Fernandes, Guilherme C; Dos Santos, Rodrigo W; Lobosco, Marcelo

    2018-05-25

    Although a safe and effective yellow fever vaccine was developed more than 80 years ago, several issues regarding its use remain unclear. For example, what is the minimum dose that can provide immunity against the disease? A useful tool that can help researchers answer this and other related questions is a computational simulator that implements a mathematical model describing the human immune response to vaccination against yellow fever. This work uses a system of ten ordinary differential equations to represent a few important populations in the response process generated by the body after vaccination. The main populations include viruses, APCs, CD8+ T cells, short-lived and long-lived plasma cells, B cells and antibodies. In order to qualitatively validate our model, four experiments were carried out, and their computational results were compared to experimental data obtained from the literature. The four experiments were: a) simulation of a scenario in which an individual was vaccinated against yellow fever for the first time; b) simulation of a booster dose ten years after the first dose; c) simulation of the immune response to the yellow fever vaccine in individuals with different levels of naïve CD8+ T cells; and d) simulation of the immune response to distinct doses of the yellow fever vaccine. This work shows that the simulator was able to qualitatively reproduce some of the experimental results reported in the literature, such as the amount of antibodies and viremia throughout time, as well as to reproduce other behaviors of the immune response reported in the literature, such as those that occur after a booster dose of the vaccine.
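
    The full model couples ten ODEs; as a deliberately simplified caricature of the same simulation pattern, the sketch below integrates just two populations (virus and antibody) with hypothetical parameters. It is not the authors' model, only an illustration of how such a system is solved numerically.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Hypothetical two-population caricature: virus V replicates, antibody
        # A is produced in response to antigen and clears the virus.
        def rhs(t, y, r=1.0, k=0.05, p=0.02, d=0.05):
            V, A = y
            dV = r * V - k * A * V    # replication minus antibody-mediated clearance
            dA = p * V - d * A        # antigen-driven production minus decay
            return [dV, dA]

        sol = solve_ivp(rhs, t_span=(0, 60), y0=[1.0, 0.0],
                        dense_output=True, max_step=0.1)
        t = np.linspace(0, 60, 300)
        V, A = sol.sol(t)
        print(f"peak viremia ~ {V.max():.1f} at day {t[V.argmax()]:.1f}")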

  3. Simulation of Turbine Tone Noise Generation Using a Turbomachinery Aerodynamics Solver

    NASA Technical Reports Server (NTRS)

    VanZante, Dale; Envia, Edmane

    2010-01-01

    As turbofan engine bypass ratios continue to increase, the contribution of the turbine to the engine noise signature is receiving more attention. Understanding the relative importance of the various turbine noise generation mechanisms and the characteristics of the turbine acoustic transmission loss are essential ingredients in developing robust reduced-order models for predicting the turbine noise signature. A computationally based investigation has been undertaken to help guide the development of a turbine noise prediction capability that does not rely on empiricism. As proof-of-concept for this approach, two highly detailed numerical simulations of the unsteady flow field inside the first stage of a modern high-pressure turbine were carried out. The simulations were computed using TURBO, which is an unsteady Reynolds-Averaged Navier-Stokes code capable of multi-stage simulations. Spectral and modal analysis of the unsteady pressure data from the numerical simulation of the turbine stage show a circumferential modal distribution that is consistent with the Tyler-Sofrin rule. Within the high-pressure turbine, the interaction of velocity, pressure and temperature fluctuations with the downstream blade rows are all possible tone noise source mechanisms. We have taken the initial step in determining the source strength hierarchy by artificially reducing the level of temperature fluctuations in the turbine flowfield. This was accomplished by changing the vane cooling flow temperature in order to mitigate the vane thermal wake in the second of the two simulations. The results indicated that, despite a dramatic change in the vane cooling flow, the computed modal levels changed very little indicating that the contribution of temperature fluctuations to the overall pressure field is rather small compared with the viscous and potential field interaction mechanisms.
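
    The Tyler-Sofrin rule mentioned above predicts which circumferential modes a rotor-stator interaction can excite: at harmonic n of blade-passing frequency, m = n*B + k*V for integer k, with B rotor blades and V stator vanes. The snippet below enumerates them for assumed blade and vane counts, not those of the TURBO simulation.

        # Blade/vane counts are illustrative assumptions.
        B, V = 36, 48          # rotor blade and stator vane counts
        for n in (1, 2):       # first two blade-passing-frequency harmonics
            modes = sorted(n * B + k * V for k in range(-3, 4))
            print(f"harmonic {n}: m = {modes}")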

  4. Understanding resonance graphs using Easy Java Simulations (EJS) and why we use EJS

    NASA Astrophysics Data System (ADS)

    Wee, Loo Kang; Lee, Tat Leong; Chew, Charles; Wong, Darren; Tan, Samuel

    2015-03-01

    This paper reports a computer model created using Easy Java Simulations (EJS) for learners to visualize how the steady-state amplitude of a driven oscillating system varies with the frequency of the periodic driving force. The simulation shows (N = 100) identical spring-mass systems subjected to (1) a periodic driving force of equal amplitude but different driving frequencies, and (2) different amounts of damping. The simulation aims to create a visually intuitive way of understanding how the series of amplitude-versus-driving-frequency graphs is obtained, by showing how the displacement of each system changes over time as it transits from the transient to the steady state. A suggested 'how to use' guide is added to help educators and students: it explains the time conditions under which the model begins recording maximum amplitudes so that they closely match the theoretical steady-state equation, and the steps for collecting runs at different degrees of damping. We also discuss two design features of our computer model: displaying the instantaneous oscillation together with the achieved steady-state amplitudes, and overlaying the explicit world view with the scientific representation for runs at different degrees of damping. Three advantages of using EJS are: (1) open source code and Creative Commons attribution licenses, which support scaling up of interactively engaging educational practices; (2) the models made can run on almost any device, including Android and iOS; and (3) it allows the redefinition of physics educational practices through computer modeling.
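
    The theoretical steady-state curve the simulation converges to is the standard response of a driven, damped oscillator. A small sketch follows, in Python for brevity rather than EJS's Java, with illustrative parameter values rather than those hard-coded in the model.

        import numpy as np

        def steady_state_amplitude(omega, omega0=1.0, gamma=0.1, f0=1.0):
            """A(w) = f0 / sqrt((w0^2 - w^2)^2 + (2*gamma*w)^2), f0 = F0/m."""
            return f0 / np.sqrt((omega0**2 - omega**2)**2 + (2 * gamma * omega)**2)

        omega = np.linspace(0.1, 2.0, 200)
        for gamma in (0.05, 0.1, 0.3):     # different damping, as in the model
            A = steady_state_amplitude(omega, gamma=gamma)
            print(f"gamma={gamma}: peak A={A.max():.1f} at w={omega[A.argmax()]:.2f}")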

  5. Plasma Science and Innovation Center (PSI-Center) at Washington, Wisconsin, and Utah State, ARRA Supplement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sovinec, Carl

    The objective of the Plasma Science and Innovation Center (PSI-Center) is to develop and deploy computational models that simulate conditions in smaller, concept-exploration plasma experiments. The PSIC group at the University of Wisconsin-Madison, led by Prof. Carl Sovinec, uses and enhances the Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion (NIMROD) code, to simulate macroscopic plasma dynamics in a number of magnetic confinement configurations. These numerical simulations provide information on how magnetic fields and plasma flows evolve over all three spatial dimensions, which supplements the limited access of diagnostics in plasma experiments. The information gained from simulation helps explain how plasma evolves. It is also used to engineer more effective plasma confinement systems, reducing the need for building many experiments to cover the physical parameter space. The ultimate benefit is a more cost-effective approach to the development of fusion energy for peaceful power production. The supplemental funds provided by the American Recovery and Reinvestment Act of 2009 were used to purchase computer components that were assembled into a 48-core system with 256 Gb of shared memory. The system was engineered and constructed by the group's system administrator at the time, Anthony Hammond. It was successfully used by then graduate student, Dr. John O'Bryan, for computing magnetic relaxation dynamics that occur during experimental tests of non-inductive startup in the Pegasus Toroidal Experiment (pegasus.ep.wisc.edu). Dr. O'Bryan's simulations provided the first detailed explanation of how the driven helical filament of electrical current evolves into a toroidal tokamak-like plasma configuration.

  6. Computational biology in the cloud: methods and new insights from computing at scale.

    PubMed

    Kasson, Peter M

    2013-01-01

    The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and to provide easy reproducibility by making the datasets and computational methods easily available.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.

    Emerging fossil energy power generation systems must operate with unprecedented efficiency and near-zero emissions, while optimizing profitably amid cost fluctuations for raw materials, finished products, and energy. To help address these challenges, the fossil energy industry will have to rely increasingly on the use of advanced computational tools for modeling and simulating complex process systems. In this paper, we present the computational research challenges and opportunities for the optimization of fossil energy power generation systems across the plant lifecycle, from process synthesis and design to plant operations. We also look beyond the plant gates to discuss research challenges and opportunities for enterprise-wide optimization, including planning, scheduling, and supply chain technologies.

  8. The focal plane reception pattern calculation for a paraboloidal antenna with a nearby fence

    NASA Technical Reports Server (NTRS)

    Schmidt, Richard F.; Cheng, Hwai-Soon; Kao, Michael W.

    1987-01-01

    A computer simulation program is described that is used to estimate the effects of a proximate diffraction fence on the performance of paraboloidal antennas. The program is written in FORTRAN. The physical problem, mathematical formulation, and coordinate references are described. The main control structure of the program and the functions of the individual subroutines are discussed. The Job Control Language set-up and program instructions are provided in the user's instructions to help users execute the program. A sample problem with the corresponding output listing is included as an illustration of the usage of the program.

  9. Computation of shear viscosity of colloidal suspensions by SRD-MD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laganapan, A. M. K.; Videcoq, A., E-mail: arnaud.videcoq@unilim.fr; Bienia, M.

    2015-04-14

    The behaviour of sheared colloidal suspensions with full hydrodynamic interactions (HIs) is numerically studied. To this end, we use the hybrid stochastic rotation dynamics-molecular dynamics (SRD-MD) method. The shear viscosity of colloidal suspensions is computed for different volume fractions, both for dilute and concentrated cases. We verify that HIs help in the collisions and the streaming of colloidal particles, thereby increasing the overall shear viscosity of the suspension. Our results show a good agreement with known experimental, theoretical, and numerical studies. This work demonstrates the ability of SRD-MD to successfully simulate transport coefficients that require correct modelling of HIs.
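
    For context, simulated suspension viscosities in the dilute regime are often checked against the Einstein-Batchelor expansion of relative viscosity in volume fraction. A tiny sketch of that analytical baseline follows; it is a generic reference curve, not the paper's SRD-MD data.

        import numpy as np

        def batchelor_relative_viscosity(phi):
            """Dilute-limit expansion eta_r = 1 + 2.5*phi + 6.2*phi**2
            (Einstein term plus Batchelor's two-body hydrodynamic correction)."""
            phi = np.asarray(phi, dtype=float)
            return 1.0 + 2.5 * phi + 6.2 * phi**2

        for phi in (0.05, 0.10, 0.20):
            print(f"phi={phi:.2f}: eta_r ~ {batchelor_relative_viscosity(phi):.2f}")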

  10. Computer investigations of the turbulent flow around a NACA2415 airfoil wind turbine

    NASA Astrophysics Data System (ADS)

    Driss, Zied; Chelbi, Tarek; Abid, Mohamed Salah

    2015-12-01

    In this work, computer investigations are carried out to study the flow field developing around a NACA2415 airfoil wind turbine. The Navier-Stokes equations, in conjunction with the standard k-ɛ turbulence model, are solved numerically to determine the local characteristics of the flow. The models tested are implemented in the software "SolidWorks Flow Simulation", which uses a finite volume scheme. The numerical results are validated against experiments conducted in an open wind tunnel. This will help improve the aerodynamic efficiency in the design of packaged installations of the NACA2415 airfoil type wind turbine.

  11. The System of Simulation and Multi-objective Optimization for the Roller Kiln

    NASA Astrophysics Data System (ADS)

    Huang, He; Chen, Xishen; Li, Wugang; Li, Zhuoqiu

    Obtaining the building parameters of a ceramic roller kiln simulation model is a difficult research problem. A system integrating evolutionary algorithms (PSO, DE, and DEPSO) with computational fluid dynamics (CFD) is proposed to solve it. The uniformity of the temperature field and the environmental disruption are studied in this paper. With the help of efficient parallel calculation, the uniformity of the kiln's temperature field and the NOx emissions field are investigated in the system at the same time. A multi-objective optimization example for an industrial roller kiln demonstrates that the system has excellent parameter-exploration capability.
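
    A minimal particle swarm optimization loop of the kind such a system embeds is sketched below. The sphere objective stands in for the expensive CFD evaluation of temperature-field uniformity (with NOx as a second objective in the real study), and all settings are illustrative.

        import numpy as np

        rng = np.random.default_rng(1)

        def pso_minimize(f, bounds, n_particles=30, n_iters=200,
                         w=0.7, c1=1.5, c2=1.5):
            """Minimal single-objective PSO over box bounds (a sketch)."""
            lo, hi = np.asarray(bounds, dtype=float).T
            dim = len(lo)
            x = rng.uniform(lo, hi, (n_particles, dim))
            v = np.zeros_like(x)
            pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
            g = pbest[pbest_val.argmin()].copy()
            for _ in range(n_iters):
                r1, r2 = rng.random((2, n_particles, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)
                vals = np.array([f(p) for p in x])
                better = vals < pbest_val
                pbest[better], pbest_val[better] = x[better], vals[better]
                g = pbest[pbest_val.argmin()].copy()
            return g, pbest_val.min()

        # Stand-in objective; a real run would call the CFD model here.
        sphere = lambda p: float(np.sum(p**2))
        best, val = pso_minimize(sphere, bounds=[(-5, 5)] * 4)
        print(best, val)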

  12. Simulation of forming a flat forging

    NASA Astrophysics Data System (ADS)

    Solomonov, K.; Tishchuk, L.; Fedorinin, N.

    2017-11-01

    The metal flow in some metal shaping processes (rolling, pressing, die forging) is subject to the regularities that determine the deformation scheme in the upsetting of metal samples. The object of the study was the picture of metal flow, including the contour of the part, the demarcation lines of the metal flow, and the flow lines. We have created an algorithm for constructing the metal flow picture, which is based on representing the metal flow demarcation line as an equidistant curve. Computer and physical simulation of the metal flow picture with the help of various software systems confirms the suggested hypothesis.

  13. Neuronal excitability level transition induced by electrical stimulation

    NASA Astrophysics Data System (ADS)

    Florence, G.; Kurths, J.; Machado, B. S.; Fonoff, E. T.; Cerdeira, H. A.; Teixeira, M. J.; Sameshima, K.

    2014-12-01

    In experimental studies, electrical stimulation (ES) has been applied to induce neuronal activity or to disrupt pathological patterns. Nevertheless, the mechanisms underlying these transitions between activity patterns are not clear. To study these phenomena, we simulated a model of the hippocampal region CA1. Computational simulations using different amplitude levels and durations of ES revealed three states of neuronal excitability: burst-firing mode, depolarization block, and spreading depression wave. We used bifurcation theory to analyse how ES interferes with cellular excitability and neuronal dynamics. Understanding this process would help improve ES techniques for controlling some neurological disorders.

  14. Reducing Threshold of Multi Quantum Wells InGaN Laser Diode by Using InGaN/GaN Waveguide

    NASA Astrophysics Data System (ADS)

    Abdullah, Rafid A.; Ibrahim, Kamarulazizi

    2010-07-01

    The ISE TCAD (Integrated System Engineering Technology Computer Aided Design) simulation software has been utilized to study the effect of using an InGaN/GaN waveguide, instead of the conventional GaN waveguide, in a multi-quantum-well violet InGaN laser diode (LD). Simulation results indicate that the threshold of the LD is reduced with the InGaN/GaN waveguide, which increases the optical confinement factor and thereby improves carrier confinement in the active region of the LD.
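
    The optical confinement factor referred to here is the fraction of the modal intensity that overlaps the active region. A sketch with an assumed Gaussian mode profile follows; a real profile would come from the waveguide mode solver inside the TCAD tool.

        import numpy as np

        def confinement_factor(x, intensity, x_active):
            """Gamma = fraction of modal intensity |E|^2 inside the active region.
            Assumes a uniform grid, so sums stand in for integrals."""
            in_active = (x >= x_active[0]) & (x <= x_active[1])
            return intensity[in_active].sum() / intensity.sum()

        x = np.linspace(-2.0, 2.0, 2001)        # transverse position, microns
        mode = np.exp(-(x / 0.6) ** 2)          # |E|^2, assumed Gaussian shape
        print(f"Gamma = {confinement_factor(x, mode, (-0.05, 0.05)):.3f}")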

  15. Preoptimised VB: a fast method for the ground and excited states of ionic clusters I. Localised preoptimisation for (ArCO)+, (ArN2)+ and N4+

    NASA Astrophysics Data System (ADS)

    Langenberg, J. H.; Bucur, I. B.; Archirel, P.

    1997-09-01

    We show that in the simple case of van der Waals ionic clusters, the optimisation of orbitals within VB can be easily simulated with the help of pseudopotentials. The procedure yields the ground and the first excited states of the cluster simultaneously. This makes the calculation of potential energy surfaces for tri- and tetraatomic clusters possible, with very acceptable computation times. We give potential curves for (ArCO)+, (ArN2)+ and N4+. An application to the simulation of the SCF method is shown for Na+·H2O.

  16. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that, ''high performance computing is the backbone of the nation's science and technology enterprise''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''feet and hands'' that carry those advances into the Laboratory and incorporates them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  17. Spontaneous Ad Hoc Mobile Cloud Computing Network

    PubMed Central

    Lacuesta, Raquel; Sendra, Sandra; Peñalver, Lourdes

    2014-01-01

    Cloud computing helps users and companies share computing resources instead of having local servers or personal devices to handle the applications. Smart devices are becoming one of the main information processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they are not able to create such a network and collaborate actively in the cloud, because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper we present the design and deployment of a spontaneous ad hoc mobile cloud computing network. To realize it, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even with a high number of nodes. PMID:25202715

  18. Spontaneous ad hoc mobile cloud computing network.

    PubMed

    Lacuesta, Raquel; Lloret, Jaime; Sendra, Sandra; Peñalver, Lourdes

    2014-01-01

    Cloud computing helps users and companies share computing resources instead of having local servers or personal devices to handle the applications. Smart devices are becoming one of the main information processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they are not able to create such a network and collaborate actively in the cloud, because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper we present the design and deployment of a spontaneous ad hoc mobile cloud computing network. To realize it, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even with a high number of nodes.

  19. A Development of Lightweight Grid Interface

    NASA Astrophysics Data System (ADS)

    Iwai, G.; Kawai, Y.; Sasaki, T.; Watase, Y.

    2011-12-01

    In order to support rapid development of Grid/Cloud-aware applications, we have developed an API that abstracts distributed computing infrastructures, based on SAGA (A Simple API for Grid Applications). SAGA, which is standardized in the OGF (Open Grid Forum), defines API specifications for accessing distributed computing infrastructures such as Grid, Cloud, and local computing resources. The Universal Grid API (UGAPI), a set of command line interfaces (CLIs) and APIs, aims to offer a simpler API combining several SAGA interfaces with richer functionality. The UGAPI CLIs offer the typical functionality required by end users for job management and file access on different distributed computing infrastructures as well as on local computing resources. We have also built a web interface for particle therapy simulation and demonstrated a large-scale calculation using the different infrastructures at the same time. In this paper, we present how the web interface based on UGAPI and SAGA achieves more efficient utilization of computing resources across the different infrastructures, with technical details and practical experiences.

  20. Distal radius osteotomy with volar locking plates based on computer simulation.

    PubMed

    Miyake, Junichi; Murase, Tsuyoshi; Moritomo, Hisao; Sugamoto, Kazuomi; Yoshikawa, Hideki

    2011-06-01

    Corrective osteotomy using dorsal plates and structural bone graft usually has been used for treating symptomatic distal radius malunions. However, the procedure is technically demanding and requires an extensive dorsal approach. Residual deformity is a relatively frequent complication of this technique. We evaluated the clinical applicability of a three-dimensional osteotomy using computer-aided design and manufacturing techniques with volar locking plates for distal radius malunions. Ten patients with metaphyseal radius malunions were treated. Corrective osteotomy was simulated with the help of three-dimensional bone surface models created using CT data. We simulated the most appropriate screw holes in the deformed radius using computer-aided design data of a locking plate. During surgery, using a custom-made surgical template, we predrilled the screw holes as simulated. After osteotomy, plate fixation using predrilled screw holes enabled automatic reduction of the distal radial fragment. Autogenous iliac cancellous bone was grafted after plate fixation. The median volar tilt, radial inclination, and ulnar variance improved from -20°, 13°, and 6 mm, respectively, before surgery to 12°, 24°, and 1 mm, respectively, after surgery. The median wrist flexion improved from 33° before surgery to 60° after surgery. The median wrist extension was 70° before surgery and 65° after surgery. All patients experienced wrist pain before surgery, which disappeared or decreased after surgery. Surgeons can operate precisely and easily using this advanced technique. It is a new treatment option for malunion of distal radius fractures.

  1. Computational Enzymology and Organophosphorus Degrading Enzymes: Promising Approaches Toward Remediation Technologies of Warfare Agents and Pesticides

    DOE PAGES

    Ramalho, Teodorico C.; DeCastro, Alexandre A.; Silva, Daniela R.; ...

    2015-08-26

    The re-emergence of chemical weapons as a global threat in the hands of terrorist groups, together with an increasing number of pesticide intoxications and environmental contaminations worldwide, has called the attention of the scientific community to the need for improvement in technologies for the detoxification of organophosphorus (OP) compounds. A compelling strategy is the use of bioremediation by enzymes that are able to hydrolyze these molecules to harmless chemical species. Several enzymes have been studied and engineered for this purpose. However, their mechanisms of action are not well understood. Theoretical investigations may help elucidate important aspects of these mechanisms and help in the development of more efficient bio-remediators. In this review, we point out the major contributions of computational methodologies applied to enzyme-based detoxification of OPs. Furthermore, we highlight the use of PTE, PON, DFP, and BuChE as enzymes used in the OP detoxification process, and how computational tools such as molecular docking, molecular dynamics simulations, and combined quantum mechanical/molecular mechanics have contributed, and will continue to contribute, to this very important area of research.

  2. Computational Enzymology and Organophosphorus Degrading Enzymes: Promising Approaches Toward Remediation Technologies of Warfare Agents and Pesticides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramalho, Teodorico C.; DeCastro, Alexandre A.; Silva, Daniela R.

    The re-emergence of chemical weapons as a global threat in the hands of terrorist groups, together with an increasing number of pesticide intoxications and environmental contaminations worldwide, has called the attention of the scientific community to the need for improvement in technologies for the detoxification of organophosphorus (OP) compounds. A compelling strategy is the use of bioremediation by enzymes that are able to hydrolyze these molecules to harmless chemical species. Several enzymes have been studied and engineered for this purpose. However, their mechanisms of action are not well understood. Theoretical investigations may help elucidate important aspects of these mechanisms and help in the development of more efficient bio-remediators. In this review, we point out the major contributions of computational methodologies applied to enzyme-based detoxification of OPs. Furthermore, we highlight the use of PTE, PON, DFP, and BuChE as enzymes used in the OP detoxification process, and how computational tools such as molecular docking, molecular dynamics simulations, and combined quantum mechanical/molecular mechanics have contributed, and will continue to contribute, to this very important area of research.

  3. Geochemical Reaction Mechanism Discovery from Molecular Simulation

    DOE PAGES

    Stack, Andrew G.; Kent, Paul R. C.

    2014-11-10

    Methods to explore reactions using computer simulation are becoming increasingly quantitative, versatile, and robust. In this review, a rationale for how molecular simulation can help build better geochemical kinetics models is first given. We summarize some common methods that geochemists use to simulate reaction mechanisms, specifically classical molecular dynamics and quantum chemical methods, and discuss their strengths and weaknesses. Useful tools such as umbrella sampling and metadynamics that enable one to explore reactions are discussed. Several case studies wherein geochemists have used these tools to understand reaction mechanisms are presented, including water exchange and sorption on aqueous species and mineral surfaces, surface charging, crystal growth and dissolution, and electron transfer. The impact that molecular simulation has had on our understanding of geochemical reactivity is highlighted in each case. In the future, it is anticipated that molecular simulation of geochemical reaction mechanisms will become more commonplace as a tool to validate and interpret experimental data, and to provide a check on the plausibility of geochemical kinetic models.
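
    As an illustration of the umbrella-sampling idea mentioned above, the sketch below biases a toy 1D double-well potential with a harmonic restraint and then unbiases the sampled histogram with a Boltzmann reweighting. Everything here (the potential, force constant, and temperature) is hypothetical, and a real study would combine many windows, e.g. via WHAM.

        import numpy as np

        rng = np.random.default_rng(2)
        kT = 1.0

        def potential(x):                  # toy double-well free-energy surface
            return (x**2 - 1.0) ** 2

        def bias(x, center, k=20.0):       # harmonic umbrella restraint
            return 0.5 * k * (x - center) ** 2

        # Metropolis sampling of the biased ensemble in a single window
        center, x, samples = 0.0, 0.0, []
        for _ in range(100_000):
            trial = x + rng.normal(0, 0.1)
            dU = (potential(trial) + bias(trial, center)) \
                 - (potential(x) + bias(x, center))
            if dU <= 0 or rng.random() < np.exp(-dU / kT):
                x = trial
            samples.append(x)

        # Unbias the window: P_unbiased(x) ~ P_biased(x) * exp(+U_bias(x)/kT)
        hist, edges = np.histogram(samples[10_000:], bins=60, density=True)
        mid = 0.5 * (edges[1:] + edges[:-1])
        p_unbiased = hist * np.exp(bias(mid, center) / kT)
        free_energy = -kT * np.log(p_unbiased / p_unbiased.max())
        print(free_energy[np.abs(mid).argmin()])   # free energy near the barrier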

  4. The cyclotron maser theory of AKR and Z-mode radiation. [Auroral Kilometric Radiation

    NASA Technical Reports Server (NTRS)

    Wu, C. S.

    1985-01-01

    The cyclotron maser mechanism which may be responsible for the generation of auroral kilometric radiation and Z-mode radiation is discussed. Emphasis is placed on the basic concepts of the cyclotron maser theory, particularly the relativistic effect of the cyclotron resonance condition. Recent development of the theory is reviewed. Finally, the results of a computer simulation study which helps to understand the nonlinear saturation of the maser instability are reported.

  5. Design Evaluation for Personnel, Training and Human Factors (DEPTH) Final Report.

    DTIC Science & Technology

    1998-01-17

    human activity was primarily intended to facilitate man-machine design analyses of complex systems. By importing computer aided design (CAD) data, the human figure models and analysis algorithms can help to ensure components can be seen, reached, lifted and removed by most maintainers. These simulations are also useful for logistics data capture, training, and task analysis. DEPTH was also found to be useful in obtaining task descriptions for technical

  6. A Virtual World for an Autonomous Underwater Vehicle

    DTIC Science & Technology

    1994-12-01

    ...event simulation. He and Man-Tak Shing also gave valuable advice on the Ph.D. process. Mike Macedonia's unparalleled understanding of computer networks helped make an entire field intelligible. Dave Pratt blazed the trail with NPSNET, still the best virtual world around and still gaining on all

  7. Computer literacy for life sciences: helping the digital-era biology undergraduates face today's research.

    PubMed

    Smolinski, Tomasz G

    2010-01-01

    Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of computers in their lives, seem to be largely unfamiliar with how computers are being used to pursue and answer such questions. This article describes an innovative undergraduate-level course, titled Computer Literacy for Life Sciences, that aims to teach students the basics of a computerized scientific research pursuit. The purpose of the course is for students to develop a hands-on working experience in using standard computer software tools as well as computer techniques and methodologies used in life sciences research. This paper provides a detailed description of the didactical tools and assessment methods used in and outside of the classroom as well as a discussion of the lessons learned during the first installment of the course taught at Emory University in fall semester 2009.

  8. Improvement in precipitation-runoff model simulations by recalibration with basin-specific data, and subsequent model applications, Onondaga Lake Basin, Onondaga County, New York

    USGS Publications Warehouse

    Coon, William F.

    2011-01-01

    Simulation of streamflows in small subbasins was improved by adjusting model parameter values to match base flows, storm peaks, and storm recessions more precisely than had been done with the original model. Simulated recessional and low flows were either increased or decreased as appropriate for a given stream, and simulated peak flows generally were lowered in the revised model. The use of suspended-sediment concentrations rather than concentrations of the surrogate constituent, total suspended solids, resulted in increases in the simulated low-flow sediment concentrations and, in most cases, decreases in the simulated peak-flow sediment concentrations. Simulated orthophosphate concentrations in base flows generally increased but decreased for peak flows in selected headwater subbasins in the revised model. Compared with the original model, phosphorus concentrations simulated by the revised model were comparable in forested subbasins, generally decreased in developed and wetland-dominated subbasins, and increased in agricultural subbasins. A final revision to the model was made by the addition of the simulation of chloride (salt) concentrations in the Onondaga Creek Basin to help water-resource managers better understand the relative contributions of salt from multiple sources in this particular tributary. The calibrated revised model was used to (1) compute loading rates for the various land types that were simulated in the model, (2) conduct a watershed-management analysis that estimated the portion of the total load that was likely to be transported to Onondaga Lake from each of the modeled subbasins, (3) compute and assess chloride loads to Onondaga Lake from the Onondaga Creek Basin, and (4) simulate precolonization (forested) conditions in the basin to estimate the probable minimum phosphorus loads to the lake.

  9. State-and-transition models: Conceptual versus simulation perspectives, usefulness and breadth of use, and land management applications

    USGS Publications Warehouse

    Provencher, Louis; Frid, Leonardo; Czembor, Christina; Morisette, Jeffrey T.

    2016-01-01

    State-and-Transition Simulation Modeling (STSM) is a quantitative analysis method that can consolidate a wide array of resource management issues under a “what-if” scenario exercise. STSM can be seen as an ensemble of models, such as climate models, ecological models, and economic models that incorporate human dimensions and management options. This chapter presents STSM as a tool to help synthesize information on social–ecological systems and to investigate some of the management issues associated with exotic annual Bromus species, which have been described elsewhere in this book. Definitions, terminology, and perspectives on conceptual and computer-simulated stochastic state-and-transition models are given first, followed by a brief review of past STSM studies relevant to the management of Bromus species. A detailed case study illustrates the usefulness of STSM for land management. As a whole, this chapter is intended to demonstrate how STSM can help both managers and scientists: (a) determine efficient resource allocation for monitoring nonnative grasses; (b) evaluate sources of uncertainty in model simulation results involving expert opinion, and their consequences for management decisions; and (c) provide insight into the consequences of predicted local climate change effects on ecological systems invaded by exotic annual Bromus species.

  10. Study of a Compression-Molding Process for Ultraviolet Light-Emitting Diode Exposure Systems via Finite-Element Analysis

    PubMed Central

    Wu, Kuo-Tsai; Hwang, Sheng-Jye; Lee, Huei-Huang

    2017-01-01

    Although wafer-level camera lenses are a very promising technology, problems such as warpage with time and non-uniform thickness of products still exist. In this study, finite element simulation was performed to simulate the compression molding process for acquiring the pressure distribution on the product on completion of the process and predicting the deformation with respect to the pressure distribution. Results show that the single-gate compression molding process significantly increases the pressure at the center of the product, whereas the multi-gate compressing molding process can effectively distribute the pressure. This study evaluated the non-uniform thickness of product and changes in the process parameters through computer simulations, which could help to improve the compression molding process. PMID:28617315

  11. Bubbling in vibrated granular films.

    PubMed

    Zamankhan, Piroz

    2011-02-01

    With the help of experiments, computer simulations, and a theoretical investigation, a general model is developed of the flow dynamics of dense granular media immersed in air in an intermediate regime where both collisional and frictional interactions may affect the flow behavior. The model is tested using the example of a system in which bubbles and solid structures are produced in granular films shaken vertically. Both experiments and large-scale, three-dimensional simulations of this system are performed. The experimental results are compared with the results of the simulation to verify the validity of the model. The data indicate evidence of formation of bubbles when peak acceleration relative to gravity exceeds a critical value Γ(b). The air-grain interfaces of bubblelike structures are found to exhibit fractal structure with dimension D=1.7±0.05.
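
    A fractal dimension like the reported D = 1.7 is typically estimated by box counting: cover the interface with boxes of side s and fit N(s) ~ s^(-D). A self-contained sketch on a toy binary image follows (the diagonal-line test should recover D close to 1); the image sizes and box sizes are illustrative.

        import numpy as np

        def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
            """Estimate the fractal dimension of a binary image by box counting:
            N(s) ~ s^(-D), so D is the slope of log N against log(1/s)."""
            counts = []
            for s in sizes:
                h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
                blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(blocks.any(axis=(1, 3)).sum())
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                                  np.log(counts), 1)
            return slope

        # Toy test: a filled diagonal line should give D close to 1.
        img = np.zeros((256, 256), bool)
        idx = np.arange(256)
        img[idx, idx] = True
        print(f"D ~ {box_counting_dimension(img):.2f}")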

  12. Electron cloud simulations for the main ring of J-PARC

    NASA Astrophysics Data System (ADS)

    Yee-Rendon, Bruce; Muto, Ryotaro; Ohmi, Kazuhito; Satou, Kenichirou; Tomizawa, Masahito; Toyama, Takeshi

    2017-07-01

    The simulation of beam instabilities is a helpful tool for evaluating potential threats to the machine protection of high-intensity beams. At the Main Ring (MR) of J-PARC, signals related to the electron cloud have been observed during the slow-beam-extraction mode. Hence, several studies were conducted to investigate the mechanism that produces it; the results confirmed a strong dependence of the formation of the electron cloud on the beam intensity and the bunch structure, but a precise explanation of its trigger conditions remains incomplete. To shed light on the problem, electron cloud simulations were performed using an updated version of the computational model developed in previous works at KEK. The code employed the measured signals to reproduce the events seen during the surveys.

  13. The development and application of CFD technology in mechanical engineering

    NASA Astrophysics Data System (ADS)

    Wei, Yufeng

    2017-12-01

    Computational Fluid Dynamics (CFD) is the analysis of physical phenomena involving fluid flow and heat conduction by means of numerical computation and graphical display. How faithfully a numerical method captures the complexity of the physical problem, and the precision of the numerical solution, are directly related to the computer's processing speed and hardware such as memory. With the continuous improvement of computer performance and CFD technology, CFD has been widely applied in water conservancy engineering, environmental engineering and industrial engineering. This paper summarizes the development of CFD, its theoretical basis and the governing equations of fluid mechanics, and introduces the main methods of numerical calculation and related developments in CFD technology. Finally, applications of CFD technology in mechanical engineering are summarized. It is hoped that this review will help researchers in the field of mechanical engineering.

  14. Computational Approaches to Vestibular Research

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    The Biocomputation Center at NASA Ames Research Center is dedicated to a union between computational, experimental and theoretical approaches to the study of neuroscience and of life sciences in general. The current emphasis is on computer reconstruction and visualization of vestibular macular architecture in three dimensions (3-D), and on mathematical modeling and computer simulation of neural activity in the functioning system. Our methods are being used to interpret the influence of spaceflight on mammalian vestibular maculas in a model system, that of the adult Sprague-Dawley rat. More than twenty 3-D reconstructions of type I and type II hair cells and their afferents have been completed by digitization of contours traced from serial sections photographed in a transmission electron microscope. This labor-intensive method has now been replaced by a semiautomated method developed in the Biocomputation Center in which conventional photography is eliminated. All viewing, storage and manipulation of original data is done using Silicon Graphics workstations. Recent improvements to the software include a new mesh generation method for connecting contours. This method will permit the investigator to describe any surface, regardless of complexity, including highly branched structures such as are routinely found in neurons. This same mesh can be used for 3-D, finite volume simulation of synapse activation and voltage spread on neuronal surfaces visualized via the reconstruction process. These simulations help the investigator interpret the relationship between neuroarchitecture and physiology, and are of assistance in determining which experiments will best test theoretical interpretations. Data are also used to develop abstract, 3-D models that dynamically display neuronal activity ongoing in the system. Finally, the same data can be used to visualize the neural tissue in a virtual environment. Our exhibit will depict capabilities of our computational approaches and some of our findings from their application. For example, our research has demonstrated that maculas of adult mammals retain the property of synaptic plasticity. Ribbon synapses increase numerically and undergo changes in type and distribution (p<0.0001) in type II hair cells after exposure to microgravity for as few as nine days. The finding of macular synaptic plasticity is pertinent to the clinic, and may help explain some balance disorders in humans. The software used in our investigations will be demonstrated for those interested in applying it in their own research.

  15. Phase transformations at interfaces: Observations from atomistic modeling

    DOE PAGES

    Frolov, T.; Asta, M.; Mishin, Y.

    2016-10-01

    Here, we review the recent progress in theoretical understanding and atomistic computer simulations of phase transformations in materials interfaces, focusing on grain boundaries (GBs) in metallic systems. Recently developed simulation approaches enable the search and structural characterization of GB phases in single-component metals and binary alloys, calculation of thermodynamic properties of individual GB phases, and modeling of the effect of the GB phase transformations on GB kinetics. Atomistic simulations demonstrate that the GB transformations can be induced by varying the temperature, loading the GB with point defects, or varying the amount of solute segregation. The atomic-level understanding obtained from such simulations can provide input for further development of thermodynamic theories and continuum models of interface phase transformations while simultaneously serving as a testing ground for validation of theories and models. They can also help interpret and guide experimental work in this field.

  16. Computer simulations and experimental study on crash box of automobile in low speed collision

    NASA Astrophysics Data System (ADS)

    Liu, Yanjie; Ding, Lin; Yan, Shengyuan; Yang, Yongsheng

    2008-11-01

    Addressing the energy-absorbing components involved in low-speed automobile collisions, and taking a frontal low-speed crash test of a crash box as the example, a simulation analysis of the crash box impact process was carried out with HyperMesh and LS-DYNA. The influence of each modeling parameter was analyzed through analytical solutions and comparison with tests, which ensured that the model was accurate. The combination of experimental and simulation results identified the weak part of the crash box structure with respect to crashworthiness, and methods for improving crash box crashworthiness were discussed. The analysis results obtained through numerical simulation of the automobile crash box impact process were used to optimize the design of the crash box. This helps to improve vehicle structure and minimize losses from collision accidents, and it also provides a useful method for further research on automobile collisions.

  17. Retinal Image Simulation of Subjective Refraction Techniques

    PubMed Central

    Perches, Sara; Collados, M. Victoria; Ares, Jorge

    2016-01-01

    Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., refraction guided by the patient's responses) is the most commonly used approach. In this context, this paper's main goal is to present simulation software that implements various subjective-refraction techniques in a virtual manner—including Jackson's Cross-Cylinder test (JCC)—all relying on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed on multifocal-contact-lens wearers. The results reveal the software's usefulness in simulating the retinal image quality that a particular visual compensation provides. Moreover, it can help provide deeper insight into, and improve, existing refraction techniques, and it can be used for simulated training. PMID:26938648
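
    The core operation such a simulator performs is convolving a scene with the point-spread function (PSF) implied by a candidate correction. The sketch below is a toy stand-in, not the paper's software: a Gaussian blur whose width grows with an assumed residual defocus plays the role of the optical PSF.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def simulated_retinal_image(scene, defocus_diopters):
          # Hypothetical scaling: blur width grows with residual defocus.
          # A real simulator would derive the PSF from the sphero-cylindrical
          # wavefront error instead of assuming a Gaussian.
          sigma = 2.0 * abs(defocus_diopters)
          return gaussian_filter(scene, sigma=sigma)

      chart = np.zeros((64, 64))
      chart[20:44, 30:34] = 1.0  # a toy optotype bar
      for d in (0.0, 0.5, 1.5):  # candidate residual errors to compare
          img = simulated_retinal_image(chart, d)
          print(d, round(float(img.max()), 3))  # peak contrast falls as blur grows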

  18. Characterization of Protein Flexibility Using Small-Angle X-Ray Scattering and Amplified Collective Motion Simulations

    PubMed Central

    Wen, Bin; Peng, Junhui; Zuo, Xiaobing; Gong, Qingguo; Zhang, Zhiyong

    2014-01-01

    Large-scale flexibility within a multidomain protein often plays an important role in its biological function. Despite its inherent low resolution, small-angle x-ray scattering (SAXS) is well suited to investigate protein flexibility and determine, with the help of computational modeling, what kinds of protein conformations would coexist in solution. In this article, we develop a tool that combines SAXS data with a previously developed sampling technique called amplified collective motions (ACM) to elucidate structures of highly dynamic multidomain proteins in solution. We demonstrate the use of this tool in two proteins, bacteriophage T4 lysozyme and tandem WW domains of the formin-binding protein 21. The ACM simulations can sample the conformational space of proteins much more extensively than standard molecular dynamics (MD) simulations. Therefore, conformations generated by ACM are significantly better at reproducing the SAXS data than are those from MD simulations. PMID:25140431

  19. [Simulation and evaluation of maxillary distraction osteogenesis using CASSOS 2001].

    PubMed

    Zhu, Min; Qiu, Wei-liu; Tang, You-sheng; Li, Qing-yun

    2002-09-01

    To simulate maxillary distraction osteogenesis and evaluate the changes in soft and hard tissue before and after treatment, using the Computer-Assisted Simulation System for Orthognathic Surgery (CASSOS 2001). A fourteen-year-old boy with severe maxillary hypoplasia due to unilateral cleft lip and palate was analysed by cephalometric analysis. Simulations of maxillary distraction osteogenesis (Le Fort I osteotomy and Le Fort II osteotomy) were analysed. After the treatment, cephalometric analysis was performed again and the data were compared. The maxillary hypoplasia was well treated using maxillary distraction osteogenesis; compared with Le Fort I osteotomy, more satisfactory results can be obtained by Le Fort I distraction osteogenesis. Maxillary distraction osteogenesis is a better way than maxillary osteotomy to treat severe maxillary hypoplasia with operated cleft lip and palate. CASSOS 2001 can help surgeons and patients with the simulation and evaluation of maxillary distraction osteogenesis and with treatment planning.

  20. Crystal MD: The massively parallel molecular dynamics software for metal with BCC structure

    NASA Astrophysics Data System (ADS)

    Hu, Changjun; Bai, He; He, Xinfu; Zhang, Boyao; Nie, Ningming; Wang, Xianmeng; Ren, Yingwen

    2017-02-01

    Material irradiation effects are among the most important issues for the use of nuclear power. However, the lack of high-throughput irradiation facilities and of knowledge of the evolution process leads to a limited understanding of the issues involved. With the help of high-performance computing, we can gain a deeper understanding of materials at the micro level. In this paper, a new data structure is proposed for the massively parallel simulation of the evolution of metal materials in an irradiation environment. Based on the proposed data structure, we developed new molecular dynamics software named Crystal MD. Simulations with Crystal MD achieved over 90% parallel efficiency in test cases, and it requires more than 25% less memory on multi-core clusters than LAMMPS and IMD, two popular molecular dynamics packages. Using Crystal MD, a two-trillion-particle simulation has been performed on the Tianhe-2 cluster.
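
    The abstract does not spell out the proposed data structure, but the standard memory-lean building block for large-scale MD neighbor search is the cell list, sketched below in Python (illustrative only, and not necessarily Crystal MD's structure). Binning atoms into cells at least as wide as the interaction cutoff reduces each neighbor query from a scan over all N atoms to the 27 surrounding cells.

      import numpy as np
      from collections import defaultdict

      def build_cells(pos, box, cutoff):
          # Bin atom indices into cubic cells of width >= cutoff.
          n = int(box // cutoff)  # cells per dimension
          width = box / n
          cells = defaultdict(list)
          for i, p in enumerate(pos):
              cells[tuple((p // width).astype(int) % n)].append(i)
          return cells, n

      def neighbors(i, pos, cells, n, box, cutoff):
          # Search only the 3x3x3 block of cells around atom i (periodic box).
          width = box / n
          c = (pos[i] // width).astype(int) % n
          found = []
          for dx in (-1, 0, 1):
              for dy in (-1, 0, 1):
                  for dz in (-1, 0, 1):
                      key = ((c[0]+dx) % n, (c[1]+dy) % n, (c[2]+dz) % n)
                      for j in cells[key]:
                          d = pos[i] - pos[j]
                          d -= box * np.round(d / box)  # minimum-image wrap
                          if j != i and d @ d < cutoff**2:
                              found.append(j)
          return found

      rng = np.random.default_rng(1)
      pos = rng.uniform(0.0, 10.0, size=(1000, 3))
      cells, n = build_cells(pos, box=10.0, cutoff=2.5)
      print(len(neighbors(0, pos, cells, n, box=10.0, cutoff=2.5)))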

  1. Hurdles to Overcome to Model Carrington Class Events

    NASA Astrophysics Data System (ADS)

    Engel, M.; Henderson, M. G.; Jordanova, V. K.; Morley, S.

    2017-12-01

    Large geomagnetic storms pose a threat to both space- and ground-based infrastructure. In order to help mitigate that threat, a better understanding of the specifics of these storms is required. Various computer models are being used around the world to analyze the magnetospheric environment; however, they are largely inadequate for analyzing large and extreme storm-time environments. Here we report on the first steps towards expanding and robustifying the RAM-SCB inner magnetospheric model, used in conjunction with BATS-R-US and the Space Weather Modeling Framework, in order to simulate storms with Dst < -400 nT. These results will then be used to help expand our modelling capabilities towards including Carrington-class events.

  2. In silico evolution of biochemical networks

    NASA Astrophysics Data System (ADS)

    Francois, Paul

    2010-03-01

    We use computational evolution to select models of genetic networks that can be built from a predefined set of parts to achieve a certain behavior. Selection is made with the help of a fitness function that defines biological function in a quantitative way. This fitness has to be specific to a process, yet general enough to capture processes common to many species. Computational evolution favors models that can be built by incremental improvements in fitness rather than via multiple neutral steps or transitions through less fit intermediates. With the help of these simulations, we propose a kinetic view of evolution, in which networks are rapidly selected along a fitness gradient. This mathematics recapitulates Darwin's original insight that small changes in fitness can rapidly lead to the evolution of complex structures such as the eye, and explains the phenomenon of convergent/parallel evolution of similar structures in independent lineages. We will illustrate these ideas with networks implicated in embryonic development and patterning of vertebrates and primitive insects.

  3. WE-DE-202-00: Connecting Radiation Physics with Computational Biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or if the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicists' perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents. Understand how effects and processes at one biological scale impact biological processes and outcomes on other scales. J. Schuemann, NCI/NIH grants. S. McMahon, Funding: European Commission FP7 (grant EC FP7 MC-IOF-623630)

  4. Problems Related to Parallelization of CFD Algorithms on GPU, Multi-GPU and Hybrid Architectures

    NASA Astrophysics Data System (ADS)

    Błażewicz, Marek; Kurowski, Krzysztof; Ludwiczak, Bogdan; Napierała, Krystyna

    2010-09-01

    Computational Fluid Dynamics (CFD) is one of the branches of fluid mechanics that uses numerical methods and algorithms to solve and analyze fluid flows. CFD is used in various domains, such as oil and gas reservoir uncertainty analysis, aerodynamic shape optimization (e.g. planes, cars, ships, sport helmets, skis), analysis of natural phenomena, numerical simulation for weather forecasting, and realistic visualizations. CFD problems are very complex and need a great deal of computational power to obtain results in a reasonable time. We have implemented a parallel application for two-dimensional CFD simulation with a free-surface approximation (the MAC method) using new hardware architectures, in particular multi-GPU and hybrid computing environments. For this purpose we decided to use NVIDIA graphics cards with the CUDA environment due to its simplicity of programming and good computational performance. We used a finite difference discretization of the Navier-Stokes equations, where the fluid is propagated over an Eulerian grid. In this model, the behavior of the fluid inside a cell depends only on the properties of the local, surrounding cells; therefore it is well suited to the GPU-based architecture. In this paper we demonstrate how to use the computing power of GPUs efficiently for CFD. Additionally, we present some best practices to help users analyze and improve the performance of CFD applications executed on GPUs. Finally, we discuss various challenges around multi-GPU implementation using the example of matrix multiplication.
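
    The locality argument above, that each cell depends only on its immediate neighbours, is what lets one GPU thread own one grid cell. The NumPy sketch below shows the same stencil structure on a CPU for a Jacobi step of a pressure-Poisson-type equation; it is a generic illustration, not the authors' MAC solver.

      import numpy as np

      def jacobi_step(p, rhs, h):
          # Each interior cell is updated from its four neighbours only,
          # so all cells could be computed independently (one GPU thread each).
          new = p.copy()
          new[1:-1, 1:-1] = 0.25 * (p[2:, 1:-1] + p[:-2, 1:-1]
                                    + p[1:-1, 2:] + p[1:-1, :-2]
                                    - h * h * rhs[1:-1, 1:-1])
          return new

      p = np.zeros((64, 64))
      rhs = np.zeros((64, 64))
      rhs[32, 32] = 1.0  # a point source
      for _ in range(200):  # iterate toward the steady solution
          p = jacobi_step(p, rhs, h=1.0)
      print(round(float(p.min()), 4))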

  5. The preparedness level of final year medical students for an adequate medical approach to emergency cases: computer-based medical education in emergency medicine

    PubMed Central

    2014-01-01

    Background: We aimed to observe the preparedness level of final-year medical students in approaching emergencies through computer-based simulation training, and to evaluate the efficacy of the program. Methods: A computer-based prototype simulation program (Lsim), designed by researchers from the medical education and computer science departments, was used to present virtual cases for medical learning. Fifty-four final-year medical students from Ondokuz Mayis University School of Medicine attended an education program on June 20, 2012 and were trained with Lsim. Volunteer attendees completed pre-test and post-test exams at the beginning and end of the course, respectively, on the same day. Results: Twenty-nine of the 54 students who attended the course agreed to take the pre-test and post-test exams; 58.6% (n = 17) were female. Across 10 emergency medical cases, an average of 3.9 correct medical approaches were performed in the pre-test and an average of 9.6 correct medical approaches were performed in the post-test (t = 17.18, P = 0.006). Conclusions: This study's results showed that the readiness level of students for an adequate medical approach to emergency cases was very low. Computer-based training could help students approach various emergency cases adequately. PMID:24386919

  6. WE-DE-202-02: Are Track Structure Simulations Truly Needed for Radiobiology at the Cellular and Tissue Levels?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R.

    Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or if the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicists' perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents. Understand how effects and processes at one biological scale impact biological processes and outcomes on other scales. J. Schuemann, NCI/NIH grants. S. McMahon, Funding: European Commission FP7 (grant EC FP7 MC-IOF-623630)

  7. WE-DE-202-01: Connecting Nanoscale Physics to Initial DNA Damage Through Track Structure Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuemann, J.

    Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or if the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicists' perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents. Understand how effects and processes at one biological scale impact biological processes and outcomes on other scales. J. Schuemann, NCI/NIH grants. S. McMahon, Funding: European Commission FP7 (grant EC FP7 MC-IOF-623630)

  8. Scientific Discovery through Advanced Computing in Plasma Science

    NASA Astrophysics Data System (ADS)

    Tang, William

    2005-03-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's ``Scientific Discovery through Advanced Computing'' (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.

  9. The Flostation - an Immersive Cyberspace System

    NASA Technical Reports Server (NTRS)

    Park, Brian

    2006-01-01

    A flostation is a computer-controlled apparatus that, along with one or more computers and other computer-controlled equipment, is part of an immersive cyberspace system. The system is said to be immersive in two senses of the word: (1) it supports the body in a modified form of the neutral posture experienced in zero gravity, and (2) it is equipped with computer-controlled display equipment that helps to give the occupant of the chair a feeling of immersion in an environment that the system is designed to simulate. Neutral immersion was conceived during the Gemini program as a means of training astronauts for working in a zero-gravity environment. Current derivatives include neutral-buoyancy tanks and the KC-135 airplane, each of which mimics the effects of zero gravity. While these have performed well in simulating the shorter-duration flights typical of the space program to date, a training device that can take astronauts to the next level will be needed for simulating longer-duration flights such as those of the International Space Station. The flostation is expected to satisfy this need. The flostation could also be adapted and replicated for use in commercial ventures ranging from home entertainment to medical treatment. The use of neutral immersion in the flostation enables the occupant to recline in an optimal posture of rest and meditation. This posture combines savasana (known to practitioners of yoga) and a modified form of the neutral posture assumed by astronauts in outer space. As the occupant relaxes, awareness of the physical body is reduced. The neutral body posture, which can be maintained for hours without discomfort, is extended to the eyes, ears, and hands. The occupant can be surrounded with a full-field-of-view visual display and nearphone sound, and can be stimulated with full-body vibration and motion cueing. Once fully immersed, the occupant can use neutral hand controllers (that is, hand-posture sensors) to control various aspects of the simulated environment.

  10. A study to compute integrated dpa for neutron and ion irradiation environments using SRIM-2013

    NASA Astrophysics Data System (ADS)

    Saha, Uttiyoarnab; Devan, K.; Ganesan, S.

    2018-05-01

    Displacements per atom (dpa), estimated based on the standard Norgett-Robinson-Torrens (NRT) model, is used for assessing radiation damage effects in fast reactor materials. A computer code CRaD has been indigenously developed towards establishing the infrastructure to perform improved radiation damage studies in Indian fast reactors. We propose a method for computing multigroup neutron NRT dpa cross sections based on SRIM-2013 simulations. In this method, for each neutron group, the recoil or primary knock-on atom (PKA) spectrum and its average energy are first estimated with CRaD code from ENDF/B-VII.1. This average PKA energy forms the input for SRIM simulation, wherein the recoil atom is taken as the incoming ion on the target. The NRT-dpa cross section of iron computed with "Quick" Kinchin-Pease (K-P) option of SRIM-2013 is found to agree within 10% with the standard NRT-dpa values, if damage energy from SRIM simulation is used. SRIM-2013 NRT-dpa cross sections applied to estimate the integrated dpa for Fe, Cr and Ni are in good agreement with established computer codes and data. A similar study carried out for polyatomic material, SiC, shows encouraging results. In this case, it is observed that the NRT approach with average lattice displacement energy of 25 eV coupled with the damage energies from the K-P option of SRIM-2013 gives reliable displacement cross sections and integrated dpa for various reactor spectra. The source term of neutron damage can be equivalently determined in the units of dpa by simulating self-ion bombardment. This shows that the information of primary recoils obtained from CRaD can be reliably applied to estimate the integrated dpa and damage assessment studies in accelerator-based self-ion irradiation experiments of structural materials. This study would help to advance the investigation of possible correlations between the damages induced by ions and reactor neutrons.
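
    For reference, the NRT model underlying these cross sections maps a damage energy T_dam and a displacement threshold E_d to a displacement count as 0.8*T_dam/(2*E_d) above the threshold region. A minimal sketch follows; the example values are illustrative, not results from the study.

      def nrt_displacements(t_dam_ev, e_d_ev):
          # Norgett-Robinson-Torrens (NRT) displacement estimate.
          if t_dam_ev < e_d_ev:
              return 0.0  # below threshold: no stable displacement
          if t_dam_ev < 2.0 * e_d_ev / 0.8:
              return 1.0  # single-displacement regime
          return 0.8 * t_dam_ev / (2.0 * e_d_ev)

      # Example: a 10 keV damage-energy cascade in iron (E_d = 40 eV)
      print(nrt_displacements(10_000.0, 40.0))  # -> 100.0 displaced atoms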

  11. Evaluation of a computer-based educational intervention to improve medical teamwork and performance during simulated patient resuscitations.

    PubMed

    Fernandez, Rosemarie; Pearce, Marina; Grand, James A; Rench, Tara A; Jones, Kerin A; Chao, Georgia T; Kozlowski, Steve W J

    2013-11-01

    To determine the impact of a low-resource-demand, easily disseminated computer-based teamwork process training intervention on teamwork behaviors and patient care performance in code teams. A randomized comparison trial of computer-based teamwork training versus placebo training was conducted from August 2010 through March 2011. This study was conducted at the simulation suite within the Kado Family Clinical Skills Center, Wayne State University School of Medicine. Participants (n = 231) were fourth-year medical students and first-, second-, and third-year emergency medicine residents at Wayne State University. Each participant was assigned to a team of four to six members (n = 45 teams). Teams were randomly assigned to receive either a 25-minute computer-based training module targeting appropriate resuscitation teamwork behaviors or a placebo training module. Teamwork behaviors and patient care behaviors were video recorded during high-fidelity simulated patient resuscitations and coded by trained raters blinded to condition assignment and study hypotheses. Teamwork behavior items (e.g., "chest radiograph findings communicated to team" and "team member assists with intubation preparation") were standardized before combining to create overall teamwork scores. Similarly, patient care items ("chest radiograph correctly interpreted"; "time to start of compressions") were standardized before combining to create overall patient care scores. Subject matter expert reviews and pilot testing of scenario content, teamwork items, and patient care items provided evidence of content validity. When controlling for team members' medically relevant experience, teams in the training condition demonstrated better teamwork (F(1, 42) = 4.81, p < 0.05; ηp² = 0.10) and patient care (F(1, 42) = 4.66, p < 0.05; ηp² = 0.10) than did teams in the placebo condition. Computer-based team training positively impacts teamwork and patient care during simulated patient resuscitations. This low-resource team training intervention may help to address the dissemination and sustainability issues associated with larger, more costly team training programs.

  12. Development and validation of a computational model to study the effect of foot constraint on ankle injury due to external rotation.

    PubMed

    Wei, Feng; Hunley, Stanley C; Powell, John W; Haut, Roger C

    2011-02-01

    Recent studies, using two different manners of foot constraint, potted and taped, document altered failure characteristics in the human cadaver ankle under controlled external rotation of the foot. The posterior talofibular ligament (PTaFL) was commonly injured when the foot was constrained in potting material, while the frequency of deltoid ligament injury was higher for the taped foot. In this study, an existing multibody computational modeling approach was extended and validated to include the influence of foot constraint, determine the kinematics of the joint under external foot rotation, and consequently obtain strains in various ligaments. It was hypothesized that the location of ankle injury due to excessive levels of external foot rotation is a function of foot constraint. The results from this model simulation supported this hypothesis and helped to explain the mechanisms of injury in the cadaver experiments. An excessive external foot rotation might generate a PTaFL injury for a rigid foot constraint, and an anterior deltoid ligament injury for a pliant foot constraint. The computational models may be further developed and modified to simulate the human response for different shoe designs, as well as for various athletic shoe-surface interfaces, so as to provide a computational basis for optimizing athletic performance with minimal injury risk.

  13. 3D Surgical Simulation

    PubMed Central

    Cevidanes, Lucia; Tucker, Scott; Styner, Martin; Kim, Hyungmin; Chapuis, Jonas; Reyes, Mauricio; Proffit, William; Turvey, Timothy; Jaskolka, Michael

    2009-01-01

    This paper discusses the development of methods for computer-aided jaw surgery. Computer-aided jaw surgery allows us to incorporate the high level of precision necessary for transferring virtual plans into the operating room. We also present a complete computer-aided surgery (CAS) system developed in close collaboration with surgeons. Surgery planning and simulation include construction of 3D surface models from cone-beam CT (CBCT), dynamic cephalometry, semi-automatic mirroring, interactive cutting of bone, and bony segment repositioning. A virtual setup can be used to manufacture positioning splints for intra-operative guidance. The system provides further intra-operative assistance with the help of a computer display showing jaw positions and 3D positioning guides updated in real time during the surgical procedure. The CAS system aids in dealing with complex cases, with benefits for the patient, for surgical practice, and for orthodontic finishing. Advanced software tools for diagnosis and treatment planning allow preparation of detailed operative plans, osteotomy repositioning, bone reconstructions, surgical resident training, and assessment of the difficulty of surgical procedures prior to surgery. CAS has the potential to make the elaboration of the surgical plan a more flexible process, increase the level of detail and accuracy of the plan, yield higher operative precision and control, and enhance documentation of cases. Supported by NIDCR DE017727 and DE018962. PMID:20816308

  14. NETIMIS: Dynamic Simulation of Health Economics Outcomes Using Big Data.

    PubMed

    Johnson, Owen A; Hall, Peter S; Hulme, Claire

    2016-02-01

    Many healthcare organizations are now making good use of electronic health record (EHR) systems to record clinical information about their patients and the details of their healthcare. Electronic data in EHRs are generated by people engaged in complex processes within complex environments, and their human input, albeit shaped by computer systems, is compromised by many human factors. These data are potentially valuable to health economists and outcomes researchers but are sufficiently large and complex to be considered part of the new frontier of 'big data'. This paper describes emerging methods that draw together data mining, process modelling, activity-based costing and dynamic simulation models. Our research infrastructure includes safe links to Leeds hospital's EHRs with 3 million secondary and tertiary care patients. We created a multidisciplinary team of health economists, clinical specialists, and data and computer scientists, and developed a dynamic simulation tool called NETIMIS (Network Tools for Intervention Modelling with Intelligent Simulation; http://www.netimis.com ) suitable for visualization of both human-designed and data-mined processes, which can then be used for 'what-if' analysis by stakeholders interested in costing, designing and evaluating healthcare interventions. We present two examples of model development to illustrate how dynamic simulation can be informed by big data from an EHR. We found the tool provided a focal point for multidisciplinary teamwork, helping the team iteratively and collaboratively 'deep dive' into big data.

  15. Semi-physical simulation test for micro CMOS star sensor

    NASA Astrophysics Data System (ADS)

    Yang, Jian; Zhang, Guang-jun; Jiang, Jie; Fan, Qiao-yun

    2008-03-01

    A designed star sensor must be extensively tested before launch. Testing a star sensor is a complicated process requiring much time and many resources. Even observing the sky from the ground is a challenging and time-consuming job: it requires complicated and expensive equipment and a suitable time and location, and it is prone to interference from the weather. Moreover, not all stars distributed across the sky can be observed by this testing method. Semi-physical simulation in the laboratory reduces the testing cost and helps to debug, analyze and evaluate the star sensor system while the model is being developed. The test system is composed of an optical platform, a star field simulator, a star field simulator computer, the star sensor and the central data processing computer. The test system simulates starlight with high accuracy and good parallelism, and creates static or dynamic images in the FOV (Field of View). The conditions of the test are close to observing the real sky. With this system, the test of a micro star tracker designed by Beijing University of Aeronautics and Astronautics has been performed successfully. Several indices, including full-sky autonomous star identification time, attitude update frequency and attitude precision, meet the design requirements of the star sensor. Error sources of the testing system are also analyzed. It is concluded that the testing system is cost-saving and efficient, and that it contributes to optimizing the embedded algorithms, shortening the development cycle and improving engineering design processes.

  16. Findings and Challenges in Fine-Resolution Large-Scale Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Her, Y. G.

    2017-12-01

    Fine-resolution large-scale (FL) modeling can provide an overall picture of the hydrological cycle and transport while taking unique local conditions into account in the simulation. It can also help develop water resources management plans that are consistent across spatial scales by extensively describing the spatial consequences of decisions and hydrological events. FL modeling is expected to become common in the near future as global-scale remotely sensed data emerge and computing resources advance rapidly. There are several spatially distributed models available for hydrological analyses. Some of them rely on numerical methods such as finite difference/element methods (FDM/FEM), which, to describe two-dimensional overland processes, require either excessive computing resources to manipulate large matrices (implicit schemes) or small simulation time intervals to maintain the stability of the solution (explicit schemes). Others make unrealistic assumptions, such as constant overland flow velocity, to reduce the computational load of the simulation. Thus, simulation efficiency often comes at the expense of precision and reliability in FL modeling. Here, we introduce a new FL continuous hydrological model and its application to four watersheds of different landscapes and sizes, from 3.5 km2 to 2,800 km2, at a spatial resolution of 30 m on an hourly basis. The model provided acceptable accuracy statistics in reproducing hydrological observations made in the watersheds. The modeling outputs, including maps of simulated travel time, runoff depth, soil water content, and groundwater recharge, were animated, visualizing the dynamics of hydrological processes occurring in the watersheds during and between storm events. Findings and challenges are discussed in the context of modeling efficiency, accuracy, and reproducibility, which we found can be improved, respectively, by employing advanced computing techniques and hydrological understanding, by using remotely sensed hydrological observations such as soil moisture and radar rainfall depth, and by sharing the model and its code in the public domain.

  17. ISCR FY2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D E; McGraw, J R

    2006-02-02

    Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that ''computational science has become critical to scientific leadership, economic competitiveness, and national security''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''hands and feet'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.

  18. Computational approach on PEB process in EUV resist: multi-scale simulation

    NASA Astrophysics Data System (ADS)

    Kim, Muyoung; Moon, Junghwan; Choi, Joonmyung; Lee, Byunghoon; Jeong, Changyoung; Kim, Heebom; Cho, Maenghyo

    2017-03-01

    For decades, downsizing has been a key issue for achieving high performance and low cost in semiconductors, and extreme ultraviolet lithography is one of the promising candidates for achieving this goal. Post-exposure bake, the predominant process in extreme ultraviolet lithography for determining resolution and sensitivity, has mainly been studied experimentally, and development of its photoresist has reached a breaking point because the mechanisms at work during the process remain unclear. Herein, we provide a theoretical approach to investigate the underlying mechanism of the post-exposure bake process in chemically amplified resist, covering three important reactions during the process: acid generation by photo-acid generator dissociation, acid diffusion, and deprotection. Density functional theory calculations (quantum mechanical simulation) were conducted to quantitatively predict the activation energies and probabilities of the chemical reactions, and these were applied to molecular dynamics simulation to construct a reliable computational model. The overall chemical reactions were then simulated in the molecular dynamics unit cell, and the final configuration of the photoresist was used to predict the line edge roughness. The presented multiscale model unifies the phenomena of both quantum and atomic scales during the post-exposure bake process, and it will be helpful for understanding the critical factors affecting the performance of the resulting photoresist and for designing next-generation materials.
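
    One concrete form of the quantum-to-atomistic hand-off described above is converting a computed activation energy into a per-attempt reaction probability at the bake temperature via a Boltzmann factor. The sketch below is schematic, with a hypothetical deprotection barrier, and is not the paper's actual coupling scheme.

      import math

      K_B = 8.617e-5  # Boltzmann constant in eV/K

      def reaction_probability(e_act_ev, temp_k):
          # Arrhenius/Boltzmann factor for one reaction attempt.
          return math.exp(-e_act_ev / (K_B * temp_k))

      # Hypothetical 0.9 eV deprotection barrier at a 110 C (383 K) bake
      print(f"{reaction_probability(0.9, 383.0):.2e}")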

  19. Understanding G Protein-Coupled Receptor Allostery via Molecular Dynamics Simulations: Implications for Drug Discovery.

    PubMed

    Basith, Shaherin; Lee, Yoonji; Choi, Sun

    2018-01-01

    Unraveling the mystery of protein allostery has been one of the greatest challenges in both structural and computational biology. However, recent advances in computational methods, particularly molecular dynamics (MD) simulations, have led to its utility as a powerful and popular tool for the study of protein allostery. By capturing the motions of a protein's constituent atoms, simulations can enable the discovery of allosteric hot spots and the determination of the mechanistic basis for allostery. These structural and dynamic studies can provide a foundation for a wide range of applications, including rational drug design and protein engineering. In our laboratory, the use of MD simulations and network analysis assisted in the elucidation of the allosteric hotspots and intracellular signal transduction of G protein-coupled receptors (GPCRs), primarily for one of the adenosine receptor subtypes, the A2A adenosine receptor (A2AAR). In this chapter, we describe a method for calculating the map of allosteric signal flow in different GPCR conformational states and illustrate how these concepts have been utilized in understanding the mechanism of GPCR allostery. These structural studies will provide valuable insights into the allosteric and orthosteric modulations that would be of great help in designing novel drugs targeting GPCRs in pathological states.

  20. Simulation model for port shunting yards

    NASA Astrophysics Data System (ADS)

    Rusca, A.; Popa, M.; Rosca, E.; Rosca, M.; Dragu, V.; Rusca, F.

    2016-08-01

    Sea ports are important nodes in the supply chain, joining two high-capacity transport modes: rail and maritime transport. The huge cargo flows transiting a port require high-capacity constructions and installations such as berths, large-capacity cranes and shunting yards. However, the specificity of port shunting yards raises several problems, such as limited access (since these are terminus stations for the rail network), large transit flows of cargo in and out relative to the scarcity of ship departures/arrivals, and limited land availability for implementing solutions to serve these flows. It is necessary to identify technological solutions that address these problems. The paper proposes a simulation model, developed with the ARENA computer simulation software, suitable for shunting yards that serve sea ports with access to the rail network. The principal aspects of shunting yards are investigated, along with adequate measures to increase their transit capacity. The operating capacity of the shunting-yard subsystem is assessed taking into consideration the required operating standards, and measures of performance of the railway station (e.g. waiting time for freight wagons, number of railway lines in the station, storage area, etc.) are computed. The conclusions and results drawn from the simulation help transport and logistics specialists to test proposals for improving port management.

  1. CFD Modelling of a Quadrupole Vortex Inside a Cylindrical Channel for Research into Advanced Hybrid Rocket Designs

    NASA Astrophysics Data System (ADS)

    Godfrey, B.; Majdalani, J.

    2014-11-01

    This study relies on computational fluid dynamics (CFD) tools to analyse a possible method for creating a stable quadrupole vortex within a simulated, circular-port, cylindrical rocket chamber. A model of the vortex generator is created in the SolidWorks CAD program, and the grid is then generated using the Pointwise mesh generation software. The non-reactive flowfield is simulated using an open-source computational program, Stanford University Unstructured (SU2). Subsequent analysis and visualization are performed using ParaView. The vortex generation approach that we employ consists of four tangentially injected monopole vortex generators arranged symmetrically with respect to the center of the chamber in such a way as to produce a quadrupole vortex with a common downwash. The present investigation focuses on characterizing the flow dynamics so that future investigations can be undertaken with increasing levels of complexity. Our CFD simulations help to elucidate the onset of vortex filaments within the monopole tubes and the evolution of quadrupole vortices downstream of the injection faceplate. Our results indicate that the quadrupole vortices produced using the present injection pattern can quickly become unstable, to the extent of dissipating soon after being introduced into the simulated rocket chamber. We conclude that a change in the geometrical configuration will be necessary to produce more stable quadrupoles.

  2. Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues

    NASA Astrophysics Data System (ADS)

    Chakravarthy, Srinivas R.; Rumyantsev, Alexander

    2018-03-01

    Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses, as well as academia, by providing needed computing capacity. As an important alternative to cloud computing, desktop grids allow the idle computer resources of an enterprise or community to be utilized in a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time decrease the expected latency for users. The crucial parameter for optimization, both in cloud computing and in desktop grids, is the level of redundancy (replication) for service requests/workunits. In this paper we study optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull distributions. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.
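
    The replication trade-off can be previewed with a crude Monte Carlo experiment: send each request to r servers and keep the first completion. The sketch below uses a shifted-exponential service time, one of the distributions named above, and ignores queueing entirely, so it illustrates only the latency side of the trade-off, not the MAP/G/c analysis itself.

      import numpy as np

      rng = np.random.default_rng(42)
      N = 100_000  # simulated requests

      for r in (1, 2, 3, 4):  # replication level
          # Shifted-exponential service times: 0.5 + Exp(mean 1.0)
          samples = 0.5 + rng.exponential(scale=1.0, size=(N, r))
          latency = samples.min(axis=1)  # first replica to finish wins
          print(f"r={r}: mean latency {latency.mean():.3f}")

    Mean latency falls as 0.5 + 1/r here, but each extra replica also consumes server capacity, which is why an optimal rather than maximal replication level exists.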

  3. A pedagogical walkthrough of computational modeling and simulation of Wnt signaling pathway using static causal models in MATLAB.

    PubMed

    Sinha, Shriprakash

    2016-12-01

    Simulation studies in systems biology involving computational experiments on Wnt signaling pathways abound in the literature, but they often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition who intend to work on the modeling of the pathway. This paucity might be due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational modeling of the Wnt signaling pathway in a human colorectal cancer dataset using static Bayesian network models is provided. The walkthrough might aid biologists/informaticians in understanding the design of computational experiments and is interleaved with exposition of the Matlab code and causal models from the Bayesian network toolbox. The manuscript elucidates the coding contents of the advance article by Sinha (Integr. Biol. 6:1034-1048, 2014) and takes the reader in a step-by-step process through how (a) the collection and transformation of the available biological information from the literature is done, (b) the integration of the heterogeneous data and prior biological knowledge in the network is achieved, (c) the simulation study is designed, (d) a hypothesis regarding a biological phenomenon is transformed into a computational framework, and (e) results and inferences drawn using d-connectivity/separability are reported. The manuscript ends with a programming assignment to help readers get hands-on experience of a perturbation project. Description of the Matlab files is made available under the GNU GPL v3 license at the Google Code project on https://code.google.com/p/static-bn-for-wnt-signaling-pathway and https://sites.google.com/site/shriprakashsinha/shriprakashsinha/projects/static-bn-for-wnt-signaling-pathway. The latest updates can be found on the latter website.
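
    The d-connectivity/separability idea that drives the inference step can be seen in a three-node toy chain (a generic example, not the Wnt network itself): with gene -> transcription factor -> target, the two ends are correlated until the middle node is observed.

      import numpy as np

      rng = np.random.default_rng(7)
      a = rng.binomial(1, 0.5, 50_000)                 # upstream node
      b = rng.binomial(1, np.where(a == 1, 0.9, 0.1))  # depends on a
      c = rng.binomial(1, np.where(b == 1, 0.8, 0.2))  # depends only on b

      print("corr(A, C)        :", round(np.corrcoef(a, c)[0, 1], 3))
      m = b == 1                                       # observe B = 1
      print("corr(A, C | B = 1):", round(np.corrcoef(a[m], c[m])[0, 1], 3))

    Conditioning on B d-separates A from C, so the second correlation is near zero; this is the same style of reasoning the walkthrough applies when reporting inferences on the Wnt models.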

  4. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1997-01-01

    NASA maintains an applications-oriented computational fluid dynamics (CFD) effort complementary to and in support of the aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in the fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that will help engineers, scientists and CFD practitioners to analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing CAGI: the Computer Aided Grid Interface system. The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  5. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1996-01-01

    NASA maintains an applications-oriented computational fluid dynamics (CFD) effort complementary to and in support of the aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in the fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that will help engineers, scientists and CFD practitioners to analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing the Computer Aided Grid Interface (CAGI) system. The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  6. Reverse time migration by Krylov subspace reduced order modeling

    NASA Astrophysics Data System (ADS)

    Basir, Hadi Mahdavi; Javaherian, Abdolrahim; Shomali, Zaher Hossein; Firouz-Abadi, Roohollah Dehghani; Gholamy, Shaban Ali

    2018-04-01

    Imaging is a key step in seismic data processing. To date, a myriad of advanced pre-stack depth migration approaches have been developed; however, reverse time migration (RTM) is still considered the high-end imaging algorithm. The main limitations on the performance of reverse time migration are the intensive computation of the forward and backward simulations, the time consumption, and the memory allocation related to the imaging condition. Based on reduced order modeling, we propose an algorithm that addresses all the aforementioned factors. Our method uses the Krylov subspace method to compute certain mode shapes of the velocity model, which serve as an orthogonal basis for the reduced order model. Reverse time migration by reduced order modeling lends itself to highly parallel computation and strongly reduces the memory requirement of reverse time migration. The synthetic model results showed that the suggested method can decrease the computational costs of reverse time migration by several orders of magnitude compared with reverse time migration by the finite element method.
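
    A minimal sketch of the projection idea, under stated assumptions: a random matrix stands in for the discretized wave operator, and an Arnoldi iteration builds an orthonormal Krylov basis onto which the operator is projected. This is the generic Krylov reduced-order-modeling recipe, not the authors' RTM implementation.

    ```python
    # Arnoldi sketch: build an orthonormal Krylov basis V for a stand-in
    # operator A, then project A onto it; propagation can then be done in
    # the m-dimensional reduced space instead of the full n-dimensional one.
    import numpy as np

    def arnoldi(A, b, m):
        """Orthonormal basis of K_m(A, b) = span{b, Ab, ..., A^(m-1) b}."""
        V = np.zeros((b.size, m))
        V[:, 0] = b / np.linalg.norm(b)
        for j in range(m - 1):
            w = A @ V[:, j]
            for i in range(j + 1):                 # modified Gram-Schmidt
                w -= (V[:, i] @ w) * V[:, i]
            V[:, j + 1] = w / np.linalg.norm(w)
        return V

    n, m = 500, 30
    A = np.random.default_rng(0).standard_normal((n, n)) / np.sqrt(n)  # stand-in operator
    b = np.ones(n)                                                     # stand-in source
    V = arnoldi(A, b, m)
    A_red = V.T @ A @ V     # 30 x 30 instead of 500 x 500
    print(A_red.shape)
    ```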

  7. Computational compliance criteria in water hammer modelling

    NASA Astrophysics Data System (ADS)

    Urbanowicz, Kamil

    2017-10-01

    Among the many numerical methods (finite difference, finite element, finite volume, etc.) used to solve the system of partial differential equations describing unsteady pipe flow, the method of characteristics (MOC) is the most appreciated. With its help, it is possible to examine the effect of the numerical discretisation carried over the pipe length. It was noticed, based on the tests performed in this study, that convergence of the calculation results occurred on a rectangular grid with the division of each pipe of the analysed system into at least 10 elements. Therefore, it is advisable to introduce computational compliance criteria (CCC), which will be responsible for optimal discretisation of the examined system. The results of this study, based on the assumption of various values of the Courant-Friedrichs-Lewy (CFL) number, also indicate that the CFL number should be equal to one for optimum computational results. Application of the CCC to self-written and commercial computer programmes based on the method of characteristics will guarantee fast simulations and the necessary computational coherence.
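
    The two recommendations translate directly into a grid-selection rule, applied in the short Python sketch below: at least 10 reaches per pipe, and a time step chosen so the Courant number a·Δt/Δx equals one. The pipe lengths and wave speeds are made-up examples.

    ```python
    # Grid-selection rule implied by the study: >= 10 reaches per pipe and
    # a time step with Courant number a*dt/dx = 1. Pipe data are made up.
    def moc_grid(length_m, wave_speed_ms, reaches=10):
        dx = length_m / reaches          # at least 10 elements per pipe
        dt = dx / wave_speed_ms          # CFL = a*dt/dx = 1 exactly
        return dx, dt

    for L, a in [(100.0, 1200.0), (250.0, 1000.0)]:
        dx, dt = moc_grid(L, a)
        print(f"L={L:.0f} m, a={a:.0f} m/s -> dx={dx:.1f} m, dt={dt * 1e3:.3f} ms")
    ```

    In a multi-pipe system the MOC additionally needs a common time step, so in practice the reach count of each pipe is adjusted until every pipe's a·Δt/Δx stays acceptably close to one.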

  8. Real-Time Climate Simulations in the Interactive 3D Game Universe Sandbox ²

    NASA Astrophysics Data System (ADS)

    Goldenson, N. L.

    2014-12-01

    Exploration in an open-ended computer game is an engaging way to explore climate and climate change. Everyone can explore physical models with real-time visualization in the educational simulator Universe Sandbox ² (universesandbox.com/2), which includes basic climate simulations on planets. I have implemented a time-dependent, one-dimensional meridional heat transport energy balance model that runs and is adjustable in real time in the midst of a larger simulated system. Universe Sandbox ² is based on the original game - at its core a gravity simulator - with new physically based content for stellar evolution and for handling collisions between bodies. Existing users are mostly science enthusiasts in informal settings. We believe that this is the first climate simulation to be implemented in a professionally developed computer game with modern 3D graphical output in real time. The type of simple climate model we've adopted helps us depict the seasonal cycle and the more drastic changes that come from changing the orbit or other external forcings. Users can alter the climate as the simulation is running by altering the star(s) in the simulation, dragging to change orbits and obliquity, adjusting the climate simulation parameters directly, or changing other properties like CO2 concentration that affect the model parameters in representative ways. Ongoing visuals of the expansion and contraction of sea ice and snow cover respond to the temperature calculations and make it accessible to explore a variety of scenarios and intuitive to understand the output. Variables like temperature can also be graphed in real time. We balance computational constraints with the ability to capture the physical phenomena we wish to visualize, giving everyone access to a simple open-ended meridional energy balance climate simulation to explore and experiment with. The software lends itself to labs at a variety of levels about climate concepts including seasons, the greenhouse effect, reservoirs and flows, albedo feedback, Snowball Earth, climate sensitivity, and model experiment design. Climate calculations are extended to Mars with some modifications to the Earth climate component, and could be used in lessons about the Mars atmosphere and in exploring scenarios of Mars climate history.
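
    For readers who want a feel for what such a model computes, here is a crude, self-contained Python sketch of a time-dependent one-dimensional meridional energy balance model: latitude bands, annual-mean insolation, a linearized outgoing-longwave term, diffusive meridional transport, and a step-function ice albedo. The parameter values and the flat-geometry diffusion are textbook-style simplifications assumed for illustration, not Universe Sandbox ² internals.

    ```python
    # Crude 1-D meridional energy balance model: absorbed solar in,
    # linearized longwave out, diffusive transport, step-function ice
    # albedo. Illustrative parameters; no spherical cos-lat weighting.
    import numpy as np

    nlat = 46
    lat = np.deg2rad(np.linspace(-89.0, 89.0, nlat))
    S = 340.0 * (1.0 - 0.48 * (3.0 * np.sin(lat) ** 2 - 1.0) / 2.0)  # W/m^2
    A, B = 210.0, 2.0          # OLR = A + B*T  (W/m^2, T in deg C)
    D = 0.55                   # diffusion coefficient (W m^-2 K^-1)
    C = 4e7 / 3.15e7           # heat capacity, in W yr m^-2 K^-1
    dt = 1.0 / 365.0           # time step: one day, in years

    T = np.full(nlat, 10.0)
    for _ in range(100 * 365):                       # spin up 100 model years
        albedo = np.where(T < -10.0, 0.62, 0.30)     # ice-covered vs ice-free
        transport = D * np.gradient(np.gradient(T, lat), lat)
        T += dt / C * (S * (1.0 - albedo) - (A + B * T) + transport)

    print(f"pole {T[0]:.1f} C, equator {T[nlat // 2]:.1f} C")
    ```

    Even this stripped-down version exhibits the behavior the abstract highlights: moving the ice-albedo threshold or the insolation shifts the ice line, which is the mechanism behind Snowball Earth style lab exercises.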

  9. User Interface Developed for Controls/CFD Interdisciplinary Research

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The NASA Lewis Research Center, in conjunction with the University of Akron, is developing analytical methods and software tools to create a cross-discipline "bridge" between controls and computational fluid dynamics (CFD) technologies. Traditionally, the controls analyst has used simulations based on large lumping techniques to generate low-order linear models convenient for designing propulsion system controls. For complex, high-speed vehicles such as the High Speed Civil Transport (HSCT), simulations based on CFD methods are required to capture the relevant flow physics. The use of CFD should also help reduce the development time and costs associated with experimentally tuning the control system. The initial application for this research is the High Speed Civil Transport inlet control problem. A major aspect of this research is the development of a controls/CFD interface for non-CFD experts, to facilitate the interactive operation of CFD simulations and the extraction of reduced-order, time-accurate models from CFD results. A distributed computing approach for implementing the interface is being explored. Software being developed as part of the Integrated CFD and Experiments (ICE) project provides the basis for the operating environment, including run-time displays and information (data base) management. Message-passing software is used to communicate between the ICE system and the CFD simulation, which can reside on distributed, parallel computing systems. Initially, the one-dimensional Large-Perturbation Inlet (LAPIN) code is being used to simulate a High Speed Civil Transport type inlet. LAPIN can model real supersonic inlet features, including bleeds, bypasses, and variable geometry, such as translating or variable-ramp-angle centerbodies. Work is in progress to use parallel versions of the multidimensional NPARC code.

  10. Comparison of groundwater flow in Southern California coastal aquifers

    USGS Publications Warehouse

    Hanson, Randall T.; Izbicki, John A.; Reichard, Eric G.; Edwards, Brian D.; Land, Michael; Martin, Peter

    2009-01-01

    Maintaining the sustainability of Southern California coastal aquifers requires joint management of surface water and groundwater (conjunctive use). This requires new data collection and analyses (including research drilling, modern geohydrologic investigations, and development of detailed computer groundwater models that simulate the supply and demand components separately), implementation of new facilities (including spreading and injection facilities for artificial recharge), and establishment of new institutions and policies that help to sustain the water resources and better manage regional development.

  11. The NASA Ames Hypersonic Combustor-Model Inlet CFD Simulations and Experimental Comparisons

    NASA Technical Reports Server (NTRS)

    Venkatapathy, E.; Tokarcik-Polsky, S.; Deiwert, G. S.; Edwards, Thomas A. (Technical Monitor)

    1995-01-01

    Computations have been performed on a three-dimensional inlet associated with the NASA Ames combustor model for the hypersonic propulsion experiment in the 16-inch shock tunnel. The 3-dimensional inlet was designed to make the combustor inlet flow nearly two-dimensional while providing the mass flow necessary for combustion. The 16-inch shock tunnel experiment is a short-duration test with a test time of the order of milliseconds. The flow through the inlet is in chemical non-equilibrium. Two test entries have been completed, and limited experimental results for the inlet region of the combustor model are available. A number of CFD simulations, with various levels of simplification (2-D simulations, 3-D simulations with and without chemical reactions, simulations with and without turbulent conditions, etc.), have been performed. These simulations have helped determine the model inlet flow characteristics, the important factors that affect the combustor inlet flow, and the sensitivity of the flow field to these simplifications. In the proposed paper, CFD modeling of the hypersonic inlet, results from the simulations, and comparison with available experimental results will be presented.

  12. Simulation and animation of sensor-driven robots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, C.; Trivedi, M.M.; Bidlack, C.R.

    1994-10-01

    Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system will help the users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of software development is increased, the reliability of the software and the operational safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.

  13. Visualization of simulated urban spaces: inferring parameterized generation of streets, parcels, and aerial imagery.

    PubMed

    Vanegas, Carlos A; Aliaga, Daniel G; Benes, Bedrich; Waddell, Paul

    2009-01-01

    Urban simulation models and their visualization are used to help regional planning agencies evaluate alternative transportation investments, land use regulations, and environmental protection policies. Typical urban simulations provide spatially distributed data about the number of inhabitants, land prices, traffic, and other variables. In this article, we build on a synergy of urban simulation, urban visualization, and computer graphics to automatically infer an urban layout for any time step of the simulation sequence. In addition to standard visualization tools, our method gathers data on the original street network, parcels, and aerial imagery and uses the available simulation results to infer changes to the original urban layout, producing a new and plausible layout for the simulation results. In contrast with previous work, our approach automatically updates the layout based on changes in the simulation data and thus can scale to a large simulation over many years. The method in this article offers a substantial step forward in building integrated visualization and behavioral simulation systems for use in community visioning, planning, and policy analysis. We demonstrate our method on several real cases using a 200 GB database for a 16,300 km² area surrounding Seattle.

  14. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Kuranz, Carolyn; Fein, Jeff; Wan, Willow; Young, Rachel; Keiter, Paul; Drake, R. Paul

    2015-11-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction, and laser ray tracing. This poster will demonstrate some of the experiments the CRASH code has helped design or analyze, including Kelvin-Helmholtz and Rayleigh-Taylor instabilities, magnetized flows, jets, and laser-produced plasmas. This work is funded by the following grants: DEFC52-08NA28616, DE-NA0001840, and DE-NA0002032.

  15. Tuning the critical solution temperature of polymers by copolymerization

    NASA Astrophysics Data System (ADS)

    Schulz, Bernhard; Chudoba, Richard; Heyda, Jan; Dzubiella, Joachim

    2015-12-01

    We study statistical copolymerization effects on the upper critical solution temperature (CST) of generic homopolymers by means of coarse-grained Langevin dynamics computer simulations and mean-field theory. Our systematic investigation reveals that the CST can change monotonically or non-monotonically with copolymerization, as observed in experimental studies, depending on the degree of non-additivity of the monomer (A-B) cross-interactions. The simulation findings are confirmed and qualitatively explained by a combination of a two-component Flory-de Gennes model for polymer collapse and a simple thermodynamic expansion approach. Our findings provide some rationale behind the effects of copolymerization and may be helpful for tuning CST behavior of polymers in soft material design.

  16. Simulated Breeding

    NASA Astrophysics Data System (ADS)

    Unemi, Tatsuo

    This chapter describes a basic framework for simulated breeding, a type of interactive evolutionary computing used to breed artifacts, whose origin is Blind Watchmaker by Dawkins. These methods make it easy for a human to design a complex object adapted to his/her subjective criteria, much as with the agricultural products we have been developing over thousands of years. Starting from a randomly initialized genome, the solution candidates are improved through several generations of artificial selection. The graphical user interface supports the breeding process with the techniques of a multifield user interface and partial breeding. The former improves the diversity of individuals, which prevents the search from being trapped at a local optimum. The latter makes it possible for the user to fix features with which he/she is already satisfied. These methods were examined through artistic applications by the author: SBART for graphic art and SBEAT for music. Combined with a direct genome editor and export to other graphical or musical tools on the computer, they can be powerful tools for artistic creation. These systems may contribute to the creation of a new type of culture.

  17. The development of expertise using an intelligent computer-aided training system

    NASA Technical Reports Server (NTRS)

    Johnson, Debra Steele

    1991-01-01

    An initial examination was conducted of an Intelligent Tutoring System (ITS) developed for use in industry. The ITS, developed by NASA, simulated a satellite deployment task. More specifically, the PD (Payload Assist Module Deployment)/ICAT (Intelligent Computer Aided Training) System simulated a nominal Payload Assist Module (PAM) deployment. The development of expertise on this task was examined using three Flight Dynamics Officer (FDO) candidates who had no previous experience with this task. The results indicated that performance improved rapidly until Trial 5, followed by more gradual improvements through Trial 12. The performance dimensions measured included performance speed, actions completed, errors, help required, and display fields checked. Suggestions for further refining the software and for deciding when to expose trainees to more difficult task scenarios are discussed. Further, the results provide an initial demonstration of the effectiveness of the PD/ICAT system in training the nominal PAM deployment task and indicate the potential benefits of using ITSs for training other FDO tasks.

  18. On the interpretation of kernels - Computer simulation of responses to impulse pairs

    NASA Technical Reports Server (NTRS)

    Hung, G.; Stark, L.; Eykhoff, P.

    1983-01-01

    A method is presented for the use of a unit impulse response and responses to impulse pairs of variable separation in the calculation of the second-degree kernels of a quadratic system. A quadratic system may be built from simple linear terms of known dynamics and a multiplier. Computer simulation results on quadratic systems with building elements of various time constants indicate that the larger time-constant term before multiplication dominates the envelope of the off-diagonal kernel curves as these move perpendicular to and away from the main diagonal. The smaller time-constant term before multiplication combines with the effect of the time constant after multiplication to dominate the kernel curves in the direction of the second-degree impulse response, i.e., parallel to the main diagonal. Such insight may be helpful in recognizing essential aspects of (second-degree) kernels; it may be used to simplify the model structure and, perhaps, add to the physical/physiological understanding of the underlying processes.
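
    The impulse-pair idea can be reproduced numerically in a few lines. In the Python sketch below, a quadratic system is assembled exactly as the abstract describes - two first-order linear filters feeding a multiplier - and one off-diagonal slice of the second-degree kernel is recovered as half the difference between the pair response and the two single-impulse responses. The time constants, time step, and impulse separation are illustrative choices.

    ```python
    # Quadratic system = two first-order lags feeding a multiplier; an
    # off-diagonal kernel slice follows from pair minus single responses.
    import numpy as np

    dt, n = 0.01, 1000

    def first_order(u, tau):
        """First-order lag, forward-Euler discretization."""
        y = np.zeros_like(u)
        for k in range(1, len(u)):
            y[k] = y[k - 1] + dt / tau * (u[k - 1] - y[k - 1])
        return y

    def quadratic_system(u, tau_a=0.1, tau_b=0.5):
        return first_order(u, tau_a) * first_order(u, tau_b)  # multiplier stage

    def impulse(at_step):
        u = np.zeros(n)
        u[at_step] = 1.0 / dt          # unit-area discrete impulse
        return u

    sep = 20                           # impulse separation, in steps
    y_a = quadratic_system(impulse(0))
    y_b = quadratic_system(impulse(sep))
    y_pair = quadratic_system(impulse(0) + impulse(sep))
    # Because the system is exactly quadratic, the subtraction isolates the
    # cross term, i.e. twice one off-diagonal slice of the kernel.
    h2_slice = (y_pair - y_a - y_b) / 2.0
    print(h2_slice.max())
    ```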

  19. A Percolation Model for Fracking

    NASA Astrophysics Data System (ADS)

    Norris, J. Q.; Turcotte, D. L.; Rundle, J. B.

    2014-12-01

    Developments in fracking technology have enabled the recovery of vast reserves of oil and gas; yet there is very little publicly available scientific research on fracking. Traditional reservoir simulator models for fracking are computationally expensive and require many hours on a supercomputer to simulate a single fracking treatment. We have developed a computationally inexpensive percolation model for fracking that can be used to understand the processes and risks associated with fracking. In our model, a fluid is injected at a single site and a network of fractures grows from that site. The fracture network grows in bursts: the failure of a relatively strong bond followed by the failure of a series of relatively weak bonds. These bursts display similarities to the microseismic events observed during a fracking treatment. The bursts follow a power-law (Gutenberg-Richter) frequency-size distribution and have growth rates similar to observed earthquake moment rates. These are quantifiable features that can be compared to observed microseismicity to help understand the relationship between observed microseismicity and the underlying fracture network.
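
    An invasion-percolation model of this kind is compact enough to sketch directly. In the Python example below, bonds on a 2-D lattice receive random strengths, fluid is injected at the central site, the weakest bond on the current boundary fails at each step, and bursts are counted as runs of failures weaker than the bond that initiated them. The lattice size, strength distribution, and burst bookkeeping are illustrative guesses at the model class, not the authors' exact formulation.

    ```python
    # Invasion percolation from a single injection site: the weakest bond
    # on the current boundary fails at each step; bursts are runs of
    # failures weaker than the bond that initiated them.
    import heapq
    import random

    random.seed(0)
    N = 101                            # N x N lattice of sites
    start = (N // 2, N // 2)
    invaded = {start}
    frontier = []                      # heap of (bond strength, site)

    def push_neighbors(site):
        x, y = site
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nxt[0] < N and 0 <= nxt[1] < N and nxt not in invaded:
                heapq.heappush(frontier, (random.random(), nxt))

    push_neighbors(start)
    strengths = []
    while len(strengths) < 2000:       # grow the fracture network
        s, site = heapq.heappop(frontier)
        if site in invaded:
            continue
        invaded.add(site)
        strengths.append(s)
        push_neighbors(site)

    bursts, size, threshold = [], 0, -1.0
    for s in strengths:
        if s > threshold:              # a stronger bond starts a new burst
            if size:
                bursts.append(size)
            threshold, size = s, 1
        else:                          # weaker bonds extend the current burst
            size += 1
    bursts.append(size)
    print(f"{len(bursts)} bursts, largest {max(bursts)}")
    ```

    A histogram of the burst sizes collected this way is what would be compared against the Gutenberg-Richter frequency-size statistics of observed microseismicity.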

  20. Hydrodynamic and Longitudinal Impedance Analysis of Cerebrospinal Fluid Dynamics at the Craniovertebral Junction in Type I Chiari Malformation

    PubMed Central

    Martin, Bryn A.; Kalata, Wojciech; Shaffer, Nicholas; Fischer, Paul; Luciano, Mark; Loth, Francis

    2013-01-01

    Elevated or reduced velocity of cerebrospinal fluid (CSF) at the craniovertebral junction (CVJ) has been associated with type I Chiari malformation (CMI). Thus, quantification of hydrodynamic parameters that describe the CSF dynamics could help assess disease severity and surgical outcome. In this study, we describe the methodology to quantify CSF hydrodynamic parameters near the CVJ and upper cervical spine utilizing subject-specific computational fluid dynamics (CFD) simulations based on in vivo MRI measurements of flow and geometry. Hydrodynamic parameters were computed for a healthy subject and two CMI patients both pre- and post-decompression surgery to determine the differences between cases. For the first time, we present the methods to quantify longitudinal impedance (LI) to CSF motion, a subject-specific hydrodynamic parameter that may have value to help quantify the CSF flow blockage severity in CMI. In addition, the following hydrodynamic parameters were quantified for each case: maximum velocity in systole and diastole, Reynolds and Womersley number, and peak pressure drop during the CSF cardiac flow cycle. The following geometric parameters were quantified: cross-sectional area and hydraulic diameter of the spinal subarachnoid space (SAS). The mean values of the geometric parameters increased post-surgically for the CMI models, but remained smaller than the healthy volunteer. All hydrodynamic parameters, except pressure drop, decreased post-surgically for the CMI patients, but remained greater than in the healthy case. Peak pressure drop alterations were mixed. To our knowledge this study represents the first subject-specific CFD simulation of CMI decompression surgery and quantification of LI in the CSF space. Further study in a larger patient and control group is needed to determine if the presented geometric and/or hydrodynamic parameters are helpful for surgical planning. PMID:24130704
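
    Two of the listed hydrodynamic parameters can be computed directly from the geometric and flow quantities the study measures. The Python snippet below does so; the hydraulic diameter, peak velocity, heart rate, and CSF kinematic viscosity are placeholder values, not the paper's data.

    ```python
    # Reynolds and Womersley numbers from subject-specific quantities.
    # All numerical values below are illustrative placeholders.
    import math

    NU_CSF = 0.7e-6    # kinematic viscosity of CSF, m^2/s (close to water at 37 C)

    def reynolds(peak_velocity, hydraulic_diameter, nu=NU_CSF):
        """Re = U * D / nu."""
        return peak_velocity * hydraulic_diameter / nu

    def womersley(hydraulic_diameter, heart_rate_hz=1.0, nu=NU_CSF):
        """alpha = (D / 2) * sqrt(omega / nu), with omega = 2*pi*f."""
        return (hydraulic_diameter / 2.0) * math.sqrt(2.0 * math.pi * heart_rate_hz / nu)

    D = 0.008    # hydraulic diameter of the spinal SAS, m (placeholder)
    U = 0.03     # peak CSF velocity, m/s (placeholder)
    print(f"Re = {reynolds(U, D):.0f}, Womersley alpha = {womersley(D):.1f}")
    ```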

  1. Spectroscopic fingerprints of toroidal nuclear quantum delocalization via ab initio path integral simulations.

    PubMed

    Schütt, Ole; Sebastiani, Daniel

    2013-04-05

    We investigate the quantum-mechanical delocalization of hydrogen in rotational symmetric molecular systems. To this purpose, we perform ab initio path integral molecular dynamics simulations of a methanol molecule to characterize the quantum properties of hydrogen atoms in a representative system by means of their real-space and momentum-space densities. In particular, we compute the spherically averaged momentum distribution n(k) and the pseudoangular momentum distribution n(kθ). We interpret our results by comparing them to path integral samplings of a bare proton in an ideal torus potential. We find that the hydroxyl hydrogen exhibits a toroidal delocalization, which leads to characteristic fingerprints in the line shapes of the momentum distributions. We can describe these specific spectroscopic patterns quantitatively and compute their onset as a function of temperature and potential energy landscape. The delocalization patterns in the projected momentum distribution provide a promising computational tool to address the intriguing phenomenon of quantum delocalization in condensed matter and its spectroscopic characterization. As the momentum distribution n(k) is also accessible through Nuclear Compton Scattering experiments, our results will help to interpret and understand future measurements more thoroughly. Copyright © 2012 Wiley Periodicals, Inc.

  2. Fast and Accurate Simulation Technique for Large Irregular Arrays

    NASA Astrophysics Data System (ADS)

    Bui-Van, Ha; Abraham, Jens; Arts, Michel; Gueuning, Quentin; Raucy, Christopher; Gonzalez-Ovejero, David; de Lera Acedo, Eloy; Craeye, Christophe

    2018-04-01

    A fast full-wave simulation technique is presented for the analysis of large irregular planar arrays of identical 3-D metallic antennas. The solution method relies on the Macro Basis Functions (MBF) approach and an interpolatory technique to compute the interactions between MBFs. The Harmonic-polynomial (HARP) model is established for the near-field interactions in a modified system of coordinates. For extremely large arrays made of complex antennas, two approaches assuming a limited radius of influence for mutual coupling are considered: one is based on a sparse-matrix LU decomposition and the other one on a tessellation of the array in the form of overlapping sub-arrays. The computation of all embedded element patterns is sped up with the help of the non-uniform FFT algorithm. Extensive validations are shown for arrays of log-periodic antennas envisaged for the low-frequency SKA (Square Kilometer Array) radio-telescope. The analysis of SKA stations with such a large number of elements has not been treated yet in the literature. Validations include comparison with results obtained with commercial software and with experiments. The proposed method is particularly well suited to array synthesis, in which several orders of magnitude can be saved in terms of computation time.

  3. Comparative analysis of ventricular assist devices (POLVAD and POLVAD_EXT) based on multiscale FEM model.

    PubMed

    Milenin, Andrzej; Kopernik, Magdalena

    2011-01-01

    The prosthesis - a pulsatory ventricular assist device (VAD) - is made of polyurethane (PU) and biocompatible TiN deposited by the pulsed laser deposition (PLD) method. The paper discusses the numerical modelling and computer-aided design of such an artificial organ. Two types of VADs, POLVAD and POLVAD_EXT, are investigated. The main tasks and assumptions of the computer program developed are presented. The multiscale model of the VAD based on the finite element method (FEM) is introduced, and the analysis of the stress-strain state in macroscale for the blood chamber in both versions of the VAD is shown, as well as the verification of the results calculated by applying ABAQUS, a commercial FEM code. The FEM code developed is based on a new approach to the simulation of multilayer materials obtained by the PLD method. The model in microscale includes two components, i.e., a model of the initial (residual) stresses caused by the deposition process and a simulation of the active loadings observed in the blood chambers of POLVAD and POLVAD_EXT. The computed distributions of stresses and strains in macro- and microscales are helpful in precisely defining the regions of the blood chamber that can be identified as failure-source areas.

  4. Methods for improving simulations of biological systems: systemic computation and fractal proteins

    PubMed Central

    Bentley, Peter J.

    2009-01-01

    Modelling and simulation are becoming essential for new fields such as synthetic biology. Perhaps the most important aspect of modelling is to follow a clear design methodology that will help to highlight unwanted deficiencies. The use of tools designed to aid the modelling process can be of benefit in many situations. In this paper, the modelling approach called systemic computation (SC) is introduced. SC is an interaction-based language, which enables individual-based expression and modelling of biological systems, and the interactions between them. SC permits a precise description of a hypothetical mechanism to be written using an intuitive graph-based or a calculus-based notation. The same description can then be directly run as a simulation, merging the hypothetical mechanism and the simulation into the same entity. However, even when using well-designed modelling tools to produce good models, the best model is not always the most accurate one. Frequently, computational constraints or lack of data make it infeasible to model an aspect of biology. Simplification may provide one way forward, but with inevitable consequences of decreased accuracy. Instead of attempting to replace an element with a simpler approximation, it is sometimes possible to substitute the element with a different but functionally similar component. In the second part of this paper, this modelling approach is described and its advantages are summarized using an exemplar: the fractal protein model. Finally, the paper ends with a discussion of good biological modelling practice by presenting lessons learned from the use of SC and the fractal protein model. PMID:19324681

  5. A Multi-Agent Approach to the Simulation of Robotized Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Foit, K.; Gwiazda, A.; Banaś, W.

    2016-08-01

    The recent years of eventful industry development have brought many competing products addressed to the same market segment. Shortening the development cycle has become a necessity for a company that wants to remain competitive. With the switch to the Intelligent Manufacturing model, industry is searching for new scheduling algorithms, as the traditional ones no longer meet current requirements. The agent-based approach has been considered by many researchers as an important direction of evolution for modern manufacturing systems. Due to the properties of multi-agent systems, this methodology is very helpful during creation of a model of a production system, allowing both the processing and the informational parts to be depicted. The complexity of such an approach makes the analysis impossible without computer assistance. Computer simulation still uses a mathematical model to recreate a real situation, but nowadays 2D or 3D virtual environments, or even virtual reality, are used for realistic illustration of the considered systems. This paper focuses on robotized manufacturing systems and presents one possible approach to the simulation of such systems. The selection of the multi-agent approach is motivated by the flexibility of this solution, which offers modularity, robustness and autonomy.

  6. Estimating short-period dynamics using an extended Kalman filter

    NASA Technical Reports Server (NTRS)

    Bauer, Jeffrey E.; Andrisani, Dominick

    1990-01-01

    An extended Kalman filter (EKF) is used to estimate the parameters of a low-order model from aircraft transient response data. The low-order model is a state space model derived from the short-period approximation of the longitudinal aircraft dynamics. The model corresponds to the pitch rate to stick force transfer function currently used in flying qualities analysis. Because of the model chosen, handling qualities information is also obtained. The parameters are estimated from flight data as well as from a six-degree-of-freedom, nonlinear simulation of the aircraft. These two estimates are then compared and the discrepancies noted. The low-order model is able to satisfactorily match both flight data and simulation data from a high-order computer simulation. The parameters obtained from the EKF analysis of flight data are compared to those obtained using frequency response analysis of the flight data. Time delays and damping ratios are compared and are in agreement. This technique demonstrates the potential to determine, in near real time, the extent of differences between computer models and the actual aircraft. Precise knowledge of these differences can help to determine the flying qualities of a test aircraft and lead to more efficient envelope expansion.
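
    State augmentation is the standard way an EKF estimates model parameters of this kind. The Python sketch below applies it to a scalar pitch-rate model dq/dt = -a·q + b·Fs used as a stand-in for the short-period model: the unknown parameter a is appended to the state and estimated from noisy pitch-rate data. The dynamics, noise levels, and input signal are all invented for illustration; the paper's model, data, and tuning are not reproduced here.

    ```python
    # EKF parameter estimation by state augmentation on a scalar stand-in
    # model dq/dt = -a*q + b*Fs; the unknown a joins the state vector.
    import numpy as np

    rng = np.random.default_rng(2)
    dt, n, a_true, b = 0.02, 1500, 2.0, 4.0
    Fs = np.sin(0.5 * np.arange(n) * dt)            # made-up stick-force input

    q = np.zeros(n)                                 # simulate "flight data"
    for k in range(1, n):
        q[k] = q[k - 1] + dt * (-a_true * q[k - 1] + b * Fs[k - 1])
    z = q + 0.02 * rng.standard_normal(n)           # noisy pitch-rate measurements

    x = np.array([0.0, 1.0])                        # state [q, a]; initial guess a=1
    P = np.diag([1.0, 1.0])
    Q = np.diag([1e-6, 1e-6])                       # process noise (tuning knobs)
    R = 0.02 ** 2
    H = np.array([[1.0, 0.0]])                      # we measure q only
    for k in range(1, n):
        # Predict: propagate the augmented state; F is the local Jacobian.
        F = np.array([[1.0 - dt * x[1], -dt * x[0]],
                      [0.0,              1.0      ]])
        x = np.array([x[0] + dt * (-x[1] * x[0] + b * Fs[k - 1]), x[1]])
        P = F @ P @ F.T + Q
        # Update with the measurement at step k.
        S = H @ P @ H.T + R
        K = P @ H.T / S
        x = x + (K * (z[k] - x[0])).ravel()
        P = (np.eye(2) - K @ H) @ P

    print(f"estimated a = {x[1]:.2f} (true value {a_true})")
    ```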

  7. High accuracy mantle convection simulation through modern numerical methods - II: realistic models and problems

    NASA Astrophysics Data System (ADS)

    Heister, Timo; Dannberg, Juliane; Gassmöller, Rene; Bangerth, Wolfgang

    2017-08-01

    Computations have helped elucidate the dynamics of Earth's mantle for several decades already. The numerical methods that underlie these simulations have greatly evolved within this time span, and today include dynamically changing and adaptively refined meshes, sophisticated and efficient solvers, and parallelization to large clusters of computers. At the same time, many of the methods - discussed in detail in a previous paper in this series - were developed and tested primarily using model problems that lack many of the complexities that are common to the realistic models our community wants to solve today. With several years of experience solving complex and realistic models, we here revisit some of the algorithm designs of the earlier paper and discuss the incorporation of more complex physics. In particular, we re-consider time stepping and mesh refinement algorithms, evaluate approaches to incorporate compressibility, and discuss dealing with strongly varying material coefficients, latent heat, and how to track chemical compositions and heterogeneities. Taken together and implemented in a high-performance, massively parallel code, the techniques discussed in this paper then allow for high resolution, 3-D, compressible, global mantle convection simulations with phase transitions, strongly temperature dependent viscosity and realistic material properties based on mineral physics data.

  8. Fast Virtual Stenting with Active Contour Models in Intracranial Aneurysm

    PubMed Central

    Zhong, Jingru; Long, Yunling; Yan, Huagang; Meng, Qianqian; Zhao, Jing; Zhang, Ying; Yang, Xinjian; Li, Haiyun

    2016-01-01

    Intracranial stents are becoming an increasingly useful option in the treatment of intracranial aneurysms (IAs). Image-based simulation of the released stent configuration, together with computational fluid dynamics (CFD) simulation prior to intervention, will help surgeons optimize the intervention scheme. This paper proposes a fast virtual stenting method for IAs based on an active contour model (ACM), which is able to virtually release stents within any patient-specific vessel and aneurysm model built on real medical image data. In this method, an initial stent mesh was generated along the centerline of the parent artery without the need for registration between the stent contour and the vessel. Additionally, the diameter of the initial stent volumetric mesh was set to the maximum inscribed-sphere diameter of the parent artery to improve the stenting accuracy and save computational cost. Finally, a novel criterion for terminating virtual stent expansion, based on collision detection of axis-aligned bounding boxes, was applied, making the stent expansion free of edge effects. The experimental results of the virtual stenting and the corresponding CFD simulations exhibited the efficacy and accuracy of the ACM-based method, which are valuable for intervention scheme selection and therapy plan confirmation. PMID:26876026

  9. Simulation Data Management - Requirements and Design Specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clay, Robert L.; Friedman-Hill, Ernest J.; Gibson, Marcus J.

    Simulation Data Management (SDM), the ability to securely organize, archive, and share analysis models and the artifacts used to create them, is a fundamental requirement for modern engineering analysis based on computational simulation. We have worked separately to provide secure, networked SDM services to engineers and scientists at our respective laboratories for over a decade. We propose to leverage our experience and lessons learned to help develop and deploy a next-generation SDM service as part of a multi-laboratory team. This service will be portable across multiple sites and platforms, and will be accessible via a range of command-line tools and well-documented APIs. In this document, we’ll review our high-level and low-level requirements for such a system, review one existing system, and briefly discuss our proposed implementation.

  10. Structural Performance’s Optimally Analysing and Implementing Based on ANSYS Technology

    NASA Astrophysics Data System (ADS)

    Han, Na; Wang, Xuquan; Yue, Haifang; Sun, Jiandong; Wu, Yongchun

    2017-06-01

    Computer-aided Engineering (CAE) is a hotspot both in the academic field and in modern engineering practice. The Analysis System (ANSYS) simulation software has become an outstanding member of the CAE family for its excellent performance; it is committed to innovation in engineering simulation, helping users shorten the design process and improve product innovation and performance. Aiming to explore a structural-performance optimization model for engineering enterprises, this paper introduces CAE and its development, analyzes the necessity of structural optimization analysis as well as the framework of structural optimization analysis based on ANSYS technology, and uses ANSYS to carry out an optimization analysis of the structural performance of a reinforced concrete slab, displaying the displacement vector chart and the stress intensity chart. Finally, this paper compares the ANSYS simulation results with the measured results and concludes that ANSYS is an indispensable engineering calculation tool.

  11. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    PubMed

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever-increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute the various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
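
    Because an OMEX file is just a ZIP container with a manifest, a minimal archive can be assembled with the Python standard library alone, as sketched below. The file names and placeholder model content are invented, and the format URIs follow the identifiers.org convention used by the COMBINE specifications; verify them against the current specification before relying on this sketch.

    ```python
    # Assemble a minimal OMEX-style archive: a ZIP with a manifest listing
    # each file and its format. Names and model content are placeholders.
    import zipfile

    MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
    <omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
      <content location="." format="http://identifiers.org/combine.specifications/omex"/>
      <content location="./manifest.xml"
               format="http://identifiers.org/combine.specifications/omex-manifest"/>
      <content location="./model.xml"
               format="http://identifiers.org/combine.specifications/sbml"/>
    </omexManifest>
    """

    with zipfile.ZipFile("experiment.omex", "w") as archive:
        archive.writestr("manifest.xml", MANIFEST)
        archive.writestr("model.xml", "<sbml><!-- placeholder model --></sbml>")
    ```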

  12. Astronomy Aid

    NASA Technical Reports Server (NTRS)

    1995-01-01

    As a Jet Propulsion Laboratory astronomer, John D. Callahan developed a computer program called Multimission Interactive Planner (MIP) to help astronomers analyze scientific and optical data collected on the Voyager's Grand Tour. The commercial version of the program called XonVu is published by XonTech, Inc. Callahan has since developed two more advanced programs based on MIP technology, Grand Tour and Jovian Traveler, which simulate Voyager and Giotto missions. The software allows astronomers and space novices to view the objects seen by the spacecraft, manipulating perspective, distance and field of vision.

  13. ASP-G: an ASP-based method for finding attractors in genetic regulatory networks

    PubMed Central

    Mushthofa, Mushthofa; Torres, Gustavo; Van de Peer, Yves; Marchal, Kathleen; De Cock, Martine

    2014-01-01

    Motivation: Boolean network models are suitable for simulating GRNs in the absence of detailed kinetic information. However, reducing the biological reality implies making assumptions on how genes interact (interaction rules) and how their state is updated during the simulation (update scheme). The exact choice of the assumptions largely determines the outcome of the simulations. In most cases, however, the biologically correct assumptions are unknown. An ideal simulation thus implies testing different rules and schemes to determine those that best capture an observed biological phenomenon. This is not trivial because most current methods to simulate Boolean network models of GRNs and to compute their attractors impose specific assumptions that cannot be easily altered, as they are built into the system. Results: To allow for a more flexible simulation framework, we developed ASP-G. We show the correctness of ASP-G in simulating Boolean network models and obtaining attractors under different assumptions by successfully recapitulating the detection of attractors in previously published studies. We also provide an example of how performing simulations of network models under different settings helps determine the assumptions under which a certain conclusion holds. The main added value of ASP-G is in its modularity and declarativity, making it more flexible and less error-prone than traditional approaches. The declarative nature of ASP-G comes at the expense of being slower than more dedicated systems, but it still achieves good efficiency with respect to computational time. Availability and implementation: The source code of ASP-G is available at http://bioinformatics.intec.ugent.be/kmarchal/Supplementary_Information_Musthofa_2014/asp-g.zip. Contact: Kathleen.Marchal@UGent.be or Martine.DeCock@UGent.be Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25028722

  14. Enhanced conformational sampling using enveloping distribution sampling.

    PubMed

    Lin, Zhixiong; van Gunsteren, Wilfred F

    2013-10-14

    Lessening the problem of insufficient conformational sampling in biomolecular simulations remains a major challenge in computational biochemistry. In this article, an application of the method of enveloping distribution sampling (EDS) is proposed that addresses this challenge, and its sampling efficiency is demonstrated in simulations of a hexa-β-peptide whose conformational equilibrium encompasses two different helical folds, i.e., a right-handed 2.7(10∕12)-helix and a left-handed 3(14)-helix, separated by a high energy barrier. Standard MD simulations of this peptide using the GROMOS 53A6 force field did not reach convergence of the free enthalpy difference between the two helices even after 500 ns of simulation time. The use of soft-core non-bonded interactions in the centre of the peptide did enhance the number of transitions between the helices, but at the same time led to neglect of relevant helical configurations. In simulations of a two-state EDS reference Hamiltonian that envelops both the physical peptide and the soft-core peptide, sampling of the conformational space of the physical peptide ensures that physically relevant conformations can be visited, and sampling of the conformational space of the soft-core peptide helps to enhance the transitions between the two helices. The EDS simulations sampled many more transitions between the two helices and showed much faster convergence of the relative free enthalpy of the two helices compared with the standard MD simulations, with only a slightly larger computational effort to determine optimized EDS parameters. Combined with various methods to smoothen the potential energy surface, the proposed EDS application will be a powerful technique to enhance the sampling efficiency in biomolecular simulations.
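
    For readers unfamiliar with EDS, the reference potential that "envelops" two end states has a standard closed form (due to Christ and van Gunsteren): V_R = -(βs)⁻¹ ln[exp(-βs(V_A - E_A)) + exp(-βs(V_B - E_B))], where s is a smoothness parameter and E_A, E_B are energy offsets - exactly the "EDS parameters" the abstract says must be optimized. The Python sketch below evaluates this standard form with illustrative numbers; it is not the paper's GROMOS implementation.

    ```python
    # Two-state EDS reference potential (standard form; energies in kJ/mol).
    # The smoothness s and offsets E_A, E_B are the parameters to optimize.
    import numpy as np

    def eds_reference(V_A, V_B, beta, s=0.05, E_A=0.0, E_B=0.0):
        # logaddexp keeps the evaluation stable for large energy values.
        return -np.logaddexp(-beta * s * (V_A - E_A),
                             -beta * s * (V_B - E_B)) / (beta * s)

    beta = 1.0 / (0.008314 * 300.0)   # 1/(k_B T) at 300 K, in mol/kJ
    print(eds_reference(np.array([10.0]), np.array([50.0]), beta))  # lies below both
    ```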

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMahon, S.

    Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type-dependent patterns of induced energy depositions within the cell (physics) connect, via molecular, cellular and tissue reactions, to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches for estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or if the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents. Understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Funding: J. Schuemann, NCI/NIH grants; S. McMahon, European Commission FP7 (grant EC FP7 MC-IOF-623630).

  16. Science & Technology Review September/October 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bearinger, J P

    2008-07-21

    This issue has the following articles: (1) Answering Scientists' Most Audacious Questions--Commentary by Dona Crawford; (2) Testing the Accuracy of the Supernova Yardstick--High-resolution simulations are advancing understanding of Type Ia supernovae to help uncover the mysteries of dark energy; (3) Developing New Drugs and Personalized Medical Treatment--Accelerator mass spectrometry is emerging as an essential tool for assessing the effects of drugs in humans; (4) Triage in a Patch--A painless skin patch and accompanying detector can quickly indicate human exposure to biological pathogens, chemicals, explosives, or radiation; and (5) Smoothing Out Defects for Extreme Ultraviolet Lithography--A process for smoothing mask defects helps move extreme ultraviolet lithography one step closer to creating smaller, more powerful computer chips.

  17. Dimensional synthesis of a leg mechanism

    NASA Astrophysics Data System (ADS)

    Pop, F.; Lovasz, E.-Ch; Pop, C.; Dolga, V.

    2016-08-01

    The dimensional synthesis of an eight-bar leg mechanism is presented. The mathematical model for the synthesis is described, and the results obtained after computation are verified with the help of a 2D mechanism simulation in Matlab. This mechanism, inspired by the solution proposed by Theo Jansen, is integrated into the structure of a 2-DOF quadruped robot. With the help of the kinematic synthesis method described, we attempt to determine new dimensions for the mechanism based on a set of initial conditions. These are established by taking into account the movement of the end point of the leg mechanism, which comes into contact with the ground during walking. An optimization process based on the results obtained can be conducted further in order to find a better solution for the leg mechanism.

  18. Software for Secondary-School Learning About Robotics

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Smith, Stephanie L.; Truong, Dat; Hodgson, Terry R.

    2005-01-01

    The ROVer Ranch is an interactive computer program designed to help secondary-school students learn about space-program robotics and related basic scientific concepts by involving the students in simplified design and programming tasks that exercise skills in mathematics and science. The tasks involve building simulated robots and then observing how they behave. The program furnishes (1) programming tools that a student can use to assemble and program a simulated robot and (2) a virtual three-dimensional mission simulator for testing the robot. First, the ROVer Ranch presents fundamental information about robotics, mission goals, and facts about the mission environment. On the basis of this information, and using the aforementioned tools, the student builds a simulated robot to accomplish its mission by selecting parts from such subsystems as propulsion, navigation, and scientific tools. Once the robot is built, it is programmed and then placed in a three-dimensional simulated environment. Success or failure in the simulation depends on the planning and design of the robot. Data and results of the mission are available in a summary log once the mission is concluded.

  19. Low-Visibility Visual Simulation with Real Fog

    NASA Technical Reports Server (NTRS)

    Chase, Wendell D.

    1982-01-01

    An environmental fog simulation (EFS) attachment was developed to aid in the study of natural low-visibility visual cues and subsequently used to examine the realism effect upon the aircraft simulator visual scene. A review of the basic fog equations indicated that two major factors must be accounted for in the simulation of low visibility - one due to atmospheric attenuation and one due to veiling luminance. These factors are compared systematically by (1) comparing actual measurements to those computed from the fog equations, and (2) comparing runway-visual-range-related visual-scene contrast values with the calculated values. These values are also compared with the simulated equivalent equations and with contrast measurements obtained from a current electronic fog synthesizer to help identify areas in which improvements are needed. These differences in technique, the measured values, the features of both systems, a pilot opinion survey of the EFS fog, and improvements (by combining features of both systems) that are expected to significantly increase the potential as well as flexibility for producing a very high-fidelity, low-visibility visual simulation are discussed.
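
    The two factors identified here have simple first-order forms: object contrast decays exponentially with distance, with Koschmieder's law relating the extinction coefficient to visibility through the 2% contrast threshold (σ = 3.912/V), while veiling luminance scattered into the line of sight further divides the remaining contrast. The Python sketch below combines both effects; the simplified veiling treatment and all numbers are illustrative, not the paper's equations or measurements.

    ```python
    # Apparent contrast through fog: exponential attenuation (Koschmieder)
    # plus a simplified veiling-luminance term. Numbers are illustrative.
    import math

    def apparent_contrast(c0, distance_m, visibility_m, veiling_ratio=0.0):
        """c0: inherent contrast; veiling_ratio: veiling / background luminance."""
        sigma = 3.912 / visibility_m        # extinction from the 2% threshold
        attenuated = c0 * math.exp(-sigma * distance_m)
        return attenuated / (1.0 + veiling_ratio)

    # A runway marking viewed at 300 m in 400-m-visibility fog:
    print(apparent_contrast(0.9, 300.0, 400.0))                      # attenuation only
    print(apparent_contrast(0.9, 300.0, 400.0, veiling_ratio=0.5))   # with veiling
    ```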

  20. Low-visibility visual simulation with real fog

    NASA Technical Reports Server (NTRS)

    Chase, W. D.

    1981-01-01

    An environmental fog simulation (EFS) attachment was developed to aid in the study of natural low-visibility visual cues and subsequently used to examine the realism effect upon the aircraft simulator visual scene. A review of the basic fog equations indicated that two major factors must be accounted for in the simulation of low visibility - one due to atmospheric attenuation and one due to veiling luminance. These factors are compared systematically by (1) comparing actual measurements to those computed from the fog equations, and (2) comparing runway-visual-range-related visual-scene contrast values with the calculated values. These values are also compared with the simulated equivalent equations and with contrast measurements obtained from a current electronic fog synthesizer to help identify areas in which improvements are needed. These differences in technique, the measured values, the features of both systems, a pilot opinion survey of the EFS fog, and improvements (by combining features of both systems) that are expected to significantly increase the potential as well as flexibility for producing a very high-fidelity low-visibility visual simulation are discussed.

  1. Strategies for Large Scale Implementation of a Multiscale, Multiprocess Integrated Hydrologic Model

    NASA Astrophysics Data System (ADS)

    Kumar, M.; Duffy, C.

    2006-05-01

    Distributed models simulate hydrologic state variables in space and time while taking into account the heterogeneities in terrain, surface and subsurface properties, and meteorological forcings. The computational cost and complexity associated with these models increase with their tendency to accurately simulate the large number of interacting physical processes at fine spatio-temporal resolution in a large basin. A hydrologic model run on a coarse spatial discretization of the watershed with a limited number of physical processes needs less computation, but this negatively affects the accuracy of the model results and restricts physical realization of the problem. So it is imperative to have an integrated modeling strategy (a) which can be universally applied at various scales in order to study the tradeoffs between computational complexity (determined by spatio-temporal resolution), accuracy, and predictive uncertainty in relation to various approximations of physical processes; (b) which can be applied at adaptively different spatial scales in the same domain by taking into account the local heterogeneity of topography and hydrogeologic variables; and (c) which is flexible enough to incorporate different numbers and approximations of process equations depending on model purpose and computational constraints. An efficient implementation of this strategy becomes all the more important for the Great Salt Lake river basin, which is relatively large (~89,000 sq. km) and complex in terms of hydrologic and geomorphic conditions. Also, the types and the time scales of the hydrologic processes which are dominant in different parts of the basin are different. Part of the snowmelt runoff generated in the Uinta Mountains infiltrates and contributes as base flow to the Great Salt Lake over a time scale of decades to centuries. The adaptive strategy helps capture the steep topographic and climatic gradient along the Wasatch front. Here we present the aforesaid modeling strategy along with an associated hydrologic modeling framework which facilitates a seamless, computationally efficient and accurate integration of the process model with the data model. The flexibility of this framework leads to the implementation of multiscale, multiresolution, adaptive refinement/de-refinement and nested modeling simulations with the least computational burden. However, performing these simulations and the related calibration of these models over a large basin at higher spatio-temporal resolutions is computationally intensive and requires increasing computing power. With the advent of parallel processing architectures, high computing performance can be achieved by parallelization of the existing serial integrated-hydrologic-model code. This translates to running the same model simulation on a network of a large number of processors, thereby reducing the time needed to obtain a solution. The paper also discusses the implementation of the integrated model on parallel processors, along with the mapping of the problem onto a multi-processor environment, methods to incorporate coupling between hydrologic processes using interprocessor communication models, model data structures, and parallel numerical algorithms to obtain high performance.

  2. Real-time global MHD simulation of the solar wind interaction with the earth’s magnetosphere

    NASA Astrophysics Data System (ADS)

    Shimazu, H.; Kitamura, K.; Tanaka, T.; Fujita, S.; Nakamura, M. S.; Obara, T.

    2008-11-01

    We have developed a real-time global MHD (magnetohydrodynamics) simulation of the solar wind interaction with the earth’s magnetosphere. By adopting the real-time solar wind parameters and interplanetary magnetic field (IMF) observed routinely by the ACE (Advanced Composition Explorer) spacecraft, responses of the magnetosphere are calculated with an MHD code. The simulation runs routinely on the supercomputer system at the National Institute of Information and Communications Technology (NICT), Japan. Visualized images of the magnetic field lines around the earth, the pressure distribution on the meridian plane, and the conductivity of the polar ionosphere can be viewed on the web site (http://www2.nict.go.jp/y/y223/simulation/realtime/). The results show that various magnetospheric activities are generally reproduced qualitatively. They also give us information on how geomagnetic disturbances develop in the magnetosphere in relation to the ionosphere. From the viewpoint of space weather, the real-time simulation helps us to understand the overall picture of the current state of the magnetosphere. To evaluate the simulation results, we compare the AE indices derived from the simulation with observations. In general, simulation and observation agree well for quiet days and isolated substorm cases.

  3. Computer models for economic and silvicultural decisions

    Treesearch

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  4. Computer-based simulation training to improve learning outcomes in mannequin-based simulation exercises.

    PubMed

    Curtin, Lindsay B; Finn, Laura A; Czosnowski, Quinn A; Whitman, Craig B; Cawley, Michael J

    2011-08-10

    To assess the impact of computer-based simulation on the achievement of student learning outcomes during mannequin-based simulation. Participants were randomly assigned to rapid response teams of 5-6 students, and teams were then randomly assigned to complete either the computer-based or the mannequin-based simulation cases first. In both simulations, students used their critical thinking skills and selected interventions independent of facilitator input. A predetermined rubric was used to record and assess students' performance in the mannequin-based simulations. Feedback and student performance scores were generated by the software in the computer-based simulations. More of the teams that completed the computer-based simulation before the mannequin-based simulation achieved the primary outcome for the exercise, survival of the simulated patient (41.2% vs. 5.6%). The majority of students (>90%) recommended the continuation of simulation exercises in the course. Students in both groups felt the computer-based simulation should be completed prior to the mannequin-based simulation. The use of computer-based simulation prior to mannequin-based simulation improved the achievement of learning goals and outcomes. In addition to improving participants' skills, completing the computer-based simulation first may improve participants' confidence during the more realistic setting of the mannequin-based simulation.

  5. A distributed, dynamic, parallel computational model: the role of noise in velocity storage

    PubMed Central

    Merfeld, Daniel M.

    2012-01-01

    Networks of neurons perform complex calculations using distributed, parallel computation, including dynamic “real-time” calculations required for motion control. The brain must combine sensory signals to estimate the motion of body parts using imperfect information from noisy neurons. Models and experiments suggest that the brain sometimes optimally minimizes the influence of noise, although it remains unclear when and precisely how neurons perform such optimal computations. To investigate, we created a model of velocity storage based on a relatively new technique–“particle filtering”–that is both distributed and parallel. It extends existing observer and Kalman filter models of vestibular processing by simulating the observer model many times in parallel with noise added. During simulation, the variance of the particles defining the estimator state is used to compute the particle filter gain. We applied our model to estimate one-dimensional angular velocity during yaw rotation, which yielded estimates for the velocity storage time constant, afferent noise, and perceptual noise that matched experimental data. We also found that the velocity storage time constant was Bayesian optimal by comparing the estimate of our particle filter with the estimate of the Kalman filter, which is optimal. The particle filter demonstrated a reduced velocity storage time constant when afferent noise increased, which mimics what is known about aminoglycoside ablation of semicircular canal hair cells. This model helps bridge the gap between parallel distributed neural computation and systems-level behavioral responses like the vestibuloocular response and perception. PMID:22514288
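
    The propagate / weight / resample cycle that the abstract describes fits in a few lines. Below is a generic bootstrap particle filter estimating a constant yaw velocity from noisy measurements; the noise levels and the 60 deg/s step are illustrative, and this is not the authors' observer-based vestibular model. The spread of the particle cloud (its standard deviation) plays the role the abstract assigns to the variance used in the gain computation.

        import numpy as np

        rng = np.random.default_rng(0)
        n, meas_sd, proc_sd = 2000, 5.0, 0.5
        particles = rng.normal(0.0, 20.0, n)         # initial spread of hypotheses

        def pf_step(particles, z):
            # 1) propagate every particle with process noise (parallel model copies)
            particles = particles + rng.normal(0.0, proc_sd, n)
            # 2) weight each particle by the likelihood of the noisy measurement z
            w = np.exp(-0.5 * ((z - particles) / meas_sd) ** 2)
            w /= w.sum()
            # 3) resample so that likely hypotheses survive
            return particles[rng.choice(n, n, p=w)]

        true_w = 60.0                                # deg/s, hypothetical yaw step
        for _ in range(500):
            particles = pf_step(particles, true_w + rng.normal(0.0, meas_sd))
        print(particles.mean(), particles.std())     # posterior mean near 60 deg/s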

  6. Computational analysis of nonlinearities within dynamics of cable-based driving systems

    NASA Astrophysics Data System (ADS)

    Anghelache, G. D.; Nastac, S.

    2017-08-01

    This paper deals with the computational nonlinear dynamics of mechanical systems containing flexural parts within the actuating scheme, and especially with cable-based driving systems. Both functional nonlinearities and the real characteristic of the power supply were assumed, in order to obtain a realistic computer simulation model able to provide feasible results regarding the system dynamics. The transitory and steady-state regimes during a regular operating cycle were taken into account. The authors present a particular case of a lift system, taken as representative for the objective of this study. The simulations were based on values of the essential parameters acquired from experimental tests and/or regular practice in the field. The analysis of the results and the final discussion reveal the correlated dynamic aspects of the mechanical parts, the driving system, and the power supply, all of which are potential sources of particular resonances within some transitory phases of the working cycle that can affect structural and functional dynamics. In addition, the influence of the computational hypotheses on both the quantitative and qualitative behaviour of the system is underlined. The most significant outcome of this theoretical and computational research consists in the development of a unified and feasible model, useful for characterizing the nonlinear dynamic effects in systems with cable-based driving schemes and thereby helping to optimize the operating regime, including dynamics control measures.
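
    A minimal example of the kind of functional nonlinearity treated here is a hoisting cable that can pull but never push. The toy model below integrates a lift car driven through such a unilateral spring-damper by a drum paying out at constant speed; stiffness, damping, mass and speed values are illustrative assumptions, not the paper's measured parameters.

        import numpy as np
        from scipy.integrate import solve_ivp

        k, c, m, g, v_drum = 5.0e5, 2.0e3, 1000.0, 9.81, 0.5

        def rhs(t, y):
            x, v = y                                 # car position and velocity
            stretch = v_drum * t - x                 # drum payout minus car travel
            # slack cable carries no load: the unilateral nonlinearity
            tension = max(0.0, k * stretch + c * (v_drum - v))
            return [v, (tension - m * g) / m]

        sol = solve_ivp(rhs, (0.0, 5.0), [0.0, 0.0], max_step=1e-3)
        print(sol.y[0, -1])                          # car roughly follows the drum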

  7. Simulation and phases of macroscopic particles in vortex flow

    NASA Astrophysics Data System (ADS)

    Rice, Heath Eric

    Granular materials are an interesting class of media in that they exhibit many disparate characteristics depending on conditions. The same set of particles may behave like a solid, liquid, gas, something in-between, or something completely unique depending on the conditions. Practically speaking, granular materials are used in many aspects of manufacturing, therefore any new information gleaned about them may help refine these techniques. For example, learning of a possible instability may help avoid it in practical application, saving machinery, money, and even personnel. To that end, we intend to simulate a granular medium under tornado-like vortex airflow by varying particle parameters and observing the behaviors that arise. The simulation itself was written in Python from the ground up, starting from the basic simulation equations in Poschel [1]. From there, particle spin, viscous friction, and vertical and tangential airflow were added. The simulations were then run in batches on a local cluster computer, varying the parameters of radius, flow force, density, and friction. Phase plots were created after observing the behaviors of the simulations and the regions and borders were analyzed. Most of the results were as expected: smaller particles behaved more like a gas, larger particles behaved more like a solid, and most intermediate simulations behaved like a liquid. A small subset formed an interesting crossover region in the center, and under moderate forces began to throw a few particles at a time upward from the center in a fountain-like effect. Most borders between regions appeared to agree with analysis, following a parabolic critical rotational velocity at which the parabolic surface of the material dips to the bottom of the mass of particles. The fountain effects seemed to occur at speeds along and slightly faster than this division. [1] Please see thesis for references.
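
    The force law at the heart of such a simulation is compact. Below is a linear spring-dashpot normal contact force of the textbook kind found in Poschel's treatment; the stiffness and damping values are illustrative, and the thesis code may differ in detail.

        import numpy as np

        def contact_force(pos_i, pos_j, vel_i, vel_j, radius, k=1.0e4, gamma=5.0):
            # force on particle i from particle j (equal radii assumed)
            d = pos_j - pos_i
            dist = np.linalg.norm(d)
            overlap = 2.0 * radius - dist
            if overlap <= 0.0:
                return np.zeros_like(d)              # no contact, no force
            n = d / dist                             # unit normal from i to j
            v_n = np.dot(vel_j - vel_i, n)           # normal relative velocity
            # elastic repulsion plus velocity-dependent dissipation
            return -(k * overlap - gamma * v_n) * n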

  8. Simulation of Spiral Waves and Point Sources in Atrial Fibrillation with Application to Rotor Localization

    PubMed Central

    Ganesan, Prasanth; Shillieto, Kristina E.; Ghoraani, Behnaz

    2018-01-01

    Cardiac simulations play an important role in studies involving understanding and investigating the mechanisms of cardiac arrhythmias. Today, studies of arrhythmogenesis and maintenance are largely being performed by creating simulations of a particular arrhythmia with high accuracy comparable to the results of clinical experiments. Atrial fibrillation (AF), the most common arrhythmia in the United States and many other parts of the world, is one of the major field where simulation and modeling is largely used. AF simulations not only assist in understanding its mechanisms but also help to develop, evaluate and improve the computer algorithms used in electrophysiology (EP) systems for ablation therapies. In this paper, we begin with a brief overeview of some common techniques used in simulations to simulate two major AF mechanisms – spiral waves (or rotors) and point (or focal) sources. We particularly focus on 2D simulations using Nygren et al.’s mathematical model of human atrial cell. Then, we elucidate an application of the developed AF simulation to an algorithm designed for localizing AF rotors for improving current AF ablation therapies. Our simulation methods and results, along with the other discussions presented in this paper is aimed to provide engineers and professionals with a working-knowledge of application-specific simulations of spirals and foci. PMID:29629398
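
    To make the rotor set-up concrete without the full Nygren cell model, the sketch below uses the much simpler FitzHugh-Nagumo excitable medium on a 2D grid: a planar wave is launched and a conditioning block breaks it so the free end can curl into a spiral. Grid size, time step and parameters are illustrative assumptions and may need tuning.

        import numpy as np

        n, dt, dx = 200, 0.05, 1.0
        a, b, eps, D = 0.1, 0.5, 0.01, 1.0
        u = np.zeros((n, n)); v = np.zeros((n, n))
        u[:, :10] = 1.0                  # planar stimulus along the left edge
        v[:n // 2, :] = 0.5              # refractory block breaks the wave front

        def lap(f):                      # 5-point Laplacian, no-flux boundaries
            fp = np.pad(f, 1, mode='edge')
            return (fp[:-2, 1:-1] + fp[2:, 1:-1]
                    + fp[1:-1, :-2] + fp[1:-1, 2:] - 4.0 * f) / dx**2

        for step in range(4000):
            du = D * lap(u) + u * (1.0 - u) * (u - a) - v
            u, v = u + dt * du, v + dt * eps * (b * u - v)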

  9. 3-D Analysis of Flanged Joints Through Various Preload Methods Using ANSYS

    NASA Astrophysics Data System (ADS)

    Murugan, Jeyaraj Paul; Kurian, Thomas; Jayaprakash, Janardhan; Sreedharapanickar, Somanath

    2015-10-01

    Flanged joints are employed in aerospace solid rocket motor hardware for the integration of various systems or subsystems. Hence, the design of flanged joints is very important in ensuring the integrity of the motor during operation. As these joints are subjected to high loads due to the internal pressure acting inside the motor chamber, an appropriate preload is required to be applied to the joint before subjecting it to external load. Preload, also known as clamp load, is applied on the fastener and helps to hold the mating flanges together. Generally, preload is simulated as a thermal load, and the exact preload is obtained through a number of iterations; in fact, more iterations are required when the material nonlinearity of the bolt is considered. This way of simulation takes more computational time to generate the required preload. Nowadays most commercial software packages use pretension elements for simulating the preload. This element does not require iterations to induce the preload and can be solved in a single iteration. This approach takes less computational time, and thus one can easily study the characteristics of the joint by varying the preload. When the structure contains a larger number of joints with different sizes of fasteners, pretension elements are preferable to the thermal load approach for simulating each fastener size. This paper covers the details of analyses carried out simulating the preload through various options, viz. thermal load, the initial state command and the pretension element, using the ANSYS finite element package.

  10. Quantum simulator review

    NASA Astrophysics Data System (ADS)

    Bednar, Earl; Drager, Steven L.

    2007-04-01

    The objective of quantum information processing is to harness the paradigm shift offered by quantum computing to solve classically hard, computationally challenging problems. Some of our computationally challenging problems of interest include: rapid image processing, rapid optimization of logistics, protecting information, secure distributed simulation, and massively parallel computation. Currently, one important problem with quantum information processing is that quantum computers are difficult to realize due to poor scalability and high error rates. Therefore, we have supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators running on classical computers for the development and testing of new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator. It concludes with summary comments on both simulators.
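
    At their core, general-purpose simulators of this kind apply gate matrices to a state vector; engines such as Quantum eXpress and QuIDD Pro add scale, error modeling and specialized data structures on top of this idea. A minimal sketch, preparing a 2-qubit Bell state:

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate
        I2 = np.eye(2)
        CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                         [0, 0, 0, 1], [0, 0, 1, 0]])    # control on first qubit

        state = np.zeros(4); state[0] = 1.0              # start in |00>
        state = CNOT @ np.kron(H, I2) @ state            # H on qubit 0, then CNOT
        print(np.abs(state) ** 2)                        # [0.5, 0, 0, 0.5]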

  11. Continuum Approaches to Understanding Ion and Peptide Interactions with the Membrane

    PubMed Central

    Latorraca, Naomi R.; Callenberg, Keith M.; Boyle, Jon P.; Grabe, Michael

    2014-01-01

    Experimental and computational studies have shown that cellular membranes deform to stabilize the inclusion of transmembrane (TM) proteins harboring charge. Recent analysis suggests that membrane bending helps to expose charged and polar residues to the aqueous environment and polar head groups. We previously used elasticity theory to identify membrane distortions that minimize the insertion of charged TM peptides into the membrane. Here, we extend our work by showing that it also provides a novel, computationally efficient method for exploring the energetics of ion and small peptide penetration into membranes. First, we show that the continuum method accurately reproduces energy profiles and membrane shapes generated from molecular simulations of bare ion permeation at a fraction of the computational cost. Next, we demonstrate that the dependence of the ion insertion energy on the membrane thickness arises primarily from the elastic properties of the membrane. Moreover, the continuum model readily provides a free energy decomposition into components not easily determined from molecular dynamics. Finally, we show that the energetics of membrane deformation strongly depend on membrane patch size both for ions and peptides. This dependence is particularly strong for peptides based on simulations of a known amphipathic, membrane binding peptide from the human pathogen Toxoplasma gondii. In total, we address shortcomings and advantages that arise from using a variety of computational methods in distinct biological contexts. PMID:24652510

  12. Development of a radial ventricular assist device using numerical predictions and experimental haemolysis.

    PubMed

    Carswell, Dave; Hilton, Andy; Chan, Chris; McBride, Diane; Croft, Nick; Slone, Avril; Cross, Mark; Foster, Graham

    2013-08-01

    The objective of this study was to demonstrate the potential of Computational Fluid Dynamics (CFD) simulations in predicting the levels of haemolysis in ventricular assist devices (VADs). Three different prototypes of a radial flow VAD were examined experimentally and computationally using CFD modelling to assess device haemolysis. The flow field was computed using a CFD model developed with the commercial software Ansys CFX 13 and a set of custom haemolysis analysis tools. Experimental values for the Normalised Index of Haemolysis (NIH) were calculated as 0.020 g/100 L, 0.014 g/100 L and 0.0042 g/100 L for the three designs. Numerical analysis predicts an NIH of 0.021 g/100 L, 0.017 g/100 L and 0.0057 g/100 L, respectively. The actual differences between experimental and numerical results vary between 0.0012 and 0.003 g/100 L, a variation of 5% for Pump 1 and slightly larger percentage differences for the other pumps. The work detailed herein demonstrates how CFD simulation and, more importantly, the numerical prediction of haemolysis may be used as an effective tool to help the designers of VADs manage the flow paths within pumps, resulting in a less haemolytic device. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
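
    For readers unfamiliar with the metric, NIH normalises the rise in plasma free haemoglobin by flow and sampling time. The helper below follows the common ASTM-style convention; the paper does not spell out its exact formula, so treat this as an illustrative convention rather than the authors' computation.

        def normalised_index_of_haemolysis(d_free_hb, volume, hct, flow, minutes):
            # NIH in g/100 L: d_free_hb is the rise in plasma free haemoglobin
            # (g/L), volume the loop volume (L), hct the haematocrit (%),
            # flow the pump flow (L/min), minutes the sampling interval.
            return d_free_hb * volume * (100.0 - hct) / 100.0 * 100.0 / (flow * minutes)

        # illustrative numbers: a 0.08 g/L rise in a 1 L loop at 30% Hct,
        # sampled over 60 min at 5 L/min, gives roughly 0.019 g/100 L
        print(round(normalised_index_of_haemolysis(0.08, 1.0, 30.0, 5.0, 60.0), 4))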

  13. Neurosurgical tactile discrimination training with haptic-based virtual reality simulation.

    PubMed

    Patel, Achal; Koshy, Nick; Ortega-Barnett, Juan; Chan, Hoi C; Kuo, Yong-Fan; Luciano, Cristian; Rizzi, Silvio; Matulyauskas, Martin; Kania, Patrick; Banerjee, Pat; Gasco, Jaime

    2014-12-01

    To determine if a computer-based simulation with haptic technology can help surgical trainees improve tactile discrimination using surgical instruments. Twenty junior medical students participated in the study and were randomized into two groups. Subjects in Group A participated in virtual simulation training using the ImmersiveTouch simulator (ImmersiveTouch, Inc., Chicago, IL, USA) that required differentiating the firmness of virtual spheres using tactile and kinesthetic sensation via haptic technology. Subjects in Group B did not undergo any training. With their visual fields obscured, subjects in both groups were then evaluated on their ability to use the suction and bipolar instruments to find six elastothane objects with areas ranging from 1.5 to 3.5 cm2 embedded in a urethane foam brain cavity model while relying on tactile and kinesthetic sensation only. A total of 73.3% of the subjects in Group A (simulation training) were able to find the brain cavity objects in comparison to 53.3% of the subjects in Group B (no training) (P = 0.0183). There was a statistically significant difference in the total number of Group A subjects able to find smaller brain cavity objects (size ≤ 2.5 cm2) compared to that in Group B (72.5 vs. 40%, P = 0.0032). On the other hand, no significant difference in the number of subjects able to detect larger objects (size ≥ 3 cm2) was found between Groups A and B (75 vs. 80%, P = 0.7747). Virtual computer-based simulators with integrated haptic technology may improve tactile discrimination required for microsurgical technique.

  14. Density Functional Computations and Molecular Dynamics Simulations of the Triethylammonium Triflate Protic Ionic Liquid.

    PubMed

    Mora Cardozo, Juan F; Burankova, T; Embs, J P; Benedetto, A; Ballone, P

    2017-12-21

    Systematic molecular dynamics simulations based on an empirical force field have been carried out for samples of triethylammonium trifluoromethanesulfonate (triethylammonium triflate, [TEA][Tf]), covering a wide temperature range 200 K ≤ T ≤ 400 K and analyzing a broad set of properties, from self-diffusion and electrical conductivity to rotational relaxation and hydrogen-bond dynamics. The study is motivated by recent quasi-elastic neutron scattering and differential scanning calorimetry measurements on the same system, revealing two successive first-order transitions at T ≈ 230 and 310 K (on heating), as well as an intriguing and partly unexplained variety of subdiffusive motions of the acidic proton. Simulations show a weakly discontinuous transition at T = 310 K and highlight an anomaly at T = 260 K in the rotational relaxation of ions that we identify as the simulation analogue of the experimental transition at T ≈ 230 K. Thus, simulations help identify the nature of the experimental transitions, confirming that the highest-temperature one corresponds to melting, while the one taking place at lower T is a transition from the crystal, stable at T ≤ 260 K, to a plastic phase (260 K ≤ T ≤ 310 K), in which molecules are able to rotate without diffusing. Rotations, in particular, account for the subdiffusive motion seen at intermediate T both in the experiments and in the simulations. The structure, distribution, and strength of hydrogen bonds are investigated by molecular dynamics and by density functional computations. Clustering of ions of the same sign and the effect of contamination by water at 1% weight concentration are discussed as well.

  15. Concepts and algorithms for terminal-area traffic management

    NASA Technical Reports Server (NTRS)

    Erzberger, H.; Chapel, J. D.

    1984-01-01

    The nation's air-traffic-control system is the subject of an extensive modernization program, including the planned introduction of advanced automation techniques. This paper gives an overview of a concept for automating terminal-area traffic management. Four-dimensional (4D) guidance techniques, which play an essential role in the automated system, are reviewed. One technique, intended for on-board computer implementation, is based on application of optimal control theory. The second technique is a simplified approach to 4D guidance intended for ground computer implementation. It generates advisory messages to help the controller maintain scheduled landing times of aircraft not equipped with on-board 4D guidance systems. An operational system for the second technique, recently evaluated in a simulation, is also described.

  16. Hydrologic modeling strategy for the Islamic Republic of Mauritania, Africa

    USGS Publications Warehouse

    Friedel, Michael J.

    2008-01-01

    The government of Mauritania is interested in how to maintain hydrologic balance to ensure a long-term stable water supply for minerals-related, domestic, and other purposes. Because of the many complicating and competing natural and anthropogenic factors, hydrologists will perform quantitative analysis with specific objectives and relevant computer models in mind. Whereas various computer models are available for studying water-resource priorities, the success of these models in providing reliable predictions largely depends on the adequacy of the model-calibration process. Predictive analysis helps us evaluate the accuracy and uncertainty associated with the simulated dependent variables of our calibrated model. In this report, the hydrologic modeling process is reviewed and a strategy summarized for future Mauritanian hydrologic modeling studies.

  17. Evaluation of Content-Matched Range Monitoring Queries over Moving Objects in Mobile Computing Environments.

    PubMed

    Jung, HaRim; Song, MoonBae; Youn, Hee Yong; Kim, Ung Mo

    2015-09-18

    A content-matched (CM) range monitoring query over moving objects continually retrieves the moving objects (i) whose non-spatial attribute values match given non-spatial query values; and (ii) that are currently located within a given spatial query range. In this paper, we propose a new query indexing structure, called the group-aware query region tree (GQR-tree), for efficient evaluation of CM range monitoring queries. The primary role of the GQR-tree is to help the server leverage the computational capabilities of moving objects in order to improve the system performance in terms of the wireless communication cost and server workload. Through a series of comprehensive simulations, we verify the superiority of the GQR-tree method over the existing methods.
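
    Stripped of the indexing machinery, the predicate a CM range monitoring query evaluates per object is simple; the GQR-tree exists so the server does not have to test it against every object. The names below are illustrative, not the paper's API.

        from dataclasses import dataclass, field

        @dataclass
        class MovingObject:
            x: float
            y: float
            attrs: dict = field(default_factory=dict)

        def cm_range_match(obj, query_attrs, x0, y0, x1, y1):
            # (ii) spatial test: object currently inside the query rectangle
            in_range = x0 <= obj.x <= x1 and y0 <= obj.y <= y1
            # (i) content test: non-spatial attributes match the query values
            content_ok = all(obj.attrs.get(k) == v for k, v in query_attrs.items())
            return in_range and content_ok

        taxi = MovingObject(3.0, 4.0, {'type': 'taxi', 'free': True})
        print(cm_range_match(taxi, {'type': 'taxi'}, 0.0, 0.0, 10.0, 10.0))  # True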

  18. Color reproducibility and dyestuff concentration

    NASA Astrophysics Data System (ADS)

    Csanyi, Sandor

    2002-06-01

    The purpose of this study was to develop a new sensitivity index connected with color matching, which makes it possible to investigate the effects of dyestuff concentration deviations over a larger part of the color space in a comprehensive manner. With the help of computer simulation and experimental design, we examined the color differences resulting from minor concentration changes in approximately 500 formulas of different compositions, altering their total concentration and the proportion of the individual dyes in them. The new sensitivity index makes it possible for the colorist to select the recipe that is the least sensitive to concentration deviations from among the computer color formulas, as well as to add a new aspect to the ranking applied in color matching so far.

  19. Informatics and physics intersubject communications in the 7th and 8th grades of the basics level by means of computer modeling

    NASA Astrophysics Data System (ADS)

    Vasina, A. V.

    2017-01-01

    The author shares pedagogical experience in implementing intersubject connections among the school courses of informatics, technology, and physics through students' research activity, using specialized programs for developing and studying computer models of physical processes. The technique is based on the principles of independent student work and on the intersubject connections among technology, physics, and informatics; it helps to develop students' research activity and gives education a professional, practice-oriented direction. As an example, a lesson on modeling flotation using the "1C Physical simulator" environment is considered.

  20. A stellar audit: the computation of encounter rates for 47 Tucanae and omega Centauri

    NASA Astrophysics Data System (ADS)

    Davies, Melvyn B.; Benz, Willy

    1995-10-01

    Using King-Michie models, we compute encounter rates between the various stellar species in the globular clusters omega Cen and 47 Tuc. We also compute event rates for encounters between single stars and a population of primordial binaries. Using these rates, and what we have learnt from hydrodynamical simulations of encounters performed earlier, we compute the production rates of objects such as low-mass X-ray binaries (LMXBs), smothered neutron stars and blue stragglers (massive main-sequence stars). If 10 per cent of the stars are contained in primordial binaries, the production rate of interesting objects from encounters involving these binaries is as large as that from encounters between single stars. For example, encounters involving binaries produce a significant number of blue stragglers in both globular cluster models. The number of smothered neutron stars may exceed the number of LMXBs by a factor of 5-20, which may help to explain why millisecond pulsars are observed to outnumber LMXBs in globular clusters.
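
    The basic ingredient of such an audit is the rate n·Sigma·v with a gravitationally focused cross-section. The helper below sketches that standard formula in cgs units; the example numbers are illustrative stand-ins, not the King-Michie cluster models of the paper.

        import math

        G = 6.674e-8                                 # cgs gravitational constant

        def encounter_rate(n, v_inf, r_min, m_total):
            # rate (s^-1) for one star to pass within r_min (cm) of field stars
            # of number density n (cm^-3) at relative speed v_inf (cm/s), with
            # gravitational focusing for total encounter mass m_total (g)
            sigma = math.pi * r_min**2 * (1.0 + 2.0 * G * m_total / (r_min * v_inf**2))
            return n * sigma * v_inf

        msun, rsun, pc = 1.989e33, 6.957e10, 3.086e18
        r = encounter_rate(1.0e5 / pc**3, 1.0e6, 5.0 * rsun, 2.2 * msun)
        print(1.0 / (r * 3.15e7), 'yr between such encounters for one star')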

  1. Statistical methods and computing for big data.

    PubMed

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing; Yan, Jun

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with focuses on the open source R and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay.
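
    The online-updating idea for stream data can be illustrated with the same example the article closes with, logistic regression: coefficients are refreshed block by block so no chunk of the stream needs to be retained. This stochastic-gradient sketch only illustrates the flavor of block-wise updating; it is not the criterion-based updating estimator the article develops.

        import numpy as np

        rng = np.random.default_rng(1)
        beta_true = np.array([0.5, -1.0, 2.0])       # generating coefficients
        beta, lr = np.zeros(3), 0.05

        for block in range(200):                     # 200 chunks arriving in a stream
            X = rng.normal(size=(50, 3))
            p_true = 1.0 / (1.0 + np.exp(-(X @ beta_true)))
            y = (rng.random(50) < p_true).astype(float)
            p = 1.0 / (1.0 + np.exp(-(X @ beta)))
            beta += lr * X.T @ (y - p) / len(y)      # update from this block only
        print(beta)                                   # drifts toward beta_true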

  2. Knowledge acquisition, semantic text mining, and security risks in health and biomedical informatics

    PubMed Central

    Huang, Jingshan; Dou, Dejing; Dang, Jiangbo; Pardue, J Harold; Qin, Xiao; Huan, Jun; Gerthoffer, William T; Tan, Ming

    2012-01-01

    Computational techniques have been adopted in medical and biological systems for a long time. There is no doubt that the development and application of computational methods will render great help in better understanding biomedical and biological functions. Large amounts of datasets have been produced by biomedical and biological experiments and simulations. In order for researchers to gain knowledge from original data, nontrivial transformation is necessary, which is regarded as a critical link in the chain of knowledge acquisition, sharing, and reuse. Challenges that have been encountered include: how to efficiently and effectively represent human knowledge in formal computing models, how to take advantage of semantic text mining techniques rather than traditional syntactic text mining, and how to handle security issues during the knowledge sharing and reuse. This paper summarizes the state-of-the-art in these research directions. We aim to provide readers with an introduction of major computing themes to be applied to the medical and biological research. PMID:22371823

  3. NON-EQUILIBRIUM HELIUM IONIZATION IN AN MHD SIMULATION OF THE SOLAR ATMOSPHERE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golding, Thomas Peter; Carlsson, Mats; Leenaarts, Jorrit, E-mail: thomas.golding@astro.uio.no, E-mail: mats.carlsson@astro.uio.no, E-mail: jorrit.leenaarts@astro.su.se

    The ionization state of the gas in the dynamic solar chromosphere can depart strongly from the instantaneous statistical equilibrium commonly assumed in numerical modeling. We improve on earlier simulations of the solar atmosphere that only included non-equilibrium hydrogen ionization by performing a 2D radiation-magnetohydrodynamics simulation featuring non-equilibrium ionization of both hydrogen and helium. The simulation includes the effect of hydrogen Lyα and the EUV radiation from the corona on the ionization and heating of the atmosphere. Details on code implementation are given. We obtain helium ion fractions that are far from their equilibrium values. Comparison with models with local thermodynamic equilibrium (LTE) ionization shows that non-equilibrium helium ionization leads to higher temperatures in wavefronts and lower temperatures in the gas between shocks. Assuming LTE ionization results in a thermostat-like behavior with matter accumulating around the temperatures where the LTE ionization fractions change rapidly. Comparison of DEM curves computed from our models shows that non-equilibrium ionization leads to more radiating material in the temperature range 11–18 kK, compared to models with LTE helium ionization. We conclude that non-equilibrium helium ionization is important for the dynamics and thermal structure of the upper chromosphere and transition region. It might also help resolve the problem that intensities of chromospheric lines computed from current models are smaller than those observed.

  4. Consumer preference for seeds and seedlings of rare species impacts tree diversity at multiple scales.

    PubMed

    Young, Hillary S; McCauley, Douglas J; Guevara, Roger; Dirzo, Rodolfo

    2013-07-01

    Positive density-dependent seed and seedling predation, where herbivores selectively eat seeds or seedlings of common species, is thought to play a major role in creating and maintaining plant community diversity. However, many herbivores and seed predators are known to exhibit preferences for rare foods, which could lead to negative density-dependent predation. In this study, we first demonstrate the occurrence of increased predation of locally rare tree species by a widespread group of insular seed and seedling predators, land crabs. We then build computer simulations based on these empirical data to examine the effects of such predation on diversity patterns. Simulations show that herbivore preferences for locally rare species are likely to drive scale-dependent effects on plant community diversity: at small scales these foraging patterns decrease plant community diversity via the selective consumption of rare plant species, while at the landscape level they should increase diversity, at least for short periods, by promoting clustered local dominance of a variety of species. Finally, we compared observed patterns of plant diversity at the site to those obtained via computer simulations, and found that diversity patterns generated under simulations were highly consistent with observed diversity patterns. We posit that preference for rare species by herbivores may be prevalent in low- or moderate-diversity systems, and that these effects may help explain diversity patterns across different spatial scales in such ecosystems.

  5. New tendencies in wildland fire simulation for understanding fire phenomena: An overview of the WFDS system capabilities in Mediterranean ecosystems

    NASA Astrophysics Data System (ADS)

    Pastor, E.; Tarragó, D.; Planas, E.

    2012-04-01

    Wildfire theoretical modeling endeavors to predict fire behavior characteristics, such as the rate of spread, the flame geometry and the energy released by the fire front, by applying the laws of physics and chemistry that govern fire phenomena. Its ultimate aim is to help fire managers improve fire prevention and suppression, and hence to reduce damage to the population and protect ecosystems. WFDS is a 3D computational fluid dynamics (CFD) model of a fire-driven flow. It is particularly appropriate for predicting fire behaviour at the wildland-urban interface, since it can handle the intermix of vegetative and structural fuels that comprises this interface. The model is not yet suitable for operational fire management due to computational cost constraints, but given that it is open-source and provides a detailed description of the fuels and of the combustion and heat transfer mechanisms, it is currently a suitable system for research purposes. In this paper we present the most important characteristics of the WFDS simulation tool in terms of the models implemented, the input information required, and the outputs the simulator provides that are useful for understanding fire phenomena. We briefly discuss its advantages and opportunities through some simulation exercises for Mediterranean ecosystems.

  6. VIBA-Lab 3.0: Computer program for simulation and semi-quantitative analysis of PIXE and RBS spectra and 2D elemental maps

    NASA Astrophysics Data System (ADS)

    Orlić, Ivica; Mekterović, Darko; Mekterović, Igor; Ivošević, Tatjana

    2015-11-01

    VIBA-Lab is a computer program originally developed by the author and co-workers at the National University of Singapore (NUS) as an interactive software package for the simulation of Particle Induced X-ray Emission (PIXE) and Rutherford Backscattering (RBS) spectra. The original program has been redeveloped into VIBA-Lab 3.0, in which the user can perform semi-quantitative analysis by comparing simulated and measured spectra, as well as simulate 2D elemental maps for a given 3D sample composition. The latest version has a new and more versatile user interface. It also has the latest data set of fundamental parameters, such as Coster-Kronig transition rates, fluorescence yields, mass absorption coefficients and ionization cross sections for K and L lines, over a wider energy range than the original program. Our short-term plan is to introduce a routine for quantitative analysis for multiple PIXE and XRF excitations. VIBA-Lab is an excellent teaching tool for students and researchers using PIXE and RBS techniques. At the same time the program helps when planning an experiment and when optimizing experimental parameters such as incident ions, their energy, detector specifications, filters, geometry, etc. By "running" a virtual experiment the user can test various scenarios until the optimal PIXE and BS spectra are obtained, saving a lot of expensive machine time.

  7. Communication: From close-packed to topologically close-packed: Formation of Laves phases in moderately polydisperse hard-sphere mixtures

    NASA Astrophysics Data System (ADS)

    Lindquist, Beth A.; Jadrich, Ryan B.; Truskett, Thomas M.

    2018-05-01

    Particle size polydispersity can help to inhibit crystallization of the hard-sphere fluid into close-packed structures at high packing fractions and thus is often employed to create model glass-forming systems. Nonetheless, it is known that hard-sphere mixtures with modest polydispersity still have ordered ground states. Here, we demonstrate by computer simulation that hard-sphere mixtures with increased polydispersity fractionate on the basis of particle size and a bimodal subpopulation favors the formation of topologically close-packed C14 and C15 Laves phases in coexistence with a disordered phase. The generality of this result is supported by simulations of hard-sphere mixtures with particle-size distributions of four different forms.

  8. Free energy calculation of permeant-membrane interactions using molecular dynamics simulations.

    PubMed

    Elvati, Paolo; Violi, Angela

    2012-01-01

    Nanotoxicology, the science concerned with the safe use of nanotechnology and nanostructure design for biological applications, is a field of research that has recently received great attention, as a result of the rapid growth in nanotechnology. Many nanostructures are of a scale and chemical composition similar to many biomolecular environments, and recent papers have reported evident toxicity of selected nanoparticles. Molecular simulations can help develop a mechanistic understanding of how structural properties affect bioactivity. In this chapter, we describe how to compute the free energy of interactions between cellular membranes and benzene, the main constituent of some toxic carbonaceous particles, with well-tempered metadynamics. This algorithm reconstructs the free energy surface and accelerates rare events in a coarse-grained representation of the system.

  9. Exploring the role of 3-dimensional simulation in surgical training: feedback from a pilot study.

    PubMed

    Podolsky, Dale J; Martin, Allan R; Whyne, Cari M; Massicotte, Eric M; Hardisty, Michael R; Ginsberg, Howard J

    2010-12-01

    Randomized controlled study assessing the efficacy of a pedicle screw insertion simulator. To evaluate the efficacy of an in-house developed 3-dimensional software simulation tool for teaching pedicle screw insertion, to gather feedback about the utility of the simulator, and to help identify the context and role such simulation has in surgical education. Traditional instruction for pedicle screw insertion technique consists of didactic teaching and limited hands-on training on artificial or cadaveric models before guided supervision within the operating room. Three-dimensional computer simulation can provide a valuable tool for practicing challenging surgical procedures; however, its potential lies in its effective integration into student learning. Surgical residents were recruited from 2 sequential years of a spine surgery course. The study and control groups both received standard training on pedicle screw insertion. The study group received an additional 1-hour session of training on the simulator using a CT-based 3-dimensional model of their assigned cadaver's spine. Qualitative feedback about the simulator was gathered from the trainees, fellows, and staff surgeons, and all pedicle screws physically inserted into the cadavers during the courses were evaluated using CT. A total of 185 thoracic and lumbar pedicle screws were inserted by 37 trainees. Eighty-two percent of the 28 trainees who responded to the questionnaire, and all fellows and staff surgeons, felt the simulator to be a beneficial educational tool. However, the 1-hour training session did not yield improved performance in screw placement. A 3-dimensional computer-based simulation for pedicle screw insertion was integrated into a cadaveric spine surgery instructional course. Overall, the tool was positively regarded by the trainees, fellows, and staff surgeons. However, the limited training with the simulator did not translate into widespread comfort with its operation or into improvement in physical screw placement.

  10. LDPC decoder with a limited-precision FPGA-based floating-point multiplication coprocessor

    NASA Astrophysics Data System (ADS)

    Moberly, Raymond; O'Sullivan, Michael; Waheed, Khurram

    2007-09-01

    Implementing the sum-product algorithm in an FPGA with an embedded processor invites us to consider a tradeoff between computational precision and computational speed. The algorithm, known outside of the signal processing community as Pearl's belief propagation, is used for iterative soft-decision decoding of LDPC codes. We determined the feasibility of a coprocessor that will perform product computations. Our FPGA-based coprocessor design performs computer algebra with significantly less precision than the standard (e.g. integer, floating-point) operations of general purpose processors. Using synthesis targeting a 3,168-LUT Xilinx FPGA, we show that key components of a decoder are feasible and that the full single-precision decoder could be constructed using a larger part. Soft-decision decoding by the iterative belief propagation algorithm is impacted both positively and negatively by a reduction in the precision of the computation. Reducing precision reduces the coding gain, but the limited-precision computation can operate faster. The proposed solution offers custom logic to perform computations with less precision, yet uses the floating-point format to interface with the software. Simulation results show the achievable coding gain. Synthesis results help theorize the full capacity and performance of an FPGA-based coprocessor.
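
    The product computations in question are the check-node updates of the sum-product algorithm. In the log-likelihood-ratio domain they take the tanh form below, shown here at full precision; the paper's subject is how this computation degrades, and speeds up, when carried out with fewer bits.

        import numpy as np

        def check_node_update(llrs):
            # extrinsic LLR out of a check node: for each edge i, the product
            # of tanh(L/2) over all *other* incoming edges
            t = np.tanh(np.asarray(llrs, dtype=float) / 2.0)
            out = np.empty_like(t)
            for i in range(t.size):
                prod = np.prod(np.delete(t, i))
                out[i] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
            return out

        print(check_node_update([1.2, -0.4, 2.0]))   # toy 3-edge check node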

  11. Modeling the Cerebellar Microcircuit: New Strategies for a Long-Standing Issue.

    PubMed

    D'Angelo, Egidio; Antonietti, Alberto; Casali, Stefano; Casellato, Claudia; Garrido, Jesus A; Luque, Niceto Rafael; Mapelli, Lisa; Masoli, Stefano; Pedrocchi, Alessandra; Prestori, Francesca; Rizza, Martina Francesca; Ros, Eduardo

    2016-01-01

    The cerebellar microcircuit has been the workbench for theoretical and computational modeling since the beginning of neuroscientific research. The regular neural architecture of the cerebellum inspired different solutions to the long-standing issue of how its circuitry could control motor learning and coordination. Originally, the cerebellar network was modeled using a statistical-topological approach that was later extended by considering the geometrical organization of local microcircuits. However, with the advancement in anatomical and physiological investigations, new discoveries have revealed an unexpected richness of connections, neuronal dynamics and plasticity, calling for a change in modeling strategies, so as to include the multitude of elementary aspects of the network into an integrated and easily updatable computational framework. Recently, biophysically accurate "realistic" models using a bottom-up strategy accounted for both detailed connectivity and neuronal non-linear membrane dynamics. In this perspective review, we will consider the state of the art and discuss how these initial efforts could be further improved. Moreover, we will consider how embodied neurorobotic models including spiking cerebellar networks could help explain the role and interplay of distributed forms of plasticity. We envisage that realistic modeling, combined with closed-loop simulations, will help to capture the essence of cerebellar computations and could eventually be applied to neurological diseases and neurorobotic control systems.

  12. Application of transient CFD-procedures for S-shape computation in pump-turbines with and without FSI

    NASA Astrophysics Data System (ADS)

    Casartelli, E.; Mangani, L.; Ryan, O.; Schmid, A.

    2016-11-01

    CFD entered the product development process for hydraulic machines more than three decades ago. Besides the actual design process, in which the most appropriate geometry for a certain task is sought iteratively, several steady-state simulations and related analyses are performed with the help of CFD. Basic transient CFD analysis is becoming more and more routine for rotor-stator interaction assessment, but in general unsteady CFD is still not standard due to the large computational effort. Especially for FSI simulations, where mesh motion is involved, a considerable amount of computational time is needed for mesh handling and deformation as well as for resolving the related unsteady flow field. Therefore, this kind of CFD computation is still unusual and mostly performed during trouble-shooting analysis rather than in the standard development process, i.e. in order to understand what went wrong instead of preventing failure or, even better, increasing the available knowledge. In this paper, the application of an efficient and particularly robust algorithm for fast computations with moving meshes is presented for the analysis of transient effects encountered during highly dynamic procedures in the operation of a pump-turbine, such as runaway at fixed GV position and load rejection with GV motion imposed as one-way FSI. In both cases the computations extend through the S-shape of the machine into the turbine-brake and reverse-pump domain, showing that such exotic computations can be performed on a more regular basis, even if they remain quite time consuming. Besides the presentation of the procedure and global results, some highlights of the flow physics encountered are also given.

  13. Best Practices for Crash Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Jackson, Karen E.

    2002-01-01

    Aviation safety can be greatly enhanced by the expeditious use of computer simulations of crash impact. Unlike automotive impact testing, which is now routine, experimental crash tests of even small aircraft are expensive and complex due to the high cost of the aircraft and the myriad of crash impact conditions that must be considered. Ultimately, the goal is to utilize full-scale crash simulations of aircraft for design evaluation and certification. The objective of this publication is to describe "best practices" for modeling aircraft impact using explicit nonlinear dynamic finite element codes such as LS-DYNA, DYNA3D, and MSC.Dytran. Although "best practices" is somewhat relative, it is hoped that the authors' experience will help others to avoid some of the common pitfalls in modeling that are not documented in one single publication. In addition, a discussion of experimental data analysis, digital filtering, and test-analysis correlation is provided. Finally, some examples of aircraft crash simulations are described in several appendices following the main report.
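
    On the data-analysis side, the digital filtering step mentioned above usually means zero-phase low-pass filtering of accelerometer traces before test-analysis correlation. The sketch below uses a generic Butterworth filter as a stand-in; sample rate, cutoff and the synthetic pulse are illustrative, and the report's specific SAE channel-class filters may differ.

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs, fc = 10000.0, 100.0                      # sample rate and cutoff (Hz)
        b, a = butter(4, fc / (fs / 2.0))            # 4th-order low-pass design
        t = np.arange(0.0, 0.2, 1.0 / fs)
        pulse = 50.0 * np.exp(-((t - 0.05) / 0.01) ** 2)   # synthetic crash pulse (g)
        raw = pulse + np.random.default_rng(3).normal(0.0, 5.0, t.size)
        smooth = filtfilt(b, a, raw)                 # forward-backward: no phase lag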

  14. Simulation of thermal transpiration flow using a high-order moment method

    NASA Astrophysics Data System (ADS)

    Sheng, Qiang; Tang, Gui-Hua; Gu, Xiao-Jun; Emerson, David R.; Zhang, Yong-Hao

    2014-04-01

    Nonequilibrium thermal transpiration flow is numerically analyzed by an extended thermodynamic approach, a high-order moment method. The captured velocity profiles of temperature-driven flow in a parallel microchannel and in a micro-chamber are compared with available kinetic data or direct simulation Monte Carlo (DSMC) results. The high-order moment method combines greater accuracy than the Navier-Stokes-Fourier (NSF) equations with a lower computational cost than the DSMC method. In addition, the high-order moment method is employed to simulate thermal transpiration flow in the complex geometries of two types of Knudsen pumps. One is based on micro-machined channels, where the effect of different wall temperature distributions on thermal transpiration flow is studied. The other relies on porous structures, where the variation of flow rate with changing porosity or pore surface area ratio is investigated. These simulations can help to optimize the design of a real Knudsen pump.

  15. Using a Radiofrequency Identification System for Improving the Patient Discharge Process: A Simulation Study.

    PubMed

    Shim, Sung J; Kumar, Arun; Jiao, Roger

    2016-01-01

    A hospital is considering deploying a radiofrequency identification (RFID) system and setting up a new "discharge lounge" to improve the patient discharge process. This study uses computer simulation to model and compare the current process and the new process, and it assesses the impact of the RFID system and the discharge lounge on the process in terms of resource utilization and time taken in the process. The simulation results regarding resource utilization suggest that the RFID system can slightly relieve the burden on all resources, whereas the RFID system and the discharge lounge together can significantly mitigate the nurses' tasks. The simulation results in terms of the time taken demonstrate that the RFID system can shorten patient wait times, staff busy times, and bed occupation times. The results of the study could prove helpful to others who are considering the use of an RFID system in the patient discharge process in hospitals or similar processes.
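
    Process models of this kind are natural discrete-event simulations. A skeletal version using the simpy library (assumed available): patients queue for one of two nurses to finish discharge tasks. Arrival spacing, service time and resource capacity are illustrative, not the hospital's data.

        import simpy

        def patient(env, nurse, waits):
            arrived = env.now
            with nurse.request() as req:
                yield req                        # wait for a free nurse
                waits.append(env.now - arrived)
                yield env.timeout(20)            # minutes of discharge tasks

        def arrivals(env, nurse, waits):
            for _ in range(12):
                env.process(patient(env, nurse, waits))
                yield env.timeout(10)            # a new discharge every 10 min

        env = simpy.Environment()
        nurse = simpy.Resource(env, capacity=2)
        waits = []
        env.process(arrivals(env, nurse, waits))
        env.run()
        print(sum(waits) / len(waits), 'min average wait')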

  16. An analysis of intergroup rivalry using Ising model and reinforcement learning

    NASA Astrophysics Data System (ADS)

    Zhao, Feng-Fei; Qin, Zheng; Shao, Zhuo

    2014-01-01

    Modeling of intergroup rivalry can help us better understand economic competitions, political elections and other similar activities. The result of intergroup rivalry depends on the co-evolution of individual behavior within one group and the impact from the rival group. In this paper, we model the rivalry behavior using the Ising model. Unlike other simulation studies using the Ising model, the evolution rules of each individual in our model are not static but have the ability to learn from historical experience using a reinforcement learning technique, which makes the simulation closer to real human behavior. We studied the phase transition in intergroup rivalry and focused on the impact of the degree of social freedom, the personality of group members and the social experience of individuals. The results of computer simulation show that a society with a low degree of social freedom and highly educated, experienced individuals is more likely to be one-sided in intergroup rivalry.
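
    For reference, the baseline that the learned rules replace is plain Metropolis dynamics on a 2-D Ising lattice, sketched below. Reading the inverse temperature as the degree of social freedom and the external field as pressure from the rival group is our illustrative gloss on the abstract, not the paper's exact formulation.

        import numpy as np

        rng = np.random.default_rng(2)
        n, beta, h = 64, 0.6, 0.05           # lattice size, inverse temperature, field
        s = rng.choice([-1, 1], size=(n, n))

        for _ in range(200 * n * n):         # single-spin Metropolis updates
            i, j = rng.integers(n, size=2)
            nb = s[(i+1) % n, j] + s[(i-1) % n, j] + s[i, (j+1) % n] + s[i, (j-1) % n]
            dE = 2.0 * s[i, j] * (nb + h)    # energy cost of flipping s[i, j]
            if dE <= 0.0 or rng.random() < np.exp(-beta * dE):
                s[i, j] *= -1
        print(s.mean())                       # net "opinion" of the group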

  17. Simulation of ozone production in a complex circulation region using nested grids

    NASA Astrophysics Data System (ADS)

    Taghavi, M.; Cautenet, S.; Foret, G.

    2003-07-01

    During the ESCOMPTE precampaign (15 June to 10 July 2000), three days of intensive pollution (IOP0) were observed and simulated. The comprehensive RAMS model, version 4.3, coupled online with a chemical module including 29 species, was used to follow the chemistry of the polluted zone over southern France. This online method can be used because the code is parallelized and the SGI 3800 computer is very powerful. Two runs were performed: run 1 with one grid and run 2 with two nested grids. The redistribution of simulated chemical species (ozone, carbon monoxide, sulphur dioxide and nitrogen oxides) was compared with aircraft and surface station measurements. The 2-grid run gave substantially better results than the one-grid run because only the former takes the outer pollutants into account. This online method helps to explain the dynamics and to retrieve the chemical species redistribution with good agreement.

  18. Simulation of ozone production in a complex circulation region using nested grids

    NASA Astrophysics Data System (ADS)

    Taghavi, M.; Cautenet, S.; Foret, G.

    2004-06-01

    During the ESCOMPTE precampaign (summer 2000, over Southern France), a 3-day period of intensive observation (IOP0), associated with ozone peaks, has been simulated. The comprehensive RAMS model, version 4.3, coupled on-line with a chemical module including 29 species, is used to follow the chemistry of the polluted zone. This efficient but time consuming method can be used because the code is installed on a parallel computer, the SGI 3800. Two runs are performed: run 1 with a single grid and run 2 with two nested grids. The simulated fields of ozone, carbon monoxide, nitrogen oxides and sulfur dioxide are compared with aircraft and surface station measurements. The 2-grid run looks substantially better than the run with one grid because the former takes the outer pollutants into account. This on-line method helps to satisfactorily retrieve the chemical species redistribution and to explain the impact of dynamics on this redistribution.

  1. Simulation of a conductive shield plate for the focalization of transcranial magnetic stimulation in the rat.

    PubMed

    Gasca, Fernando; Richter, Lars; Schweikard, Achim

    2010-01-01

    Transcranial Magnetic Stimulation (TMS) in the rat is a powerful tool for investigating brain function. However, state-of-the-art experiments are considerably limited because the stimulation usually affects undesired anatomical structures. A simulation of a conductive shield plate placed between the coil stimulator and the rat brain during TMS is presented. The Finite Element (FE) method is used to obtain the 3D electric field distribution on a four-layer rat head model. The simulations show that a shield plate with a circular window can improve the focalization of stimulation, as quantified by computing the three-dimensional half power region (HPR). Focalization with the shield plate showed a clear trade-off with attenuation of the induced field. The results suggest that the shield plate can serve as a helpful tool for conducting TMS rat experiments on specific targets.

  2. X-Ray Detector Simulations - Oral Presentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tina, Adrienne

    2015-08-20

    The free-electron laser at LCLS produces X-rays that are used in several facilities. This light source is so bright and fast that we are capable of producing movies of objects like proteins. But making these movies would not be possible without a device that can detect the X-rays and produce images. We need X-ray cameras. The challenges LCLS faces include the X-rays' high repetition rate of 120 Hz, short pulses that can reach 200 femtoseconds, and extreme peak brightness. We need detectors that are compatible with this light source, but before they can be used in the facilities, they must first be characterized. My project was to do just that, by making a computer simulation program. My presentation discusses the individual detectors I simulated, the details of my program, and how my project will help determine which detector is most useful for a specific experiment.

  3. Slat Noise Simulations: Status and Challenges

    NASA Technical Reports Server (NTRS)

    Choudhari, Meelan M.; Lockard, David P.; Khorrami, Mehdi R.; Mineck, Raymond E.

    2011-01-01

    Noise radiation from the leading edge slat of a high-lift system is known to be an important component of aircraft noise during approach. NASA's Langley Research Center is engaged in a coordinated series of investigations combining high-fidelity numerical simulations and detailed wind tunnel measurements of a generic, unswept, 3-element, high-lift configuration. The goal of this effort is to provide a validated predictive capability that would enable identification of the dominant noise source mechanisms and, ultimately, help develop physics inspired concepts for reducing the far-field acoustic intensity. This paper provides a brief overview of the current status of the computational effort and describes new findings pertaining to the effects of the angle of attack on the aeroacoustics of the slat cove region. Finally, the interplay of the simulation campaign with the concurrently evolving development of a benchmark dataset for an international workshop on airframe noise is outlined.

  4. Bistable behavior of the lac operon in E. coli when induced with a mixture of lactose and TMG.

    PubMed

    Díaz-Hernández, Orlando; Santillán, Moisés

    2010-01-01

    In this work we investigate multistability in the lac operon of Escherichia coli when it is induced by a mixture of lactose and the non-metabolizable thiomethyl galactoside (TMG). In accordance with previously published experimental results and computer simulations, our simulations predict that: (1) when the system is induced by TMG, it shows discernible bistable behavior, whereas (2) when the system is induced by lactose, bistability does not disappear, but excessively high lactose concentrations would be required to observe it. Finally, our simulation results predict that when a mixture of lactose and TMG is used, the bistability region in the extracellular glucose concentration vs. extracellular lactose concentration parameter space changes in such a way that the model predictions regarding bistability could be tested experimentally. These experiments could help to resolve a recent controversy regarding the existence of bistability in the lac operon under natural conditions.
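
    The bistability at issue can be illustrated with a one-variable positive-feedback toy model (a sketch only, not the authors' operon model): the same inducer level supports two stable steady states depending on the initial condition. All rate constants below are illustrative.

```python
# Minimal sketch (illustrative, not the authors' model): Hill-type positive
# feedback (n = 2) plus basal expression and first-order dilution. The same
# inducer level yields two different stable steady states.
def dxdt(x, inducer):
    production = inducer * (0.01 + x**2 / (1.0 + x**2))
    return production - 0.1 * x        # linear dilution/degradation

for x0 in (0.0, 5.0):                  # start uninduced vs pre-induced
    x = x0
    for _ in range(20000):             # crude forward-Euler relaxation to t = 200
        x += 0.01 * dxdt(x, inducer=0.3)
    print(f"x0 = {x0} -> steady state x = {x:.3f}")   # low vs high state
```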

  5. Optimizing product life cycle processes in design phase

    NASA Astrophysics Data System (ADS)

    Faneye, Ola. B.; Anderl, Reiner

    2002-02-01

    Life cycle concepts not only serve as a basis for helping product developers understand the dependencies between products and their life cycles; they also help in identifying potential opportunities for improvement in products. Common traditional concepts focus mainly on energy and material flow across life phases, necessitating metrics derived from a reference product. Knowledge of life cycle processes gained from an existing product is directly reused in its redesign. Nevertheless, depending on sales volume, the environmental impact before product optimization can be substantial. With modern information technologies, computer-aided life cycle methodologies can be applied well before product use. On the basis of a virtual prototype, life cycle processes are analyzed and optimized using simulation techniques. This preventive approach not only helps to minimize (or even eliminate) environmental burdens caused by the product; it also avoids the costs incurred by changes to the real product. The paper highlights the relationship between product and life cycle and presents a computer-based methodology for optimizing the product life cycle during design, as developed by SFB 392: Design for Environment - Methods and Tools at Technical University Darmstadt.

  6. A breakthrough for experiencing and understanding simulated physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1988-01-01

    The use of computer simulation in physics research is discussed, focusing on improvements to graphics workstations. Simulation capabilities and applications of enhanced visualization tools are outlined. The elements of an ideal computer simulation are presented and the potential for improving each element is examined. The human-computer interface and simulation models are also considered. Recommendations are made for changes in computer simulation practices and for applications of simulation technology in education.

  7. Inquiry style interactive virtual experiments: a case on circular motion

    NASA Astrophysics Data System (ADS)

    Zhou, Shaona; Han, Jing; Pelz, Nathaniel; Wang, Xiaojun; Peng, Liangyu; Xiao, Hua; Bao, Lei

    2011-11-01

    Interest in computer-based learning, especially in the use of virtual reality simulations, is increasing rapidly. While there are good reasons to believe that technologies have the potential to improve teaching and learning, how to utilize the technology effectively in teaching specific content difficulties is challenging. To help students develop robust understandings of correct physics concepts, we have developed interactive virtual experiment simulations with the unique feature of enabling students to experience force and motion via an analogue joystick, allowing them to feel the applied force and simultaneously see its effects. The simulations provide students with learning experiences that integrate both scientific representations and low-level sensory cues, such as haptic cues, in a single setting. In this paper, we introduce a virtual experiment module on circular motion. A controlled study has been conducted to evaluate the impact of using this virtual experiment on students' learning of force and motion in the context of circular motion. The results show that the interactive virtual experiment method is preferred by students and is more effective in helping students grasp the physics concepts than traditional methods such as problem-solving practice. Our research suggests that well-developed interactive virtual experiments can be useful tools for teaching difficult concepts in science.

  8. Numerical simulation of blast wave propagation in vicinity of standalone prism on flat plate

    NASA Astrophysics Data System (ADS)

    Valger, Svetlana; Fedorova, Natalya; Fedorov, Alexander

    2018-03-01

    In this paper, numerical simulation of shock wave propagation in the vicinity of a standalone prism, and of a prism with a cavity in front of it, was carried out. The modeling was based on the solution of the 3D Euler equations, with the Fluent software as the main computational tool. An algorithm for local dynamic mesh adaptation to high pressure gradients was applied. The initial stage of the explosion of a condensed explosive was described with the help of the "compressed balloon" method. The research describes the characteristic stages of a blast in a semi-closed space, the structure of secondary shock waves, and their interaction with obstacles. The numerical approach in Fluent, which combines inviscid gas dynamics with the compressed balloon method, was compared with the method the authors had used earlier in AUTODYN, which is based on a hydrodynamic material model describing the state of the detonation products. For the problem of shock wave propagation in the vicinity of the standalone prism, the simulation results obtained with both methods were compared with experimental data on the time dependence of static pressure and effective momentum at characteristic points on the prism walls.

  9. SQUEEZE-E: The Optimal Solution for Molecular Simulations with Periodic Boundary Conditions.

    PubMed

    Wassenaar, Tsjerk A; de Vries, Sjoerd; Bonvin, Alexandre M J J; Bekker, Henk

    2012-10-09

    In molecular simulations of macromolecules, it is desirable to limit the amount of solvent in the system to avoid spending computational resources on uninteresting solvent-solvent interactions. As a consequence, periodic boundary conditions are commonly used, with a simulation box chosen as small as possible for a given minimal distance between images. Here, we describe how such a simulation cell can be set up for ensembles, taking into account a priori available or estimable information regarding conformational flexibility. Doing so ensures that any conformation present in the input ensemble will satisfy the distance criterion during the simulation, which helps avoid periodicity artifacts due to conformational changes. The method introduces three new approaches in computational geometry: (1) the derivation of an optimal packing of ensembles, for which the mathematical framework is described; (2) a new method for approximating the α-hull and the contact body for single bodies and ensembles, which is orders of magnitude faster than existing routines, allowing the calculation of packings of large ensembles and/or large bodies; and (3) a routine for searching a combination of three vectors on a discretized contact body forming a reduced base for a lattice with minimal cell volume. The new algorithms reduce the time required to calculate packings of single bodies from minutes or hours to seconds. The use and efficacy of the method is demonstrated for ensembles obtained from NMR, MD simulations, and elastic network modeling. An implementation is available online at http://haddock.chem.uu.nl/services/SQUEEZE/ and as an option for running simulations through the weNMR GRID MD server at http://haddock.science.uu.nl/enmr/services/GROMACS/main.php.
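
    The distance criterion the method guarantees is easy to state in code. The sketch below is a heavy simplification (assumptions: an orthorhombic box, point-cloud conformations, and nearest-neighbor images only): it checks whether every conformation in an ensemble keeps at least `dmin` to its periodic images.

```python
# Minimal sketch (assumptions: orthorhombic box, point-cloud conformations):
# verify the minimum-image-distance criterion that SQUEEZE-E guarantees
# by construction for every conformation in the input ensemble.
import numpy as np
from itertools import product

def satisfies_distance(ensemble, box, dmin):
    # ensemble: list of (n_atoms, 3) arrays; box: (3,) orthorhombic box lengths.
    shifts = [np.array(s) * box for s in product((-1, 0, 1), repeat=3)
              if s != (0, 0, 0)]
    for xyz in ensemble:
        for shift in shifts:
            # All pairwise distances between the body and one periodic image.
            d = np.linalg.norm(xyz[:, None, :] - (xyz + shift)[None, :, :], axis=-1)
            if d.min() < dmin:
                return False
    return True

# Example: two random conformations confined near the center of a 6 nm cube,
# tested against a 2 nm minimal image distance.
ens = [np.random.rand(100, 3) * 2.0 + 2.0 for _ in range(2)]
print(satisfies_distance(ens, np.array([6.0, 6.0, 6.0]), 2.0))   # True
```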

  10. Automated Help System For A Supercomputer

    NASA Technical Reports Server (NTRS)

    Callas, George P.; Schulbach, Catherine H.; Younkin, Michael

    1994-01-01

    Expert-system software was developed to provide an automated system of help displays for users of the supercomputer system at the Ames Research Center Advanced Computer Facility. Users are located at remote computer terminals connected to the supercomputer and to each other via gateway computers, local-area networks, telephone lines, and satellite links. The automated help system answers routine user inquiries about how to use the services of the computer system. It is available 24 hours per day and reduces the burden on human experts, freeing them to concentrate on helping users with complicated problems.

  11. A multifrequency virtual spectrometer for complex bio-organic systems: vibronic and environmental effects on the UV/Vis spectrum of chlorophyll a.

    PubMed

    Barone, Vincenzo; Biczysko, Malgorzata; Borkowska-Panek, Monika; Bloino, Julien

    2014-10-20

    The subtle interplay of several different effects means that the interpretation and analysis of experimental spectra in terms of structural and dynamic characteristics is a challenging task. In this context, theoretical studies can be helpful, and as such, computational spectroscopy is rapidly evolving from a highly specialized research field toward a versatile and widespread tool. However, in the case of electronic spectra (e.g. UV/Vis, circular dichroism, photoelectron, and X-ray spectra), the most commonly used methods still rely on the computation of vertical excitation energies, which are further convoluted to simulate line shapes. Such treatment completely neglects the influence of nuclear motions, despite the well-recognized notion that a proper account of vibronic effects is often mandatory to correctly interpret experimental findings. Development and validation of improved models rooted into density functional theory (DFT) and its time-dependent extension (TD-DFT) is of course instrumental for the optimal balance between reliability and favorable scaling with the number of electrons. However, the implementation of easy-to-use and effective procedures to simulate vibrationally resolved electronic spectra, and their availability to a wide community of users, is at least equally important for reliable simulations of spectral line shapes for compounds of biological and technological interest. Here, such an approach has been applied to the study of the UV/Vis spectra of chlorophyll a. The results show that properly tailored approaches are feasible for state-of-the-art computational spectroscopy studies, and allow, with affordable computational resources, vibrational and environmental effects on the spectral line shapes to be taken into account for large systems. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. An intelligent robot for helping astronauts

    NASA Technical Reports Server (NTRS)

    Erickson, J. D.; Grimm, K. A.; Pendleton, T. W.

    1994-01-01

    This paper describes the development status of a prototype supervised intelligent robot for space application, intended (1) to help the crew of a spacecraft such as the Space Station with various tasks, including holding objects and retrieving/replacing tools and other objects from/into storage, and (2) to retrieve detached objects, such as equipment or crew, that have become separated from their spacecraft. In addition to this set of tasks in the low-Earth-orbit spacecraft environment, it is argued that certain aspects of the technology are generic in approach, thereby offering insight into intelligent robots for other tasks and environments. Candidate software architectures, and the key technical issues that enable real work in real environments to be accomplished safely and robustly, are addressed. Results of computer simulations of grasping floating objects are presented. Also described are characterization results on the usable reduced-gravity environment in an aircraft flying parabolas (to simulate weightlessness) and results on hardware performance there. These results show it is feasible to use that environment for evaluative testing of dexterous grasping based on real-time vision of freely rotating and translating objects.

  13. Incorrect support and missing center tolerances of phasing algorithms

    DOE PAGES

    Huang, Xiaojing; Nelson, Johanna; Steinbrener, Jan; ...

    2010-01-01

    In x-ray diffraction microscopy, iterative algorithms retrieve reciprocal space phase information, and a real space image, from an object's coherent diffraction intensities through the use of a priori information such as a finite support constraint. In many experiments, the object's shape or support is not well known, and the diffraction pattern is incompletely measured. We describe here computer simulations to look at the effects of both of these possible errors when using several common reconstruction algorithms. Overly tight object supports prevent successful convergence; however, we show that this can often be recognized through pathological behavior of the phase retrieval transfer function. Dynamic range limitations often make it difficult to record the central speckles of the diffraction pattern. We show that this leads to increasing artifacts in the image when the number of missing central speckles exceeds about 10, and that the removal of unconstrained modes from the reconstructed image is helpful only when the number of missing central speckles is less than about 50. In conclusion, this simulation study helps in judging the reconstructability of experimentally recorded coherent diffraction patterns.
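
    A minimal version of the kind of reconstruction being tested is the classic error-reduction loop with a finite-support constraint (a generic sketch, not the paper's code); the overly tight support failure mode discussed above corresponds to shrinking the `support` mask below the true object.

```python
# Minimal sketch (generic error reduction, not the paper's code): iterate
# between imposing measured Fourier magnitudes and a real-space support mask
# on simulated diffraction data.
import numpy as np

rng = np.random.default_rng(1)
n = 64
obj = np.zeros((n, n)); obj[24:40, 24:40] = rng.random((16, 16))  # true object
magnitudes = np.abs(np.fft.fft2(obj))                             # "measured" data

support = np.zeros((n, n), dtype=bool); support[20:44, 20:44] = True
guess = rng.random((n, n)) * support
for _ in range(500):
    F = np.fft.fft2(guess)
    F = magnitudes * np.exp(1j * np.angle(F))    # impose measured magnitudes
    guess = np.real(np.fft.ifft2(F))
    guess[~support] = 0.0                        # impose finite support
    guess[guess < 0] = 0.0                       # optional positivity

err = np.linalg.norm(np.abs(np.fft.fft2(guess)) - magnitudes) / np.linalg.norm(magnitudes)
print(f"relative Fourier-magnitude error: {err:.3f}")
```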

  14. Calibration of an agricultural-hydrological model (RZWQM2) using surrogate global optimization

    DOE PAGES

    Xi, Maolong; Lu, Dan; Gui, Dongwei; ...

    2016-11-27

    Robust calibration of an agricultural-hydrological model is critical for simulating crop yield and water quality and for making reasonable agricultural management decisions. However, calibration of agricultural-hydrological system models is challenging because of model complexity, strong parameter correlation, and significant computational requirements. Therefore, only a limited number of simulations can be allowed in any attempt to find a near-optimal solution within an affordable time, which greatly restricts the successful application of the model. The goal of this study is to locate the optimal solution of the Root Zone Water Quality Model (RZWQM2) given a limited simulation time, so as to improve the model simulation and help make rational and effective agricultural-hydrological decisions. To this end, we propose a computationally efficient global optimization procedure using sparse-grid based surrogates. We first used advanced sparse grid (SG) interpolation to construct a surrogate system of the actual RZWQM2, and then we calibrated the surrogate model using the global optimization algorithm Quantum-behaved Particle Swarm Optimization (QPSO). As the surrogate model is a polynomial that is fast to evaluate, it can be evaluated a sufficiently large number of times during the optimization, which facilitates the global search. We calibrated seven model parameters against five years of yield, drain flow, and NO3-N loss data from a subsurface-drained corn-soybean field in Iowa. Results indicate that an accurate surrogate model can be created for the RZWQM2 with a relatively small number of SG points (i.e., RZWQM2 runs). Compared to the conventional QPSO algorithm, our surrogate-based optimization method achieves a smaller objective function value and better calibration performance using fewer expensive RZWQM2 executions, which greatly improves computational efficiency.
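
    The overall procedure, fitting a cheap surrogate to a handful of expensive model runs and then searching the surrogate globally, can be sketched as follows. The quadratic least-squares surrogate and the simplified particle swarm below stand in for the paper's sparse-grid interpolant and QPSO, and `expensive_model` is purely illustrative.

```python
# Minimal sketch (not RZWQM2 or the authors' sparse-grid code): fit a cheap
# polynomial surrogate to a few runs of an "expensive" model, then calibrate
# the surrogate with a basic particle-swarm search. All names are illustrative.
import numpy as np

rng = np.random.default_rng(2)

def expensive_model(p):                      # stand-in for an RZWQM2 run
    return (p[0] - 0.3) ** 2 + (p[1] + 0.5) ** 2

# 1) Design points and a quadratic surrogate fitted by least squares.
X = rng.uniform(-1, 1, size=(25, 2))
y = np.array([expensive_model(p) for p in X])
def features(P):
    return np.column_stack([np.ones(len(P)), P[:, 0], P[:, 1],
                            P[:, 0]**2, P[:, 1]**2, P[:, 0]*P[:, 1]])
coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
surrogate = lambda P: features(P) @ coef     # cheap to evaluate many times

# 2) Particle swarm on the surrogate (simplified global search).
pos = rng.uniform(-1, 1, size=(40, 2)); vel = np.zeros_like(pos)
pbest, pval = pos.copy(), surrogate(pos)
for _ in range(100):
    g = pbest[np.argmin(pval)]               # global best so far
    vel = 0.7*vel + 1.5*rng.random((40, 1))*(pbest - pos) \
                  + 1.5*rng.random((40, 1))*(g - pos)
    pos = np.clip(pos + vel, -1, 1)
    val = surrogate(pos)
    improved = val < pval
    pbest[improved], pval[improved] = pos[improved], val[improved]

print("calibrated parameters:", pbest[np.argmin(pval)])   # near (0.3, -0.5)
```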

  15. Calibration of an agricultural-hydrological model (RZWQM2) using surrogate global optimization

    NASA Astrophysics Data System (ADS)

    Xi, Maolong; Lu, Dan; Gui, Dongwei; Qi, Zhiming; Zhang, Guannan

    2017-01-01

    Robust calibration of an agricultural-hydrological model is critical for simulating crop yield and water quality and for making reasonable agricultural management decisions. However, calibration of agricultural-hydrological system models is challenging because of model complexity, strong parameter correlation, and significant computational requirements. Therefore, only a limited number of simulations can be allowed in any attempt to find a near-optimal solution within an affordable time, which greatly restricts the successful application of the model. The goal of this study is to locate the optimal solution of the Root Zone Water Quality Model (RZWQM2) given a limited simulation time, so as to improve the model simulation and help make rational and effective agricultural-hydrological decisions. To this end, we propose a computationally efficient global optimization procedure using sparse-grid based surrogates. We first used advanced sparse grid (SG) interpolation to construct a surrogate system of the actual RZWQM2, and then we calibrated the surrogate model using the global optimization algorithm Quantum-behaved Particle Swarm Optimization (QPSO). As the surrogate model is a polynomial that is fast to evaluate, it can be evaluated a sufficiently large number of times during the optimization, which facilitates the global search. We calibrated seven model parameters against five years of yield, drain flow, and NO3-N loss data from a subsurface-drained corn-soybean field in Iowa. Results indicate that an accurate surrogate model can be created for the RZWQM2 with a relatively small number of SG points (i.e., RZWQM2 runs). Compared to the conventional QPSO algorithm, our surrogate-based optimization method achieves a smaller objective function value and better calibration performance using fewer expensive RZWQM2 executions, which greatly improves computational efficiency.

  16. Calibration of an agricultural-hydrological model (RZWQM2) using surrogate global optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xi, Maolong; Lu, Dan; Gui, Dongwei

    Robust calibration of an agricultural-hydrological model is critical for simulating crop yield and water quality and for making reasonable agricultural management decisions. However, calibration of agricultural-hydrological system models is challenging because of model complexity, strong parameter correlation, and significant computational requirements. Therefore, only a limited number of simulations can be allowed in any attempt to find a near-optimal solution within an affordable time, which greatly restricts the successful application of the model. The goal of this study is to locate the optimal solution of the Root Zone Water Quality Model (RZWQM2) given a limited simulation time, so as to improve the model simulation and help make rational and effective agricultural-hydrological decisions. To this end, we propose a computationally efficient global optimization procedure using sparse-grid based surrogates. We first used advanced sparse grid (SG) interpolation to construct a surrogate system of the actual RZWQM2, and then we calibrated the surrogate model using the global optimization algorithm Quantum-behaved Particle Swarm Optimization (QPSO). As the surrogate model is a polynomial that is fast to evaluate, it can be evaluated a sufficiently large number of times during the optimization, which facilitates the global search. We calibrated seven model parameters against five years of yield, drain flow, and NO3-N loss data from a subsurface-drained corn-soybean field in Iowa. Results indicate that an accurate surrogate model can be created for the RZWQM2 with a relatively small number of SG points (i.e., RZWQM2 runs). Compared to the conventional QPSO algorithm, our surrogate-based optimization method achieves a smaller objective function value and better calibration performance using fewer expensive RZWQM2 executions, which greatly improves computational efficiency.

  17. Dual-energy contrast-enhanced digital mammography (DE-CEDM): optimization on digital subtraction with practical x-ray low/high-energy spectra

    NASA Astrophysics Data System (ADS)

    Chen, Biao; Jing, Zhenxue; Smith, Andrew P.; Parikh, Samir; Parisky, Yuri

    2006-03-01

    Dual-energy contrast-enhanced digital mammography (DE-CEDM), which is based upon the digital subtraction of low/high-energy image pairs acquired before/after the administration of contrast agents, may provide physicians with physiologic and morphologic information on breast lesions and help characterize their probability of malignancy. This paper proposes to use only one pair of post-contrast low/high-energy images to obtain digitally subtracted dual-energy contrast-enhanced images, with an optimal weighting factor deduced from simulated characteristics of the imaging chain. Based upon our previous CEDM framework, quantitative characteristics of the materials and imaging components in the x-ray imaging chain, including the x-ray tube (tungsten) spectrum, filters, breast tissues/lesions, contrast agents (non-ionic iodine solution), and selenium detector, were systematically modeled. Using the base-material (polyethylene-PMMA) decomposition method based on entrance low/high-energy x-ray spectra and breast thickness, the optimal weighting factor was calculated to cancel the contrast between fatty and glandular tissues while enhancing the contrast of iodized lesions. By contrast, previous work determined the optimal weighting factor through either a calibration step or the acquisition of a pre-contrast low/high-energy image pair. Computer simulations were conducted to determine weighting factors, lesion contrast signal values, and dose levels as functions of x-ray techniques and breast thicknesses. Phantom and clinical feasibility studies were performed on a modified Selenia full-field digital mammography system to verify the proposed method and the computer-simulated results. The conclusions from the computer simulations and phantom/clinical feasibility studies will be used in the upcoming clinical study.
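
    The core of the method is a weighted log-subtraction: with the weighting factor chosen from the effective attenuations, tissue-composition contrast cancels while iodine contrast survives. A sketch with illustrative, uncalibrated attenuation values:

```python
# Minimal sketch of weighted dual-energy log-subtraction (illustrative values,
# not the authors' full spectral model). The weight w cancels the gland/fat
# difference, so only the iodine signal remains in the subtracted image.
mu_low  = {"gland": 0.80, "fat": 0.45, "iodine": 2.00}   # assumed, per cm
mu_high = {"gland": 0.40, "fat": 0.25, "iodine": 0.50}
w = (mu_low["gland"] - mu_low["fat"]) / (mu_high["gland"] - mu_high["fat"])

def de_signal(cm_gland, cm_fat, cm_iodine=0.0):
    # Log-domain signals are linear in material thickness.
    low  = cm_gland*mu_low["gland"] + cm_fat*mu_low["fat"] + cm_iodine*mu_low["iodine"]
    high = cm_gland*mu_high["gland"] + cm_fat*mu_high["fat"] + cm_iodine*mu_high["iodine"]
    return low - w * high              # weighted log subtraction

print(de_signal(4, 0))                 # all-gland pixel ...
print(de_signal(2, 2))                 # ... equals half gland/half fat: tissue cancels
print(de_signal(2, 2, cm_iodine=0.1))  # an iodized lesion stands out
```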

  18. Evaluation of a computational model to predict elbow range of motion

    PubMed Central

    Nishiwaki, Masao; Johnson, James A.; King, Graham J. W.; Athwal, George S.

    2014-01-01

    Computer models capable of predicting elbow flexion and extension range of motion (ROM) limits would be useful for assisting surgeons in improving the outcomes of surgical treatment of patients with elbow contractures. A simple and robust computer-based model was developed that predicts elbow joint ROM using bone geometries calculated from computed tomography image data. The model assumes a hinge-like flexion-extension axis and that elbow passive ROM limits are set by terminal bony impingement. The model was validated against experimental results from a cadaveric specimen, and it predicted the flexion and extension limits of the intact joint to within 0° and 3°, respectively. The model also predicted the flexion and extension limits to within 1° and 2°, respectively, when simulated osteophytes were inserted into the joint. Future studies based on this approach will be used for the prediction of elbow flexion-extension ROM in patients with primary osteoarthritis to help identify motion-limiting hypertrophic osteophytes, and will eventually permit real-time computer-assisted navigated excisions. PMID:24841799

  19. Design and implementation of the modified signed digit multiplication routine on a ternary optical computer.

    PubMed

    Xu, Qun; Wang, Xianchao; Xu, Chao

    2017-06-01

    Multiplication on traditional electronic computers suffers from limited calculating accuracy and long computation delays. To overcome these problems, a modified signed digit (MSD) multiplication routine is established based on the MSD number system and a carry-free adder, and its parallel algorithm and optimization techniques are studied in detail. Exploiting the characteristics of a ternary optical computer, a structured data processor is designed especially for the multiplication routine. Several ternary optical operators are constructed to perform M transformations and summations in parallel, which accelerates the iterative process of multiplication. In particular, the routine allocates data bits of the ternary optical processor according to the digits of the multiplication input, so the accuracy of the calculated results can always satisfy the user's requirements. Finally, the routine is verified by simulation experiments, and the results are in full compliance with expectations. Compared with an electronic computer, the MSD multiplication routine is not only good at dealing with large-value data and high-precision arithmetic, but also maintains lower power consumption and shorter calculation delays.
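
    The signed-digit idea itself is easy to demonstrate. The sketch below (illustrative, not the optical routine) encodes one operand in the canonical non-adjacent form, a standard MSD encoding with digits in {-1, 0, 1}, and forms the product as a sum of shifted, possibly negated partial products; on the optical processor such partial sums would be combined in parallel.

```python
# Minimal sketch (not the optical routine): radix-2 signed digits {-1, 0, 1}
# via the non-adjacent form (NAF), and multiplication as a sum of shifted,
# possibly negated partial products, verified against ordinary arithmetic.
def to_msd(n):
    # Non-adjacent form: a standard signed-digit encoding, least digit first.
    digits = []
    while n != 0:
        if n % 2:
            z = 2 - (n % 4)            # z in {-1, +1}
            n -= z
        else:
            z = 0
        digits.append(z)
        n //= 2
    return digits or [0]

def msd_value(digits):
    return sum(d << i for i, d in enumerate(digits))

def msd_multiply(a, b):
    # Each nonzero digit of a contributes one shifted (negated if -1) copy of b.
    return sum(d * (b << i) for i, d in enumerate(to_msd(a)))

a, b = 2017, -58
assert msd_multiply(a, b) == a * b
print(to_msd(7), "->", msd_value(to_msd(7)))   # [-1, 0, 0, 1] encodes 7 = 8 - 1
```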

  20. Symplectic molecular dynamics simulations on specially designed parallel computers.

    PubMed

    Borstnik, Urban; Janezic, Dusanka

    2005-01-01

    We have developed a computer program for molecular dynamics (MD) simulation that implements the Split Integration Symplectic Method (SISM) and is designed to run on specialized parallel computers. The MD integration is performed by the SISM, which treats high-frequency vibrational motion analytically and thus enables the use of longer simulation time steps. The low-frequency motion is treated numerically on specially designed parallel computers, which decreases the computational time of each simulation time step. Together, these approaches require fewer and cheaper time steps, enabling fast MD simulations. We study the computational performance of MD simulation of molecular systems on specialized computers and provide a comparison to standard personal computers. The combination of the SISM with two specialized parallel computers is an effective way to increase the speed of MD simulations up to 16-fold over a single PC processor.
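
    The splitting idea can be shown on a single stiff oscillator: the harmonic (high-frequency) part is propagated analytically as a phase-space rotation, and the remaining soft force is applied as numeric kicks. This is a sketch of the general split-integration idea, not the SISM code; the spring constant, perturbation, and step size are illustrative.

```python
# Minimal sketch of split integration (illustrative, not the SISM code): the
# stiff harmonic part is advanced exactly as a rotation in phase space, while
# a weak anharmonic force is applied as half-kicks around it.
import math

k, m, eps = 100.0, 1.0, 0.5          # stiff spring, mass, weak cubic perturbation
omega = math.sqrt(k / m)
dt, steps = 0.05, 2000               # step size well above a naive explicit limit

def slow_force(x):
    return -eps * x**3               # stands in for low-frequency interactions

x, v = 1.0, 0.0
for _ in range(steps):
    v += 0.5 * dt * slow_force(x) / m                       # half kick (numeric)
    c, s = math.cos(omega * dt), math.sin(omega * dt)
    x, v = x * c + (v / omega) * s, v * c - x * omega * s   # exact harmonic drift
    v += 0.5 * dt * slow_force(x) / m                       # half kick

energy = 0.5 * m * v * v + 0.5 * k * x * x + 0.25 * eps * x**4
print(f"energy after {steps} steps: {energy:.4f} (started at {0.5*k + 0.25*eps:.4f})")
```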

  1. The effects of computer-based dynamic visualization simulations on student learning in high school science

    NASA Astrophysics Data System (ADS)

    Moodley, Sadha

    The purpose of this study was to determine whether dynamic computer-based visualizations of the classical model of particle behavior help to improve student understanding, performance, and interest in science when used by teachers as visual presentations to complement their traditional methods of teaching. The software, Virtual Molecular Dynamics Laboratory (VMDL), was developed at the Center for Polymer Studies at Boston University through funding from the National Science Foundation. The design of the study included five pairs of classes in four different schools in New England, from the inner city and from advantaged suburbs. The study employed a treatment-control group design for testing the impact of several VMDL simulations on student learning in several content areas from traditional chemistry and physical science courses, using a mixed qualitative and quantitative design. The quantitative part involved administering the Group Assessment of Logical Thinking (GALT) as well as post-tests that were topic specific. An Analysis of Covariance (ANCOVA) was conducted on the test scores with the GALT scores serving as a covariate. Results of the ANCOVA showed that students' understanding and performance were better in classes where teachers used the computer-based dynamic visualizations to complement their traditional teaching. GALT scores were significantly different among schools but very similar within schools; they were significant in adjusting post-test scores for pre-treatment differences in only two of the schools. The treatment groups outscored the control groups in all five comparisons, and the mean differences reached statistical significance at the p < .01 level in four of the five comparisons. The qualitative part of the study involved classroom observations and student interviews. Analysis of classroom observations revealed a shift in classroom dynamics toward more learner-centeredness, with greater engagement by students, especially in classes that tended to have little student participation without the simulations. Analysis of the student interviews indicated that the dynamic visualizations made learning more enjoyable, helped with remembering, and enhanced students' abilities to make connections between nanoscopic and macroscopic descriptions of matter.

  2. Augmenting Sand Simulation Environments through Subdivision and Particle Refinement

    NASA Astrophysics Data System (ADS)

    Clothier, M.; Bailey, M.

    2012-12-01

    Recent advances in computer graphics and parallel processing hardware have provided disciplines with new methods to evaluate and visualize data. These advances have proven useful for earth and planetary scientists, as many researchers are using this hardware to process large amounts of data for analysis, and this has provided opportunities for collaboration between computer graphics and the earth sciences. Through collaboration with the Oregon Space Grant and IGERT Ecosystem Informatics programs, we are investigating techniques for simulating the behavior of sand. We are also collaborating with the Jet Propulsion Laboratory's (JPL) DARTS Lab to exchange ideas and gain feedback on our research. The DARTS Lab specializes in simulation of planetary vehicles, such as the Mars rovers; their simulations utilize a virtual "sand box" to test how a planetary vehicle responds to different environments. Our research builds upon this idea to create a sand simulation framework in which planetary environments, such as the harsh, sandy regions on Mars, are more fully realized. More specifically, we focus on the interaction between a planetary vehicle, such as a rover, and the sand beneath it, providing further insight into its performance. This can be a computationally complex problem, especially when representing the enormous quantities of sand particles interacting with each other. However, through the use of high-performance computing, we have developed a technique to subdivide areas of actively participating sand regions across a large landscape. Similar to a Level of Detail (LOD) technique, we only subdivide regions of the landscape where sand particles are actively interacting with another object. While the sand is within this subdivision window and moves closer to the surface of the interacting object, the sand region subdivides into smaller regions until individual sand particles are left at the surface. For example, sand that is actively interacting with a rover wheel is represented as individual particles, whereas sand further below the surface is represented by larger regions of sand. As a result, this technique allows many particles to be represented without the full computational cost of simulating every particle individually. We have further generalized these subdivision regions into any volumetric area suitable for use in the simulation; this improvement allows for more compact subdivision regions and helps fine-tune the simulation so that more emphasis can be placed on regions of actively participating sand. We believe that, through this generalization, our research can provide further opportunities within the earth and planetary sciences, and we continue to refine the technique in collaboration with our academic colleagues.
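
    In its simplest 2D form, the subdivision-window idea reduces to a quadtree that refines only around the contact point, as in this illustrative sketch (not the authors' implementation):

```python
# Minimal sketch (illustrative) of the level-of-detail idea: square sand
# regions subdivide only where an object interacts, so particle-level detail
# exists just near the contact point while distant sand stays coarse.
def subdivide(region, contact, min_size):
    # region: (x, y, size) square; contact: (cx, cy) interaction point.
    x, y, size = region
    inside = x <= contact[0] < x + size and y <= contact[1] < y + size
    if not inside or size <= min_size:
        return [region]                       # coarse block or particle-level cell
    half = size / 2
    children = [(x, y, half), (x + half, y, half),
                (x, y + half, half), (x + half, y + half, half)]
    out = []
    for child in children:
        out.extend(subdivide(child, contact, min_size))
    return out

cells = subdivide((0.0, 0.0, 64.0), contact=(10.0, 3.0), min_size=1.0)
print(len(cells), "cells; finest near the contact:", min(c[2] for c in cells))
```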

  3. Exploring GPCR-Lipid Interactions by Molecular Dynamics Simulations: Excitements, Challenges, and the Way Forward.

    PubMed

    Sengupta, Durba; Prasanna, Xavier; Mohole, Madhura; Chattopadhyay, Amitabha

    2018-06-07

    G protein-coupled receptors (GPCRs) are seven-transmembrane receptors that mediate a large number of cellular responses and are important drug targets. One of the current challenges in GPCR biology is to analyze the molecular signatures of receptor-lipid interactions and their subsequent effects on GPCR structure, organization, and function. Molecular dynamics simulation studies have been successful in predicting molecular determinants of receptor-lipid interactions. In particular, predicted cholesterol interaction sites appear to correspond well with experimentally determined binding sites and estimated time scales of association. In spite of several success stories, the methodologies in molecular dynamics simulations are still emerging. In this Feature Article, we provide a comprehensive overview of coarse-grain and atomistic molecular dynamics simulations of GPCR-lipid interactions in the context of experimental observations. In addition, we discuss the effect of secondary and tertiary structural constraints in coarse-grain simulations in the context of functional dynamics and structural plasticity of GPCRs. We envision that this comprehensive overview will help resolve differences in computational studies and provide a way forward.

  4. Implicit integration methods for dislocation dynamics

    DOE PAGES

    Gardner, D. J.; Woodward, C. S.; Reynolds, D. R.; ...

    2015-01-20

    In dislocation dynamics simulations, strain hardening simulations require integrating stiff systems of ordinary differential equations in time with expensive force calculations, discontinuous topological events, and rapidly changing problem size. Current solvers in use often result in small time steps and long simulation times. Faster solvers may help dislocation dynamics simulations accumulate plastic strains at strain rates comparable to experimental observations. This paper investigates the viability of high order implicit time integrators and robust nonlinear solvers to reduce simulation run times while maintaining the accuracy of the computed solution. In particular, implicit Runge-Kutta time integrators are explored as a way of providing greater accuracy over a larger time step than is typically achieved with the standard second-order trapezoidal method. In addition, both accelerated fixed point and Newton's method are investigated to provide fast and effective solves for the nonlinear systems that must be resolved within each time step. Results show that integrators of third order are the most effective, while accelerated fixed point and Newton's method both improve solver performance over the standard fixed point method used for the solution of the nonlinear systems.
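
    As a baseline for what the paper improves upon, the second-order trapezoidal method with a Newton solve per step can be sketched on a stiff scalar test problem (illustrative, not the dislocation dynamics code):

```python
# Minimal sketch (illustrative, not the paper's solvers): the implicit
# trapezoidal rule on a stiff test ODE, with each step's nonlinear equation
# solved by Newton iteration.
def f(y):
    return -1000.0 * (y - 1.0) + (y - 1.0) ** 3   # stiff, mildly nonlinear

def dfdy(y):
    return -1000.0 + 3.0 * (y - 1.0) ** 2

def trapezoidal_step(y0, dt, tol=1e-12, max_iter=20):
    # Solve g(y1) = y1 - y0 - dt/2 * (f(y0) + f(y1)) = 0 for y1.
    y1 = y0                                        # predictor: previous value
    for _ in range(max_iter):
        g = y1 - y0 - 0.5 * dt * (f(y0) + f(y1))
        step = g / (1.0 - 0.5 * dt * dfdy(y1))     # Newton correction
        y1 -= step
        if abs(step) < tol:
            break
    return y1

y, dt = 2.0, 0.05                                  # dt far above the explicit limit
for _ in range(40):
    y = trapezoidal_step(y, dt)
print("steady state reached:", y)                  # approaches 1.0
```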

  5. Virtual planning for craniomaxillofacial surgery--7 years of experience.

    PubMed

    Adolphs, Nicolai; Haberl, Ernst-Johannes; Liu, Weichen; Keeve, Erwin; Menneking, Horst; Hoffmeister, Bodo

    2014-07-01

    Contemporary computer-assisted surgery systems increasingly allow for virtual simulation of even complex surgical procedures with increasingly realistic predictions. Preoperative workflows are established and different commercial software solutions are available. The potential and feasibility of virtual craniomaxillofacial surgery as an additional planning tool were assessed retrospectively by comparing predictions with surgical results. Since 2006, virtual simulation has been performed in selected patients affected by complex craniomaxillofacial disorders (n = 8) in addition to standard surgical planning based on patient-specific 3D models. Virtual planning could be performed for all levels of the craniomaxillofacial framework within a reasonable preoperative workflow. Simulation of even complex skeletal displacements corresponded well with the real surgical results, and soft tissue simulation proved to be helpful. In combination with classic 3D models showing the underlying skeletal pathology, virtual simulation improved the planning and transfer of craniomaxillofacial corrections. The additional work and expense may be justified by the increased possibilities for visualisation, information, instruction and documentation in selected craniomaxillofacial procedures. Copyright © 2013 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  6. BioNetSim: a Petri net-based modeling tool for simulations of biochemical processes.

    PubMed

    Gao, Junhui; Li, Li; Wu, Xiaolin; Wei, Dong-Qing

    2012-03-01

    BioNetSim, Petri net-based software for modeling and simulating biochemical processes, is developed; its design and implementation are presented in this paper, including the logic construction, real-time access to KEGG (Kyoto Encyclopedia of Genes and Genomes), and the BioModel database. Furthermore, glycolysis is simulated as an example of its application. BioNetSim is a helpful tool for researchers to download data, model biological networks, and simulate complicated biochemical processes. Gene regulatory networks, metabolic pathways, signaling pathways, and kinetics of cell interaction are all available in BioNetSim, which makes modeling more efficient and effective. Like other Petri net-based software, BioNetSim does well in graphic application and mathematical construction. Moreover, it offers several powerful advantages: (1) it creates models in a database; (2) it provides real-time access to KEGG and BioModel and transfers the data to Petri nets; (3) it provides qualitative analysis, such as computation of constants; and (4) it generates graphs for tracing the concentration of every molecule during the simulation.
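
    The underlying formalism is compact: places hold token counts, and transitions fire by consuming and producing tokens when enabled. A toy net for a single glycolysis-like reaction (an illustration of the formalism, not BioNetSim's implementation):

```python
# Minimal sketch of a Petri net (illustrative, not BioNetSim itself): places,
# transitions with pre/post token counts, and stochastic firing until no
# transition is enabled.
import random

places = {"glucose": 5, "ATP": 10, "G6P": 0, "ADP": 0}
transitions = [
    # hexokinase step: glucose + ATP -> G6P + ADP
    ({"glucose": 1, "ATP": 1}, {"G6P": 1, "ADP": 1}),
]

def enabled(pre):
    return all(places[p] >= n for p, n in pre.items())

def fire(pre, post):
    for p, n in pre.items():
        places[p] -= n
    for p, n in post.items():
        places[p] += n

random.seed(0)
while True:
    ready = [t for t in transitions if enabled(t[0])]
    if not ready:
        break
    fire(*random.choice(ready))

print(places)   # glucose exhausted: {'glucose': 0, 'ATP': 5, 'G6P': 5, 'ADP': 5}
```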

  7. A Simple Evacuation Modeling and Simulation Tool for First Responders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Daniel B; Payne, Patricia W

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including, most importantly, time-to-evacuate, to help first responders choose the best course of action.

  8. High-Performance Agent-Based Modeling Applied to Vocal Fold Inflammation and Repair.

    PubMed

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y K

    2018-01-01

    Fast and accurate computational biology models offer the prospect of accelerating the development of personalized medicine. A tool capable of estimating treatment success can help prevent unnecessary and costly treatments and potentially harmful side effects. A novel high-performance Agent-Based Model (ABM) was adopted to simulate and visualize multi-scale complex biological processes arising in vocal fold inflammation and repair. The computational scheme was designed to organize the 3D ABM sub-tasks to fully utilize the resources available on current heterogeneous platforms consisting of multi-core CPUs and many-core GPUs. Subtasks are further parallelized and convolution-based diffusion is used to enhance the performance of the ABM simulation. The scheme was implemented using a client-server protocol allowing the results of each iteration to be analyzed and visualized on the server (i.e., in situ) while the simulation is running on the same server. The resulting simulation and visualization software enables users to interact with and steer the course of the simulation in real-time as needed. This high-resolution 3D ABM framework was used for a case study of surgical vocal fold injury and repair. The new framework is capable of completing the simulation, visualization and remote result delivery in under 7 s per iteration, where each iteration of the simulation represents 30 min in the real world. The case study model was simulated at the physiological scale of a human vocal fold. This simulation tracks 17 million biological cells as well as a total of 1.7 billion signaling chemical and structural protein data points. The visualization component processes and renders all simulated biological cells and 154 million signaling chemical data points. The proposed high-performance 3D ABM was verified through comparisons with empirical vocal fold data. Representative trends of biomarker predictions in surgically injured vocal folds were observed.
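
    The convolution-based diffusion mentioned above can be illustrated in a few lines: one explicit diffusion update of a chemical field, expressed as convolution with a Laplacian stencil, which is the form that vectorizes well on many-core hardware. The grid size and coefficients are illustrative.

```python
# Minimal sketch of convolution-based diffusion (illustrative, not the
# authors' GPU kernels): an explicit diffusion update of a chemical field
# written as a convolution with a 5-point Laplacian stencil.
import numpy as np
from scipy.signal import convolve2d

D, dt, dx = 0.1, 0.1, 1.0
field = np.zeros((64, 64)); field[32, 32] = 100.0      # point release of a chemical

lap = np.array([[0,  1, 0],
                [1, -4, 1],
                [0,  1, 0]], dtype=float) / dx**2      # Laplacian as a kernel

for _ in range(100):
    field += dt * D * convolve2d(field, lap, mode="same", boundary="symm")

print("mass conserved:", field.sum())   # ~100 with reflecting boundaries
```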

  9. High-Performance Agent-Based Modeling Applied to Vocal Fold Inflammation and Repair

    PubMed Central

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y. K.

    2018-01-01

    Fast and accurate computational biology models offer the prospect of accelerating the development of personalized medicine. A tool capable of estimating treatment success can help prevent unnecessary and costly treatments and potentially harmful side effects. A novel high-performance Agent-Based Model (ABM) was adopted to simulate and visualize multi-scale complex biological processes arising in vocal fold inflammation and repair. The computational scheme was designed to organize the 3D ABM sub-tasks to fully utilize the resources available on current heterogeneous platforms consisting of multi-core CPUs and many-core GPUs. Subtasks are further parallelized and convolution-based diffusion is used to enhance the performance of the ABM simulation. The scheme was implemented using a client-server protocol allowing the results of each iteration to be analyzed and visualized on the server (i.e., in situ) while the simulation is running on the same server. The resulting simulation and visualization software enables users to interact with and steer the course of the simulation in real-time as needed. This high-resolution 3D ABM framework was used for a case study of surgical vocal fold injury and repair. The new framework is capable of completing the simulation, visualization and remote result delivery in under 7 s per iteration, where each iteration of the simulation represents 30 min in the real world. The case study model was simulated at the physiological scale of a human vocal fold. This simulation tracks 17 million biological cells as well as a total of 1.7 billion signaling chemical and structural protein data points. The visualization component processes and renders all simulated biological cells and 154 million signaling chemical data points. The proposed high-performance 3D ABM was verified through comparisons with empirical vocal fold data. Representative trends of biomarker predictions in surgically injured vocal folds were observed. PMID:29706894

  10. Particle kinetic simulation of high altitude hypervelocity flight

    NASA Technical Reports Server (NTRS)

    Boyd, Iain; Haas, Brian L.

    1994-01-01

    Rarefied flows about hypersonic vehicles entering the upper atmosphere or through nozzles expanding into a near vacuum may only be simulated accurately with a direct simulation Monte Carlo (DSMC) method. Under this grant, researchers enhanced the models employed in the DSMC method and performed simulations in support of existing NASA projects or missions. DSMC models were developed and validated for simulating rotational, vibrational, and chemical relaxation in high-temperature flows, including effects of quantized anharmonic oscillators and temperature-dependent relaxation rates. State-of-the-art advancements were made in simulating coupled vibration-dissociation recombination for post-shock flows. Models were also developed to compute vehicle surface temperatures directly in the code rather than requiring isothermal estimates. These codes were instrumental in simulating aerobraking of NASA's Magellan spacecraft during orbital maneuvers to assess heat transfer and aerodynamic properties of the delicate satellite. NASA also depended upon simulations of entry of the Galileo probe into the atmosphere of Jupiter to provide drag and flow field information essential for accurate interpretation of an onboard experiment. Finally, the codes have been used extensively to simulate expanding nozzle flows in low-power thrusters in support of propulsion activities at NASA-Lewis. Detailed comparisons between continuum calculations and DSMC results helped to quantify the limitations of continuum CFD codes in rarefied applications.

  11. Experimental and Computational Analysis of Unidirectional Flow Through Stirling Engine Heater Head

    NASA Technical Reports Server (NTRS)

    Wilson, Scott D.; Dyson, Rodger W.; Tew, Roy C.; Demko, Rikako

    2006-01-01

    A high efficiency Stirling Radioisotope Generator (SRG) is being developed for possible use in long-duration space science missions. NASA's advanced technology goals for next generation Stirling convertors include increasing the Carnot efficiency and the percentage of Carnot efficiency achieved. To help achieve these goals, a multi-dimensional Computational Fluid Dynamics (CFD) code is being developed to numerically model the unsteady fluid flow and heat transfer phenomena of the oscillating working gas inside Stirling convertors. In the absence of transient pressure drop data for the zero-mean oscillating multi-dimensional flows present in the Technology Demonstration Convertors on test at NASA Glenn Research Center, unidirectional flow pressure drop test data are used for comparison against 2D and 3D computational solutions. This study focuses on tracking pressure drop and mass flow rate data for unidirectional flow through a Stirling heater head using a commercial CFD code (CFD-ACE). The commercial CFD code uses a porous-media model that depends on the permeability and inertial coefficient present in the linear and nonlinear terms of the Darcy-Forchheimer equation. The permeability and inertial coefficient were calculated from unidirectional flow test data. CFD simulations of the unidirectional flow test were validated using these porous-media model input parameters, which increased simulation accuracy by 14 percent on average.
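
    The porous-media model rests on the Darcy-Forchheimer relation, in which the pressure gradient is a viscous (Darcy) term plus an inertial (Forchheimer) term. The sketch below fits the permeability K and inertial coefficient cF from two hypothetical unidirectional (velocity, pressure-gradient) pairs; all numbers are illustrative, not the paper's data.

```python
# Minimal sketch (illustrative numbers, not the paper's data): fit the Darcy-
# Forchheimer coefficients from two unidirectional measurements, then predict
# the pressure gradient at a new velocity.
mu, rho = 4.6e-5, 0.16                 # helium-like viscosity (Pa s), density (kg/m^3)

def dpdx(u, K, cF):
    # u: superficial velocity (m/s); K: permeability (m^2); cF: inertial coeff.
    return mu / K * u + cF * rho / (K ** 0.5) * u ** 2

# Two assumed (velocity, pressure-gradient) pairs:
u1, g1 = 1.0, 5.0e4
u2, g2 = 2.0, 1.4e5
# g = a*u + b*u^2 with a = mu/K and b = cF*rho/sqrt(K); solve the 2x2 system.
b = (g2 / u2 - g1 / u1) / (u2 - u1)
a = g1 / u1 - b * u1
K = mu / a
cF = b * (K ** 0.5) / rho
print(f"K = {K:.3e} m^2, cF = {cF:.3f}, dp/dx at 1.5 m/s = {dpdx(1.5, K, cF):.3e} Pa/m")
```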

  12. Two-dimensional CFD modeling of wave rotor flow dynamics

    NASA Technical Reports Server (NTRS)

    Welch, Gerard E.; Chima, Rodrick V.

    1994-01-01

    A two-dimensional Navier-Stokes solver developed for detailed study of wave rotor flow dynamics is described. The CFD model is helping characterize important loss mechanisms within the wave rotor. The wave rotor stationary ports and the moving rotor passages are resolved on multiple computational grid blocks. The finite-volume form of the thin-layer Navier-Stokes equations with laminar viscosity are integrated in time using a four-stage Runge-Kutta scheme. Roe's approximate Riemann solution scheme or the computationally less expensive advection upstream splitting method (AUSM) flux-splitting scheme is used to effect upwind-differencing of the inviscid flux terms, using cell interface primitive variables set by MUSCL-type interpolation. The diffusion terms are central-differenced. The solver is validated using a steady shock/laminar boundary layer interaction problem and an unsteady, inviscid wave rotor passage gradual opening problem. A model inlet port/passage charging problem is simulated and key features of the unsteady wave rotor flow field are identified. Lastly, the medium pressure inlet port and high pressure outlet port portion of the NASA Lewis Research Center experimental divider cycle is simulated and computed results are compared with experimental measurements. The model accurately predicts the wave timing within the rotor passages and the distribution of flow variables in the stationary inlet port region.

  13. Two-dimensional CFD modeling of wave rotor flow dynamics

    NASA Technical Reports Server (NTRS)

    Welch, Gerard E.; Chima, Rodrick V.

    1993-01-01

    A two-dimensional Navier-Stokes solver developed for detailed study of wave rotor flow dynamics is described. The CFD model is helping characterize important loss mechanisms within the wave rotor. The wave rotor stationary ports and the moving rotor passages are resolved on multiple computational grid blocks. The finite-volume form of the thin-layer Navier-Stokes equations with laminar viscosity are integrated in time using a four-stage Runge-Kutta scheme. The Roe approximate Riemann solution scheme or the computationally less expensive Advection Upstream Splitting Method (AUSM) flux-splitting scheme are used to effect upwind-differencing of the inviscid flux terms, using cell interface primitive variables set by MUSCL-type interpolation. The diffusion terms are central-differenced. The solver is validated using a steady shock/laminar boundary layer interaction problem and an unsteady, inviscid wave rotor passage gradual opening problem. A model inlet port/passage charging problem is simulated and key features of the unsteady wave rotor flow field are identified. Lastly, the medium pressure inlet port and high pressure outlet port portion of the NASA Lewis Research Center experimental divider cycle is simulated and computed results are compared with experimental measurements. The model accurately predicts the wave timing within the rotor passage and the distribution of flow variables in the stationary inlet port region.

  14. Using block pulse functions for seismic vibration semi-active control of structures with MR dampers

    NASA Astrophysics Data System (ADS)

    Rahimi Gendeshmin, Saeed; Davarnia, Daniel

    2018-03-01

    This article applies the idea of block pulse (BP) functions to the semi-active control of structures. BP functions provide effective tools for approximating complex problems; they are fundamental tools in approximation theory and have been applied in disparate areas over recent decades. The control algorithm has a major effect on the performance of the controlled system and on the requirements of the control devices, and in control problems it is important to devise an accurate analytical technique with low computational cost. This study therefore focuses on employing BP functions in the control algorithm to reduce computational cost. Magneto-rheological (MR) dampers are well-known semi-active devices that can be used to control the response of civil structures during earthquakes. For validation purposes, numerical simulations of a 5-story shear building frame with MR dampers are presented. The results of the suggested method were compared with those obtained by controlling the frame with the optimal control method based on linear quadratic regulator theory. The simulation results show that the suggested method can be helpful in reducing seismic structural responses; it has acceptable accuracy and agrees with the optimal control method at lower computational cost.
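
    A block pulse expansion simply replaces a signal by its average on m equal subintervals, which turns integration into a running sum; this is the property that keeps the control computations cheap. An illustrative sketch:

```python
# Minimal sketch of a block pulse (BP) expansion (illustrative): a signal on
# [0, T) is represented by m piecewise-constant coefficients, its average on
# each subinterval, and integration becomes a cumulative sum.
import numpy as np

T, m = 2.0, 16
t = np.linspace(0.0, T, 1601)
f = np.sin(2 * np.pi * t / T) * np.exp(-t)        # stand-in excitation signal

h = T / m
blocks = (t / h).astype(int).clip(max=m - 1)      # which block each sample is in
coeffs = np.array([f[blocks == i].mean() for i in range(m)])  # BP coefficients

f_bp = coeffs[blocks]                             # piecewise-constant reconstruction
print("max approximation error:", np.abs(f - f_bp).max())

# Integration via BP: the running integral is just a cumulative sum of blocks.
integral_bp = np.cumsum(coeffs) * h
print("BP estimate of the full integral:", integral_bp[-1])
```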

  15. Development of Novel PEM Membrane and Multiphase CD Modeling of PEM Fuel Cell

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    K. J. Berry; Susanta Das

    2009-12-30

    To better understand heat and water management phenomena within an operational proton exchange membrane fuel cell (PEMFC), a three-dimensional, two-phase computational fluid dynamic (CFD) flow model has been developed and simulated for a complete PEMFC. Both liquid and gas phases are considered in the model by taking into account the gas flow, diffusion, charge transfer, change of phase, electro-osmosis, and electrochemical reactions, in order to understand the overall dynamic behavior of species within an operating PEMFC. The CFD model is solved numerically under different parametric conditions in terms of water management issues in order to improve cell performance. The results obtained from the CFD two-phase flow model simulations show improvement in cell performance as well as water management under PEMFC operational conditions as compared to the results of a single-phase flow model available in the literature. The quantitative information obtained from the two-phase model simulations helped to develop a CFD control algorithm for low temperature PEM fuel cell stacks, which opens up a route to designing improved PEMFCs with better operational efficiency and performance.

  16. Clinical applications of virtual navigation bronchial intervention.

    PubMed

    Kajiwara, Naohiro; Maehara, Sachio; Maeda, Junichi; Hagiwara, Masaru; Okano, Tetsuya; Kakihana, Masatoshi; Ohira, Tatsuo; Kawate, Norihiko; Ikeda, Norihiko

    2018-01-01

    In patients with bronchial tumors, we frequently consider endoscopic treatment as the first treatment of choice. All computed tomography (CT) images must satisfy several conditions to be analyzable by Synapse Vincent. To select safer and more precise approaches for patients with bronchial tumors, we determined the indications and efficacy of virtual navigation intervention for the treatment of bronchial tumors. We examined the efficacy of virtual navigation bronchial intervention for tumors located at a variety of sites in the tracheobronchial tree using a high-speed three-dimensional (3D) image analysis system, Synapse Vincent. In two cases, the constructed images were utilized to plan the simulation and interventional strategy as well as for navigation during interventional manipulation. Synapse Vincent was used to determine the optimal planning of virtual navigation bronchial intervention. The system can detect tumor location and also depict surrounding tissues quickly, accurately, and safely. Synapse Vincent enables useful preoperative simulation and navigation of surgical procedures, leading to safer, more precise, and less invasive treatment for the patient, and an image can be constructed easily, depending on the purpose, in 5-10 minutes. Moreover, if the lesion is in the parenchyma or sub-bronchial lumen, the system supports simulation with virtual skeletal subtraction to estimate potential lesion movement. Using the virtual navigation system for simulation, bronchial intervention was performed safely and precisely, with no complications. Preoperative simulation using virtual navigation bronchial intervention reduces the surgeon's stress levels, particularly when highly skilled techniques are needed to operate on lesions. This approach, including both preoperative simulation and intraoperative navigation, leads to greater safety and precision. These technological instruments are helpful for bronchial intervention procedures, and are also excellent devices for educational training.

  17. Macro-Scale Reactive Flow Model for High-Explosive Detonation in Support of ASCI Weapon Safety Milepost

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reaugh, J E

    2002-01-03

    Explosive grain-scale simulations are not practical for weapon safety simulations. Indeed, for nearly ideal explosives with reaction zones of order 500 µm, even reactive flow models are not practical for weapon safety simulations: by design, reactive flow models must resolve the reaction zone, which implies computational cells with dimension of order 50 µm for such explosives. The desired result for a simulation in which the reaction zone is not resolved is that the explosive behaves as an ideal one. The pressure at the shock front rises to the Chapman-Jouguet (CJ) pressure over a reaction zone dimension like that of a shock propagating in an unreactive medium, on the order of a few computational cells, and the detonation propagates with the velocity determined by the equation of state of the products. In the past, this was achieved in one-dimensional simulations with "beta-burn", a method in which the extent of conversion to final product is proportional to the approach of the specific volume in the shock front to the specific volume of the CJ state. One drawback of this method is a relatively long build-up to steady detonation, typically 50 to 100 computational cells. The need for relatively coarsely zoned simulations in two dimensions led to "program-burn", in which the time to detonation is determined by a simple ray-tracing algorithm when there are no barriers or shadows. Complications arise in two and three dimensions, to the extent that some calculations of the lighting time in complex geometry can give incorrect results. We sought to develop a model based on reactive flow that might meet the needs of the Weapon Safety Simulation milepost. Important features of the model are: (1) that it be usable with any equation-of-state description of the explosive product gases, including both JWL and LEOS table forms; and (2) that it exhibit the desired dependence on zone size. We believe that the model described here exhibits these features.
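
    A minimal sketch of the two burn rules described above, with illustrative specific volumes (not taken from the report):

    ```python
    # Sketch of the "beta-burn" rule: the reacted mass fraction grows in
    # proportion to how far the specific volume in the shock front has
    # compressed toward the CJ specific volume. Values for v0 and v_cj
    # below are illustrative, not from the report.
    def beta_burn_fraction(v, v0, v_cj):
        """Reacted fraction lambda for current specific volume v."""
        lam = (v0 - v) / (v0 - v_cj)
        return min(max(lam, 0.0), 1.0)   # clamp to [0, 1]

    # "Program-burn" lighting time by simple ray tracing: with no barriers
    # or shadows, a cell at distance d from the detonation point lights at
    # t = d / D, where D is the detonation velocity.
    def lighting_time(d, detonation_velocity):
        return d / detonation_velocity

    v0, v_cj = 1.0 / 1.84, 0.74 / 1.84   # unreacted and CJ specific volumes (cm^3/g)
    for v in (v0, 0.9 * v0, v_cj):
        print(f"v = {v:.3f} -> lambda = {beta_burn_fraction(v, v0, v_cj):.2f}")
    ```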

  18. Numerical Simulations of Spacecraft Charging: Selected Applications

    NASA Astrophysics Data System (ADS)

    Moulton, J. D.; Delzanno, G. L.; Meierbachtol, C.; Svyatskiy, D.; Vernon, L.; Borovsky, J.; Thomsen, M. F.

    2016-12-01

    The electrical charging of spacecraft by bombarding charged particles affects their performance and operation. We study this charging using CPIC, a particle-in-cell code specifically designed for studying plasma-material interactions. CPIC is based on multi-block curvilinear meshes, resulting in near-optimal computational performance while maintaining geometric accuracy, and is interfaced to a mesh generator that creates a computational mesh conforming to complex objects such as a spacecraft. Relevant plasma parameters can be imported from the SHIELDS framework (currently under development at LANL), which simulates geomagnetic storms and substorms in the Earth's magnetosphere. Selected physics results are presented, together with an overview of the code. First, spacecraft-charging simulations with geometry representative of the Van Allen Probes spacecraft focus on the conditions that can lead to significant spacecraft charging events. Second, results are presented from a recent study of the conditions under which a high-power (>keV) electron beam could be emitted from a magnetospheric spacecraft. The latter study proposes a charging-mitigation strategy based on plasma contactor technology that might allow beam experiments to operate in the low-density magnetosphere. High-power electron beams could be used, for instance, to establish magnetic-field-line connectivity between the ionosphere and magnetosphere and help solve long-standing questions in ionospheric/magnetospheric physics.
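
    The equilibrium behind such charging events can be illustrated by a simple current-balance calculation: the surface floats to the potential at which collected electron and ion currents cancel. This scalar sketch is a standard textbook picture, not the CPIC particle-in-cell code itself, and the plasma parameters are illustrative:

    ```python
    # Hedged sketch of spacecraft floating-potential estimation by current
    # balance for a negatively charged surface in a Maxwellian plasma.
    # Parameters (density, temperatures) are illustrative, storm-like values.
    import numpy as np
    from scipy.optimize import brentq

    e, me, mi, kB = 1.602e-19, 9.109e-31, 1.673e-27, 1.381e-23
    n, Te, Ti = 1e6, 1e7, 1e7   # density (m^-3), electron/ion temperature (K)

    def net_current(phi):
        """Net collected current per unit area at surface potential phi (V), phi < 0."""
        v_e = np.sqrt(kB * Te / (2 * np.pi * me))          # electron thermal flux speed
        v_i = np.sqrt(kB * Ti / (2 * np.pi * mi))
        I_e = -e * n * v_e * np.exp(e * phi / (kB * Te))   # repelled electrons
        I_i = e * n * v_i * (1 - e * phi / (kB * Ti))      # attracted ions (OML, sphere)
        return I_e + I_i

    # Floating potential: root of the current balance
    phi_f = brentq(net_current, -50e3, -0.1)
    print(f"floating potential ~ {phi_f:.0f} V")
    ```

    Emitting a high-power electron beam adds a large negative term to this balance, which is why an uncompensated spacecraft in a tenuous magnetosphere charges strongly and why a plasma contactor (extra current collection) can mitigate it.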

  19. A DFT-Based Computational-Experimental Methodology for Synthetic Chemistry: Example of Application to the Catalytic Opening of Epoxides by Titanocene.

    PubMed

    Jaraíz, Martín; Enríquez, Lourdes; Pinacho, Ruth; Rubio, José E; Lesarri, Alberto; López-Pérez, José L

    2017-04-07

    A novel DFT-based reaction kinetics (DFT-RK) simulation approach, employed in combination with real-time data from reaction-monitoring instrumentation (such as UV-vis, FTIR, Raman, and 2D NMR benchtop spectrometers), is shown to provide a detailed methodology for the analysis and design of complex synthetic chemistry schemes. As an example, it is applied to the opening of epoxides by titanocene in THF, a catalytic system with abundant experimental data available. Through a DFT-RK analysis of real-time IR data, we have developed a comprehensive mechanistic model that opens new perspectives on previous experiments. Although derived specifically from the opening of epoxides, the predictive capability of the model, built on elementary reactions, together with its practical side (reaction kinetics simulations of real experimental conditions), makes it a useful simulation tool for the design of new experiments, as well as for the conception and development of improved versions of the reagents. Because both the computational (DFT-RK) and experimental (spectroscopic) components can follow the time evolution of several species simultaneously, the methodology is expected to provide a helpful tool for the study of complex systems in synthetic chemistry.
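
    A minimal sketch of the DFT-RK workflow: Eyring-equation rate constants derived from assumed DFT activation free energies feed a kinetic ODE system whose solution would be compared against time-resolved spectroscopic traces. The two-step scheme and barrier values below are illustrative assumptions, not the actual titanocene mechanism of the paper:

    ```python
    # Hedged sketch of DFT-based reaction kinetics: barriers -> Eyring rate
    # constants -> kinetic ODEs integrated over the experiment timescale.
    # The scheme (epoxide + catalyst -> intermediate -> product + catalyst)
    # and the barrier values are illustrative only.
    import numpy as np
    from scipy.integrate import solve_ivp

    kB_h, R, T = 2.084e10, 8.314, 298.15   # kB/h (1/(s*K)), gas constant, temperature

    def eyring(dG_kJ):
        """Rate constant from an activation free energy (kJ/mol)."""
        return kB_h * T * np.exp(-dG_kJ * 1000 / (R * T))

    k1 = eyring(70.0)   # epoxide + catalyst -> intermediate
    k2 = eyring(65.0)   # intermediate -> product + catalyst

    def rhs(t, y):
        epox, cat, inter, prod = y
        r1 = k1 * epox * cat
        r2 = k2 * inter
        return [-r1, -r1 + r2, r1 - r2, r2]

    # 0.10 M epoxide, 0.01 M catalyst, integrated over one hour
    sol = solve_ivp(rhs, (0, 3600), [0.10, 0.01, 0.0, 0.0], method="LSODA")
    print(f"product after 1 h: {sol.y[3, -1]:.4f} M")
    ```

    In practice, the barrier heights (or rate constants) would be refined until the simulated concentration profiles reproduce the real-time IR traces, which is the fitting loop the abstract describes.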

  20. Computational modelling of the piglet brain to simulate near-infrared spectroscopy and magnetic resonance spectroscopy data collected during oxygen deprivation.

    PubMed

    Moroz, Tracy; Banaji, Murad; Robertson, Nicola J; Cooper, Chris E; Tachtsidis, Ilias

    2012-07-07

    We describe a computational model to simulate measurements from near-infrared spectroscopy (NIRS) and magnetic resonance spectroscopy (MRS) in the piglet brain. Piglets are often subjected to anoxic, hypoxic and ischaemic insults, as experimental models for human neonates. The model aims to help interpret measurements and increase understanding of physiological processes occurring during such insults. It is an extension of a previous model of circulation and mitochondrial metabolism. This was developed to predict NIRS measurements in the brains of healthy adults, i.e. concentration changes of oxyhaemoglobin and deoxyhaemoglobin and redox state changes of cytochrome c oxidase (CCO). We altered and enhanced the model to apply to the anaesthetized piglet brain. It now includes metabolites measured by ³¹P-MRS, namely phosphocreatine, inorganic phosphate and adenosine triphosphate (ATP). It also includes simple descriptions of glycolysis, lactate dynamics and the tricarboxylic acid (TCA) cycle. The model is described, and its simulations compared with existing measurements from piglets during anoxia. The NIRS and MRS measurements are predicted well, although this requires a reduction in blood pressure autoregulation. Predictions of the cerebral metabolic rate of oxygen consumption (CMRO₂) and lactate concentration, which were not measured, are given. Finally, the model is used to investigate hypotheses regarding changes in CCO redox state during anoxia.
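
    As a toy illustration of the modelling approach (far simpler than the published piglet model), the following one-compartment oxygen balance produces NIRS-like oxy-/deoxy-haemoglobin responses to a simulated anoxic insult; all parameters are illustrative assumptions:

    ```python
    # Toy one-compartment oxygen balance: an anoxic insult (arterial
    # saturation drop) drives NIRS-like changes in oxy- and deoxy-Hb.
    # Far simpler than the published model; parameters are illustrative.
    import numpy as np
    from scipy.integrate import solve_ivp

    CBF, CMRO2, Hb_tot = 0.5, 0.03, 0.1   # flow, O2 consumption, total Hb (arbitrary units)

    def sa_o2(t):
        """Arterial O2 saturation: normal, then anoxia between t=60 s and t=180 s."""
        return 0.15 if 60 <= t <= 180 else 0.97

    def rhs(t, y):
        (hbo2,) = y
        delivery = CBF * (sa_o2(t) * Hb_tot - hbo2)   # convective O2 exchange
        return [delivery - CMRO2 * hbo2 / Hb_tot]     # minus metabolic consumption

    sol = solve_ivp(rhs, (0, 300), [0.9 * Hb_tot], max_step=1.0)
    hbo2 = sol.y[0]
    hhb = Hb_tot - hbo2                               # deoxy-Hb mirrors oxy-Hb
    print(f"HbO2 at nadir: {hbo2.min():.3f}, HHb peak: {hhb.max():.3f}")
    ```

    The published model couples many such balances (blood flow, mitochondrial redox states, ³¹P metabolites) so that one parameter set must reproduce NIRS and MRS traces simultaneously.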
