Sample records for computational science simulations

  1. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  2. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.
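
    A generic illustration (not from the paper) of the kind of spatial principle the abstract describes: partitioning geographically dispersed data by grid cell so that nearby records land on the same worker, reducing cross-node communication in a distributed analysis. All names and numbers below are hypothetical; the sketch is in Python.

        # Hypothetical sketch: assign (lon, lat) data points to distributed workers
        # by grid cell so that spatially adjacent points stay on the same node --
        # one simple reading of "using spatial principles to optimize distributed
        # computing". Not the authors' system.
        from collections import defaultdict

        def partition_by_grid(points, cell_size, n_workers):
            """Group (lon, lat) points into grid cells, then map whole cells to workers."""
            cells = defaultdict(list)
            for lon, lat in points:
                cell = (int(lon // cell_size), int(lat // cell_size))
                cells[cell].append((lon, lat))
            assignment = defaultdict(list)
            # Distribute whole cells (not individual points) so each worker holds
            # spatially contiguous blocks of data.
            for i, (cell, members) in enumerate(sorted(cells.items())):
                assignment[i % n_workers].extend(members)
            return assignment

        pts = [(-122.3, 47.6), (-122.4, 47.5), (-77.0, 38.9), (-77.1, 38.8)]
        for worker, data in partition_by_grid(pts, cell_size=1.0, n_workers=2).items():
            print(f"worker {worker}: {data}")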

  3. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  4. New Pedagogies on Teaching Science with Computer Simulations

    ERIC Educational Resources Information Center

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

  5. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    NASA Astrophysics Data System (ADS)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  6. The Learning Effects of Computer Simulations in Science Education

    ERIC Educational Resources Information Center

    Rutten, Nico; van Joolingen, Wouter R.; van der Veen, Jan T.

    2012-01-01

    This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to improve learning processes and outcomes. We report on…

  7. Reconsidering Simulations in Science Education at a Distance: Features of Effective Use

    ERIC Educational Resources Information Center

    Blake, C.; Scanlon, E.

    2007-01-01

    This paper proposes a reconsideration of use of computer simulations in science education. We discuss three studies of the use of science simulations for undergraduate distance learning students. The first one, "The Driven Pendulum" simulation is a computer-based experiment on the behaviour of a pendulum. The second simulation, "Evolve" is…

  8. Approaches to Classroom-Based Computational Science.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…

  9. Instructional Support and Implementation Structure during Elementary Teachers' Science Education Simulation Use

    ERIC Educational Resources Information Center

    Gonczi, Amanda L.; Chiu, Jennifer L.; Maeng, Jennifer L.; Bell, Randy L.

    2016-01-01

    This investigation sought to identify patterns in elementary science teachers' computer simulation use, particularly implementation structures and instructional supports commonly employed by teachers. Data included video-recorded science lessons of 96 elementary teachers who used computer simulations in one or more science lessons. Results…

  10. Computer Simulations to Support Science Instruction and Learning: A Critical Review of the Literature

    ERIC Educational Resources Information Center

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-01-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is…

  11. The Effect of Teacher Involvement on Student Performance in a Computer-Based Science Simulation.

    ERIC Educational Resources Information Center

    Waugh, Michael L.

    Designed to investigate whether or not science teachers can positively influence student achievement in, and attitude toward, science, this study focused on a specific teaching strategy and utilization of a computer-based simulation. The software package used in the study was the simulation, Volcanoes, by Earthware Computer Services. The sample…

  12. Introducing Molecular Life Science Students to Model Building Using Computer Simulations

    ERIC Educational Resources Information Center

    Aegerter-Wilmsen, Tinri; Kettenis, Dik; Sessink, Olivier; Hartog, Rob; Bisseling, Ton; Janssen, Fred

    2006-01-01

    Computer simulations can facilitate the building of models of natural phenomena in research, such as in the molecular life sciences. In order to introduce molecular life science students to the use of computer simulations for model building, a digital case was developed in which students build a model of a pattern formation process in…

  13. Learning Oceanography from a Computer Simulation Compared with Direct Experience at Sea

    ERIC Educational Resources Information Center

    Winn, William; Stahr, Frederick; Sarason, Christian; Fruland, Ruth; Oppenheimer, Peter; Lee, Yen-Ling

    2006-01-01

    Considerable research has compared how students learn science from computer simulations with how they learn from "traditional" classes. Little research has compared how students learn science from computer simulations with how they learn from direct experience in the real environment on which the simulations are based. This study compared two…

  14. Computer Simulation in Social Science.

    ERIC Educational Resources Information Center

    Garson, G. David

    From a base in military models, computer simulation has evolved to provide a wide variety of applications in social science. General purpose simulation packages and languages such as FIRM, DYNAMO, and others have made significant contributions toward policy discussion in the social sciences and have well-documented efficacy in instructional…

  15. Epistemological Issues Concerning Computer Simulations in Science and Their Implications for Science Education

    ERIC Educational Resources Information Center

    Greca, Ileana M.; Seoane, Eugenia; Arriassecq, Irene

    2014-01-01

    Computers and simulations represent an undeniable aspect of daily scientific life, the use of simulations being comparable to the introduction of the microscope and the telescope, in the development of knowledge. In science education, simulations have been proposed for over three decades as useful tools to improve the conceptual understanding of…

  16. All Roads Lead to Computing: Making, Participatory Simulations, and Social Computing as Pathways to Computer Science

    ERIC Educational Resources Information Center

    Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri

    2017-01-01

    Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…

  17. Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop: August 4-5, 2015, Washington, D.C.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael A.; Boldyrev, Stanislav; Fischer, Paul

    This report details the impact that exascale computing will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the DOE applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  18. A meta-analysis of outcomes from the use of computer-simulated experiments in science education

    NASA Astrophysics Data System (ADS)

    Lejeune, John Van

    The purpose of this study was to synthesize the findings from existing research on the effects of computer-simulated experiments on students in science education. Results from 40 reports were integrated by the process of meta-analysis to examine the effect of computer-simulated experiments and interactive videodisc simulations on student achievement and attitudes. Findings indicated significant positive differences in both low-level and high-level achievement of students who used computer-simulated experiments and interactive videodisc simulations as compared to students who used more traditional learning activities. No significant differences in retention, student attitudes toward the subject, or toward the educational method were found. Based on the findings of this study, computer-simulated experiments and interactive videodisc simulations should be used to enhance students' learning in science, especially in cases where traditional laboratory activities are expensive, dangerous, or impractical.
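
    For readers unfamiliar with the procedure behind such a synthesis, a minimal fixed-effect meta-analysis can be sketched in a few lines. This is not the author's analysis, and the effect sizes below are invented purely for illustration.

        # Minimal fixed-effect meta-analysis sketch: combine per-study standardized
        # mean differences (Cohen's d) with inverse-variance weights. The numbers
        # are invented; they are not data from the study above.
        import math

        studies = [  # (effect size d, variance of d)
            (0.45, 0.020),
            (0.30, 0.015),
            (0.60, 0.050),
        ]

        weights = [1.0 / var for _, var in studies]
        pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))
        print(f"pooled effect = {pooled:.3f} +/- {1.96 * se:.3f} (95% CI)")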

  19. On the "Exchangeability" of Hands-On and Computer-Simulated Science Performance Assessments. CSE Technical Report.

    ERIC Educational Resources Information Center

    Rosenquist, Anders; Shavelson, Richard J.; Ruiz-Primo, Maria Araceli

    Inconsistencies in scores from computer-simulated and "hands-on" science performance assessments have led to questions about the exchangeability of these two methods in spite of the highly touted potential of computer-simulated performance assessment. This investigation considered possible explanations for students' inconsistent performances: (1)…

  20. Using a Computer Simulation To Teach Science Process Skills to College Biology and Elementary Education Majors.

    ERIC Educational Resources Information Center

    Lee, Aimee T.; Hairston, Rosalina V.; Thames, Rachel; Lawrence, Tonya; Herron, Sherry S.

    2002-01-01

    Describes the Lateblight computer simulation implemented in the general biology laboratory and science methods course for elementary teachers to reinforce the processes of science and allow students to engage, explore, explain, elaborate, and evaluate the methods of building concepts in science. (Author/KHR)

  1. The Development and Evaluation of a Computer-Simulated Science Inquiry Environment Using Gamified Elements

    ERIC Educational Resources Information Center

    Tsai, Fu-Hsing

    2018-01-01

    This study developed a computer-simulated science inquiry environment, called the Science Detective Squad, to engage students in investigating an electricity problem that may happen in daily life. The environment combined the simulation of scientific instruments and a virtual environment, including gamified elements, such as points and a story for…

  2. The Role of Crop Systems Simulation in Agriculture and Environment

    USDA-ARS?s Scientific Manuscript database

    Over the past 30 to 40 years, simulation of crop systems has advanced from a neophyte science with inadequate computing power into a robust and increasingly accepted science supported by improved software, languages, development tools, and computer capabilities. Crop system simulators contain mathe...

  3. Computational Science and Innovation

    NASA Astrophysics Data System (ADS)

    Dean, D. J.

    2011-09-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  4. Outcomes from the DOE Workshop on Turbulent Flow Simulation at the Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael; Boldyrev, Stanislav; Chang, Choong-Seock

    This paper summarizes the outcomes from the Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop, which was held 4-5 August 2015, and was sponsored by the U.S. Department of Energy Office of Advanced Scientific Computing Research. The workshop objective was to define and describe the challenges and opportunities that computing at the exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the U.S. Department of Energy applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  5. Computer Simulations of Quantum Theory of Hydrogen Atom for Natural Science Education Students in a Virtual Lab

    ERIC Educational Resources Information Center

    Singh, Gurmukh

    2012-01-01

    The present article is primarily targeted for the advanced college/university undergraduate students of chemistry/physics education, computational physics/chemistry, and computer science. The most recent software system such as MS Visual Studio .NET version 2010 is employed to perform computer simulations for modeling Bohr's quantum theory of…
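
    The physics behind such a simulation is compact enough to sketch directly. The example below (Python rather than the article's Visual Studio .NET environment) evaluates Bohr energy levels, E_n = -13.6057 eV / n^2, and the photon wavelength for a transition; it is a generic illustration, not the article's code.

        # Bohr model of hydrogen: E_n = -13.6057 eV / n^2; a transition from n_upper
        # to n_lower emits a photon whose wavelength follows from the energy gap.
        H = 6.62607015e-34    # Planck constant, J*s
        C = 2.99792458e8      # speed of light, m/s
        EV = 1.602176634e-19  # joules per electron volt

        def energy_ev(n):
            return -13.6057 / n**2

        def transition_wavelength_nm(n_upper, n_lower):
            delta_e = (energy_ev(n_upper) - energy_ev(n_lower)) * EV  # joules
            return H * C / delta_e * 1e9

        # Balmer-alpha (3 -> 2) should come out near 656 nm.
        print(f"{transition_wavelength_nm(3, 2):.1f} nm")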

  6. Employing Inquiry-Based Computer Simulations and Embedded Scientist Videos to Teach Challenging Climate Change and Nature of Science Concepts

    ERIC Educational Resources Information Center

    Cohen, Edward Charles

    2013-01-01

    Design based research was utilized to investigate how students use a greenhouse effect simulation in order to derive best learning practices. During this process, students recognized the authentic scientific process involving computer simulations. The simulation used is embedded within an inquiry-based technology-mediated science curriculum known…

  7. Instructional support and implementation structure during elementary teachers' science education simulation use

    NASA Astrophysics Data System (ADS)

    Gonczi, Amanda L.; Chiu, Jennifer L.; Maeng, Jennifer L.; Bell, Randy L.

    2016-07-01

    This investigation sought to identify patterns in elementary science teachers' computer simulation use, particularly implementation structures and instructional supports commonly employed by teachers. Data included video-recorded science lessons of 96 elementary teachers who used computer simulations in one or more science lessons. Results indicated teachers used a one-to-one student-to-computer ratio most often either during class-wide individual computer use or during a rotating station structure. Worksheets, general support, and peer collaboration were the most common forms of instructional support. The least common instructional support forms included lesson pacing, initial play, and a closure discussion. Students' simulation use was supported in the fewest ways during a rotating station structure. Results suggest that simulation professional development with elementary teachers needs to explicitly focus on implementation structures and instructional support to enhance participants' pedagogical knowledge and improve instructional simulation use. In addition, research is needed to provide theoretical explanations for the observed patterns that should subsequently be addressed in supporting teachers' instructional simulation use during professional development or in teacher preparation programs.

  8. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

    ERIC Educational Resources Information Center

    Bronson, Richard

    1986-01-01

    Suggests the use of computer simulation of continuous systems as a problem solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)
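
    DYNAMO expresses continuous systems as levels (stocks) advanced by rates over a fixed time step. Below is a hedged Python sketch of the same idea, using an invented population model rather than anything from the article.

        # Stock-and-flow integration in the spirit of DYNAMO's level equations
        # (L.K = L.J + DT * R.JK): a level is advanced by its net rate each step.
        def simulate(level=100.0, birth_frac=0.04, death_frac=0.02, dt=0.25, years=50):
            history = []
            for step in range(int(years / dt)):
                rate = (birth_frac - death_frac) * level  # net flow per year
                level += rate * dt
                history.append((step * dt, level))
            return history

        for t, pop in simulate()[::40]:
            print(f"t={t:5.1f}  population={pop:8.1f}")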

  9. The Impact of Interactive Computer Simulations on the Nature and Quality of Postgraduate Science Teachers' Explanations in Physics

    ERIC Educational Resources Information Center

    Zacharia, Zacharias C.

    2005-01-01

    This study investigated how individuals' construction of explanations--a way of ascertaining how well an individual understands a concept--develops from an interactive simulation. Specifically, the purpose was to investigate the effect of interactive computer simulations or science textbook assignments on the nature and quality of postgraduate…

  10. Enhancing Computer Science Education with a Wireless Intelligent Simulation Environment

    ERIC Educational Resources Information Center

    Cook, Diane J.; Huber, Manfred; Yerraballi, Ramesh; Holder, Lawrence B.

    2004-01-01

    The goal of this project is to develop a unique simulation environment that can be used to increase students' interest and expertise in Computer Science curriculum. Hands-on experience with physical or simulated equipment is an essential ingredient for learning, but many approaches to training develop a separate piece of equipment or software for…

  11. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    NASA Astrophysics Data System (ADS)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  12. Trends in Social Science: The Impact of Computational and Simulative Models

    NASA Astrophysics Data System (ADS)

    Conte, Rosaria; Paolucci, Mario; Cecconi, Federico

    This paper discusses current progress in the computational social sciences. Specifically, it examines the following questions: Are the computational social sciences exhibiting positive or negative developments? What are the roles of agent-based models and simulation (ABM), network analysis, and other "computational" methods within this dynamic? (Conte, The necessity of intelligent agents in social simulation, Advances in Complex Systems, 3(01n04), 19-38, 2000; Conte 2010; Macy, Annual Review of Sociology, 143-166, 2002). Are there objective indicators of scientific growth that can be applied to different scientific areas, allowing for comparison among them? In this paper, some answers to these questions are presented and discussed. In particular, comparisons among different disciplines in the social and computational sciences are shown, taking into account their respective growth trends in the number of publication citations over the last few decades (culled from Google Scholar). After a short discussion of the methodology adopted, results of keyword-based queries are presented, unveiling some unexpected local impacts of simulation on the takeoff of traditionally poorly productive disciplines.

  13. LLNL Mercury Project Trinity Open Science Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, Shawn A.

    The Mercury Monte Carlo particle transport code is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. In the proposed Trinity Open Science calculations, I will investigate computer science aspects of the code which are relevant to convergence of the simulation quantities with increasing Monte Carlo particle counts.
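
    The convergence question the record raises (how simulated quantities settle as Monte Carlo particle counts increase) can be illustrated generically. The sketch below is not the Mercury code; it estimates a simple slab transmission probability and shows the statistical error shrinking roughly as 1/sqrt(N).

        # Generic Monte Carlo convergence illustration (not the Mercury code):
        # sample a slab transmission probability and compare against the analytic
        # value exp(-tau) as the particle count grows.
        import math
        import random

        def transmission_estimate(n_particles, optical_depth=2.0, seed=0):
            rng = random.Random(seed)
            p_escape = math.exp(-optical_depth)  # analytic escape probability
            hits = sum(rng.random() < p_escape for _ in range(n_particles))
            return hits / n_particles

        exact = math.exp(-2.0)
        for n in (10**3, 10**4, 10**5, 10**6):
            est = transmission_estimate(n)
            print(f"N={n:>8}  estimate={est:.5f}  |error|={abs(est - exact):.5f}")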

  14. Creating Science Simulations through Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Basawapatna, Ashok Ram

    2012-01-01

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…

  15. A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software

    NASA Astrophysics Data System (ADS)

    Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.

    2017-10-01

    Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.

  16. Computer-simulated laboratory explorations for middle school life, earth, and physical Science

    NASA Astrophysics Data System (ADS)

    von Blum, Ruth

    1992-06-01

    Explorations in Middle School Science is a set of 72 computer-simulated laboratory lessons in life, earth, and physical science for grades 6-9 developed by Jostens Learning Corporation with grants from the California State Department of Education and the National Science Foundation. At the heart of each lesson is a computer-simulated laboratory that actively involves students in doing science, improving their: (1) understanding of science concepts by applying critical thinking to solve real problems; (2) skills in scientific processes and communications; and (3) attitudes about science. Students use on-line tools (notebook, calculator, word processor) to undertake in-depth investigations of phenomena (like motion in outer space, disease transmission, volcanic eruptions, or the structure of the atom) that would be too difficult, dangerous, or outright impossible to do in a “live” laboratory. Suggested extension activities lead students to hands-on investigations, away from the computer. This article presents the underlying rationale, instructional model, and process by which Explorations was designed and developed. It also describes the general courseware structure and three lessons in detail, as well as presenting preliminary data from the evaluation. Finally, it suggests a model for incorporating technology into the science classroom.

  17. Designing Online Scaffolds for Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Chen, Ching-Huei; Wu, I-Chia; Jen, Fen-Lan

    2013-01-01

    The purpose of this study was to examine the effectiveness of online scaffolds in computer simulation to facilitate students' science learning. We first introduced online scaffolds to assist and model students' science learning and to demonstrate how a system embedded with online scaffolds can be designed and implemented to help high school…

  18. High performance computing for advanced modeling and simulation of materials

    NASA Astrophysics Data System (ADS)

    Wang, Jue; Gao, Fei; Vazquez-Poletti, Jose Luis; Li, Jianjiang

    2017-02-01

    The First International Workshop on High Performance Computing for Advanced Modeling and Simulation of Materials (HPCMS2015) was held in Austin, Texas, USA, Nov. 18, 2015. HPCMS 2015 was organized by Computer Network Information Center (Chinese Academy of Sciences), University of Michigan, Universidad Complutense de Madrid, University of Science and Technology Beijing, Pittsburgh Supercomputing Center, China Institute of Atomic Energy, and Ames Laboratory.

  19. Virtual Transgenics: Using a Molecular Biology Simulation to Impact Student Academic Achievement and Attitudes

    NASA Astrophysics Data System (ADS)

    Shegog, Ross; Lazarus, Melanie M.; Murray, Nancy G.; Diamond, Pamela M.; Sessions, Nathalie; Zsigmond, Eva

    2012-10-01

    The transgenic mouse model is useful for studying the causes and potential cures for human genetic diseases. Exposing high school biology students to laboratory experience in developing transgenic animal models is logistically prohibitive. Computer-based simulation, however, offers this potential in addition to advantages of fidelity and reach. This study describes and evaluates a computer-based simulation that trains advanced placement high school science students in the laboratory protocols by which a transgenic mouse model is produced. A simulation module on preparing a gene construct in the molecular biology lab was evaluated using a randomized clinical control design with advanced placement high school biology students in Mercedes, Texas (n = 44). Pre-post tests assessed procedural and declarative knowledge, time on task, and attitudes toward computers for learning and toward science careers. Students who used the simulation increased their procedural and declarative knowledge regarding molecular biology compared to those in the control condition (both p < 0.005). Significant increases continued to occur with additional use of the simulation (p < 0.001). Students in the treatment group became more positive toward using computers for learning (p < 0.001). The simulation did not significantly affect attitudes toward science in general. Computer simulations of complex transgenic protocols have the potential to provide a "virtual" laboratory experience as an adjunct to conventional educational approaches.

  20. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

    Paper discusses computer simulation as a means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in the near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends that classrooms in universities, government, and industry be linked to advanced computing centers so that computer simulations are integrated into the education process.

  1. A Quantitative Model for Assessing Visual Simulation Software Architecture

    DTIC Science & Technology

    2011-09-01

  2. Using Microcomputers Simulations in the Classroom: Examples from Undergraduate and Faculty Computer Literacy Courses.

    ERIC Educational Resources Information Center

    Hart, Jeffrey A.

    1985-01-01

    Presents a discussion of how computer simulations are used in two undergraduate social science courses and a faculty computer literacy course on simulations and artificial intelligence. Includes a list of 60 simulations for use on mainframes and microcomputers. Entries include type of hardware required, publisher's address, and cost. Sample…

  3. Metaconceptually-Enhanced Simulation-Based Inquiry: Effects on Eighth Grade Students' Conceptual Change and Science Epistemic Beliefs

    ERIC Educational Resources Information Center

    Huang, Kun; Ge, Xun; Eseryel, Deniz

    2017-01-01

    This study investigated the effects of metaconceptually-enhanced, simulation-based inquiry learning on eighth grade students' conceptual change in science and their development of science epistemic beliefs. Two experimental groups studied the topics of motion and force using the same computer simulations but with different simulation guides: one…

  4. Computer simulations in the high school: students' cognitive stages, science process skills and academic achievement in microbiology

    NASA Astrophysics Data System (ADS)

    Huppert, J.; Michal Lomask, S.; Lazarowitz, R.

    2002-08-01

    Computer-assisted learning, including simulated experiments, has great potential to address the problem solving process which is a complex activity. It requires a highly structured approach in order to understand the use of simulations as an instructional device. This study is based on a computer simulation program, 'The Growth Curve of Microorganisms', which required tenth grade biology students to use problem solving skills whilst simultaneously manipulating three independent variables in one simulated experiment. The aims were to investigate the computer simulation's impact on students' academic achievement and on their mastery of science process skills in relation to their cognitive stages. The results indicate that the concrete and transition operational students in the experimental group achieved significantly higher academic achievement than their counterparts in the control group. The higher the cognitive operational stage, the higher students' achievement was, except in the control group where students in the concrete and transition operational stages did not differ. Girls achieved equally with the boys in the experimental group. Students' academic achievement may indicate the potential impact a computer simulation program can have, enabling students with low reasoning abilities to cope successfully with learning concepts and principles in science which require high cognitive skills.
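
    A minimal sketch of the kind of model such a lesson simulates: logistic growth producing the familiar microbial growth curve. The abstract does not name the three manipulated variables, so the parameters below (initial population, growth rate, carrying capacity) are illustrative, not the classroom program's.

        # Illustrative logistic growth-curve model (not the classroom software):
        # the population grows exponentially at first, then saturates at the
        # carrying capacity.
        def growth_curve(n0=1e3, rate=0.8, capacity=1e8, dt=0.1, hours=24):
            n, out = n0, []
            for step in range(int(hours / dt)):
                n += rate * n * (1.0 - n / capacity) * dt
                out.append((step * dt, n))
            return out

        for t, n in growth_curve()[::40]:
            print(f"t={t:5.1f} h   N={n:12.0f}")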

  5. VERAIn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan

    2015-02-16

    CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete a LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA Input into an XML file that is used as input to different VERA codes.
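
    The record describes VERAIn's role only at a high level. As a hedged illustration of the general pattern (a plain-text input deck serialized to XML with the Python standard library), the sketch below uses an invented key-value format; it is not the VERA common input schema.

        # Hypothetical input-deck-to-XML converter illustrating the general pattern.
        # The input format, tag names, and values are invented for this sketch.
        import xml.etree.ElementTree as ET

        def deck_to_xml(text, root_tag="case"):
            root = ET.Element(root_tag)
            for line in text.splitlines():
                line = line.split("!")[0].strip()  # drop comments and blank lines
                if not line:
                    continue
                key, _, value = line.partition(" ")
                ET.SubElement(root, key.strip()).text = value.strip()
            return ET.tostring(root, encoding="unicode")

        sample = """
        title  demo core
        power  3411.0   ! MWt
        cycles 3
        """
        print(deck_to_xml(sample))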

  6. A Primer on Simulation and Gaming.

    ERIC Educational Resources Information Center

    Barton, Richard F.

    In a primer intended for the administrative professions, for the behavioral sciences, and for education, simulation and its various aspects are defined, illustrated, and explained. Man-model simulation, man-computer simulation, all-computer simulation, and analysis are discussed as techniques for studying object systems (parts of the "real…

  7. Data Serving Climate Simulation Science at the NASA Center for Climate Simulation

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen M.

    2011-01-01

    The NASA Center for Climate Simulation (NCCS) provides high performance computational resources, a multi-petabyte archive, and data services in support of climate simulation research and other NASA-sponsored science. This talk describes the NCCS's data-centric architecture and processing, which are evolving in anticipation of researchers' growing requirements for higher resolution simulations and increased data sharing among NCCS users and the external science community.

  8. Computer Simulations: Inelegant Mathematics and Worse Social Science?

    ERIC Educational Resources Information Center

    Alker, Hayward R., Jr.

    1974-01-01

    Achievements, limitations, and difficulties of social science simulation efforts are discussed with particular reference to three examples. The pedagogical use of complementary developmental, philosophical, mathematical, and scientific approaches is advocated to minimize potential abuses of social simulation research. (LS)

  9. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    ERIC Educational Resources Information Center

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)
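
    As an illustration of the kind of fetch-decode-execute exercise such a simulator provides, here is a sketch of a tiny accumulator machine. The instruction set is invented for this example and is not CPU SIM's actual one.

        # Hypothetical accumulator-machine simulator: each instruction is a
        # (mnemonic, operand) pair; the loop fetches, decodes, and executes.
        def run(program, memory):
            acc, pc = 0, 0
            while pc < len(program):
                op, arg = program[pc]
                pc += 1
                if op == "LOAD":
                    acc = memory[arg]
                elif op == "ADD":
                    acc += memory[arg]
                elif op == "STORE":
                    memory[arg] = acc
                elif op == "JNZ" and acc != 0:  # jump if accumulator non-zero
                    pc = arg
                elif op == "HALT":
                    break
            return memory

        # Sum memory[0] + memory[1] into memory[2].
        mem = {0: 7, 1: 35, 2: 0}
        prog = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)]
        print(run(prog, mem)[2])  # -> 42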

  10. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  11. Simulation of flexible appendage interactions with Mariner Venus/Mercury attitude control and science platform pointing

    NASA Technical Reports Server (NTRS)

    Fleischer, G. E.

    1973-01-01

    A new computer subroutine, which solves the attitude equations of motion for any vehicle idealized as a topological tree of hinge-connected rigid bodies, is used to simulate and analyze science instrument pointing control interaction with a flexible Mariner Venus/Mercury (MVM) spacecraft. The subroutine's user options include linearized or partially linearized hinge-connected models whose computational advantages are demonstrated for the MVM problem. Results of the pointing control/flexible vehicle interaction simulations, including imaging experiment pointing accuracy predictions and implications for MVM science sequence planning, are described in detail.

  12. Scientific Visualization and Computational Science: Natural Partners

    NASA Technical Reports Server (NTRS)

    Uselton, Samuel P.; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Scientific visualization is developing rapidly, stimulated by computational science, which is gaining acceptance as a third alternative to theory and experiment. Computational science is based on numerical simulations of mathematical models derived from theory. But each individual simulation is like a hypothetical experiment; initial conditions are specified, and the result is a record of the observed conditions. Experiments can be simulated for situations that cannot really be created or controlled. Results impossible to measure can be computed. Even for observable values, computed samples are typically much denser. Numerical simulations also extend scientific exploration where the mathematics is analytically intractable. Numerical simulations are used to study phenomena from subatomic to intergalactic scales and from abstract mathematical structures to pragmatic engineering of everyday objects. But computational science methods would be almost useless without visualization. The obvious reason is that the huge amounts of data produced require the high bandwidth of the human visual system, and interactivity adds to the power. Visualization systems also provide a single context for all the activities involved, from debugging the simulations, to exploring the data, to communicating the results. Most of the presentations today have their roots in image processing, where the fundamental task is: Given an image, extract information about the scene. Visualization has developed from computer graphics, and the inverse task: Given a scene description, make an image. Visualization extends the graphics paradigm by expanding the possible input. The goal is still to produce images; the difficulty is that the input is not a scene description displayable by standard graphics methods. Visualization techniques must either transform the data into a scene description or extend graphics techniques to display this odd input. Computational science is a fertile field for visualization research because the results vary so widely and include things that have no known appearance. The amount of data creates additional challenges for both hardware and software systems. Evaluations of visualization should ultimately reflect the insight gained into the scientific phenomena. So making good visualizations requires consideration of characteristics of the user and the purpose of the visualization. Knowledge about human perception and graphic design is also relevant. It is this breadth of knowledge that stimulates proposals for multidisciplinary visualization teams and intelligent visualization assistant software. Visualization is an immature field, but computational science is stimulating research on a broad front.

  13. Computer Literacy for Life Sciences: Helping the Digital-Era Biology Undergraduates Face Today's Research

    ERIC Educational Resources Information Center

    Smolinski, Tomasz G.

    2010-01-01

    Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of…

  14. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    NASA Astrophysics Data System (ADS)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
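
    The central design idea, expressing the simulation specification directly in the host programming language rather than in a configuration file (approach iv above), can be sketched generically. Every class and field name below is invented for this sketch; this is not the authors' OOMMF/Python interface.

        # Generic sketch of a simulation specification embedded in Python. All
        # names are hypothetical and do not reflect the interface in the paper.
        from dataclasses import dataclass, field

        @dataclass
        class Mesh:
            cell: tuple = (5e-9, 5e-9, 5e-9)        # cell size in metres
            region: tuple = (500e-9, 125e-9, 3e-9)  # sample dimensions

        @dataclass
        class SimulationSpec:
            name: str
            mesh: Mesh = field(default_factory=Mesh)
            exchange_A: float = 1.3e-11             # exchange constant, J/m
            damping_alpha: float = 0.02
            applied_field: tuple = (0.0, 0.0, 0.0)  # A/m

            def describe(self):
                return f"{self.name}: region {self.mesh.region}, alpha={self.damping_alpha}"

        # Because the spec is ordinary Python, runs can be generated programmatically,
        # e.g. a damping sweep -- the flexibility the abstract argues for.
        sweep = [SimulationSpec(name=f"run-{i}", damping_alpha=a)
                 for i, a in enumerate((0.02, 0.1, 0.5))]
        for spec in sweep:
            print(spec.describe())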

  15. Scalable Game Design: A Strategy to Bring Systemic Computer Science Education to Schools through Game Design and Simulation Creation

    ERIC Educational Resources Information Center

    Repenning, Alexander; Webb, David C.; Koh, Kyu Han; Nickerson, Hilarie; Miller, Susan B.; Brand, Catharine; Her Many Horses, Ian; Basawapatna, Ashok; Gluck, Fred; Grover, Ryan; Gutierrez, Kris; Repenning, Nadia

    2015-01-01

    An educated citizenry that participates in and contributes to science technology engineering and mathematics innovation in the 21st century will require broad literacy and skills in computer science (CS). School systems will need to give increased attention to opportunities for students to engage in computational thinking and ways to promote a…

  16. Specification and Analysis of Parallel Machine Architecture

    DTIC Science & Technology

    1990-03-17

  17. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crabtree, George; Glotzer, Sharon; McCurdy, Bill

    This report is based on a SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness.
The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.

  18. Simulations in a Science and Society Course.

    ERIC Educational Resources Information Center

    Maier, Mark H.; Venanzi, Thomas

    1984-01-01

    Provides a course outline which includes simulation exercises designed as in-class activities related to science and society interactions. Simulations focus on the IQ debate, sociobiology, nuclear weapons and nuclear strategy, nuclear power and radiation, computer explosion, and cosmology. Indicates that learning improves when students take active…

  19. Spatial Learning and Computer Simulations in Science

    ERIC Educational Resources Information Center

    Lindgren, Robb; Schwartz, Daniel L.

    2009-01-01

    Interactive simulations are entering mainstream science education. Their effects on cognition and learning are often framed by the legacy of information processing, which emphasized amodal problem solving and conceptual organization. In contrast, this paper reviews simulations from the vantage of research on perception and spatial learning,…

  20. Optimizing Cognitive Load for Learning from Computer-Based Science Simulations

    ERIC Educational Resources Information Center

    Lee, Hyunjeong; Plass, Jan L.; Homer, Bruce D.

    2006-01-01

    How can cognitive load in visual displays of computer simulations be optimized? Middle-school chemistry students (N = 257) learned with a simulation of the ideal gas law. Visual complexity was manipulated by separating the display of the simulations in two screens (low complexity) or presenting all information on one screen (high complexity). The…

  1. Integrating Computational Science Tools into a Thermodynamics Course

    ERIC Educational Resources Information Center

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of…

  2. NASA Tech Briefs, August 2000. Volume 24, No. 8

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Topics include: Simulation/Virtual Reality; Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Medical Design.

  3. A Turing Machine Simulator.

    ERIC Educational Resources Information Center

    Navarro, Aaron B.

    1981-01-01

    Presents a program in Level II BASIC for a TRS-80 computer that simulates a Turing machine and discusses the nature of the device. The program is run interactively and is designed to be used as an educational tool by computer science or mathematics students studying computational or automata theory. (MP)
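
    For readers unfamiliar with the device, the core of a Turing machine simulator fits in a few lines. The sketch below is a minimal, hedged illustration in Python rather than the article's Level II BASIC, and the example "flip" machine is hypothetical, not taken from the original program.

        # Minimal Turing machine simulator sketch (illustrative only; not the
        # TRS-80 BASIC program described above). The transition table and the
        # example machine are hypothetical.

        def run_turing_machine(transitions, tape, state="q0", head=0, max_steps=1000):
            """Run until no rule applies or max_steps is exceeded."""
            tape = dict(enumerate(tape))          # sparse tape: position -> symbol
            for _ in range(max_steps):
                symbol = tape.get(head, "_")      # "_" is the blank symbol
                if (state, symbol) not in transitions:
                    break                         # no rule: halt
                state, new_symbol, move = transitions[(state, symbol)]
                tape[head] = new_symbol
                head += 1 if move == "R" else -1
            return state, "".join(tape[i] for i in sorted(tape))

        # Example machine: flip 0s and 1s until a blank is reached.
        flip = {
            ("q0", "0"): ("q0", "1", "R"),
            ("q0", "1"): ("q0", "0", "R"),
        }
        print(run_turing_machine(flip, "0110"))   # -> ('q0', '1001')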

  4. Science Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1986

    1986-01-01

    Describes 26 different activities, experiments, demonstrations, and computer simulations in various topics in science. Includes instructional activities dealing with mural ecology, surface area/volume ratios, energy transfer in ecosystems, electrochemical simulations, alternating and direct current, terminal velocity, measuring the size of the…

  5. Institutional Computing Executive Group Review of Multi-programmatic & Institutional Computing, Fiscal Year 2005 and 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langer, S; Rotman, D; Schwegler, E

    The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflect the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high performance processors within a shared memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL. We recommend that the institution fully fund (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.

  6. Games, Simulations and Virtual Labs for Science Education: a Compendium and Some Examples

    NASA Astrophysics Data System (ADS)

    Russell, R. M.

    2012-12-01

    We have assembled a list of computer-based simulations, games, and virtual labs for science education. This list, with links to the sources of these resources, is available online. The entries span a broad range of science, math, and engineering topics. They also span a range of target student ages, from elementary school to university students. We will provide a brief overview of this web site and the resources found on it. We will also briefly demonstrate some of our own educational simulations and games. Computer-based simulations and virtual labs are valuable resources for science educators in various settings, allowing learners to experiment and explore "what if" scenarios. Educational computer games can motivate learners in both formal and informal settings, encouraging them to spend much more time exploring a topic than they might otherwise be inclined to do. Part of this presentation is effectively a "literature review" of numerous sources of simulations, games, and virtual labs. Although we have encountered several nice collections of such resources, those collections seem to be restricted in scope. They either represent materials developed by a specific group or agency (e.g. NOAA's games web site) or are restricted to a specific discipline (e.g. geology simulations and virtual labs). This presentation directs viewers to games, simulations, and virtual labs from many different sources and spanning a broad range of STEM disciplines.

  7. Improving Students’ Science Process Skills through Simple Computer Simulations on Linear Motion Conceptions

    NASA Astrophysics Data System (ADS)

    Siahaan, P.; Suryani, A.; Kaniawati, I.; Suhendi, E.; Samsudin, A.

    2017-02-01

    The purpose of this research is to identify the development of students' science process skills (SPS) on the linear motion concept by utilizing simple computer simulations. To simplify the learning process, the concept is divided into three sub-concepts: 1) the definition of motion, 2) uniform linear motion, and 3) uniformly accelerated motion. The research was administered via a pre-experimental method with a one-group pretest-posttest design. The respondents were 23 seventh-grade students at a junior high school in Bandung City. The improvement in students' science process skills is examined using normalized gain analysis of pretest and posttest scores for all sub-concepts. The results show that students' science process skills improved by 47% (moderate) on observation, 43% (moderate) on summarizing, 70% (high) on prediction, 44% (moderate) on communication, and 49% (moderate) on classification. These results indicate that utilizing simple computer simulations in physics learning can improve overall science process skills to a moderate level.
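
    To make the reported gains concrete, here is a minimal sketch of the normalized gain calculation this kind of study relies on (Hake's normalized gain); the class means below are hypothetical, chosen only to illustrate a "high" gain comparable to the prediction-skill result.

        # Hake's normalized gain <g> = (post - pre) / (max - pre). The scores
        # below are hypothetical illustrations, not data from the study.

        def normalized_gain(pre, post, max_score=100.0):
            if pre >= max_score:
                return 0.0
            return (post - pre) / (max_score - pre)

        # Hypothetical class means for one science-process skill:
        pre_mean, post_mean = 40.0, 82.0
        g = normalized_gain(pre_mean, post_mean)
        print(f"<g> = {g:.2f}")   # 0.70 -> 'high' by the usual thresholds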

  8. When Feedback Harms and Collaboration Helps in Computer Simulation Environments: An Expertise Reversal Effect

    ERIC Educational Resources Information Center

    Nihalani, Priya K.; Mayrath, Michael; Robinson, Daniel H.

    2011-01-01

    We investigated the effects of feedback and collaboration on undergraduates' transfer performance when using a computer networking training simulation. In Experiment 1, 65 computer science "novices" worked through an instructional protocol individually (control), individually with feedback, or collaboratively with feedback. Unexpectedly,…

  9. Elementary Teachers' Simulation Adoption and Inquiry-Based Use Following Professional Development

    ERIC Educational Resources Information Center

    Gonczi, Amanda; Maeng, Jennifer; Bell, Randy

    2017-01-01

    The purpose of this study was to characterize and compare 64 elementary science teachers' computer simulation use prior to and following professional development (PD) aligned with Innovation Adoption Theory. The PD highlighted computer simulation affordances that elementary teachers might find particularly useful. Qualitative and quantitative…

  10. Toward a Big Data Science: A challenge of "Science Cloud"

    NASA Astrophysics Data System (ADS)

    Murata, Ken T.; Watanabe, Hidenobu

    2013-04-01

    Over the past 50 years, with the appearance and development of high-performance computers (and supercomputers), numerical simulation has come to be considered a third methodology for science, following the theoretical (first) and experimental and/or observational (second) approaches. The variety of data yielded by the second approach has grown steadily, owing to progress in experimental and observational technologies, and the amount of data generated by the third methodology has grown as well, thanks to the tremendous development of supercomputers and their programming techniques. Most of the data files created by both experiments/observations and numerical simulations are saved in digital formats and analyzed on computers. Researchers (domain experts) are interested not only in how to perform experiments, observations, or numerical simulations, but also in what information (new findings) to extract from the data. However, data do not usually tell us anything about the science by themselves; the science is implicitly hidden in the data, and researchers have to extract information from the data files to find it. This is the basic concept of data-intensive (data-oriented) science for Big Data. As the scales of experiments, observations, and numerical simulations grow, new techniques and facilities are required to extract information from large numbers of data files. This technique, informatics, is a fourth methodology for new sciences. Every methodology must work on its own facilities: in space science, for example, the space environment is observed via spacecraft and numerical simulations are performed on supercomputers. The facility for informatics, which deals with large-scale data, is a computational cloud system for science. This paper proposes a cloud system for informatics that has been developed at NICT (National Institute of Information and Communications Technology), Japan. The NICT science cloud, named OneSpaceNet (OSN), is the first open cloud system for scientists who are going to carry out informatics for their own science. The science cloud is not for simple uses; many functions are expected of it, such as data standardization, data collection and crawling, large and distributed data storage systems, security and reliability, databases and meta-databases, data stewardship, long-term data preservation, data rescue and preservation, data mining, parallel processing, data publication and provision, the semantic web, 3D and 4D visualization, outreach and in-reach, and capacity building. A figure (not shown here) gives a schematic picture of the NICT science cloud. Both types of data, from observation and simulation, are stored in the storage system of the science cloud. Note that there are two types of observational data: one comes from archive sites outside the cloud and is downloaded to the cloud through the Internet; the other comes from equipment directly connected to the science cloud, often called a sensor cloud. In the present talk, we first introduce the NICT science cloud. We then demonstrate its efficiency by showing several scientific results achieved with this cloud system. Through these discussions and demonstrations, the potential of the science cloud for many research fields will be revealed.

  11. Integrating Computational Science Tools into a Thermodynamics Course

    NASA Astrophysics Data System (ADS)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired to computer simulations to implement these modules have a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.
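
    The modules described pair worked examples with computer simulations; a minimal sketch of what such a worked example might look like, assuming a simple ideal gas law calculation (the quantities are hypothetical and not taken from the course materials), is:

        # Worked example paired with computation (hypothetical quantities; an
        # illustration of the module style, not the actual course code).

        R = 8.314  # universal gas constant, J/(mol K)

        def ideal_gas_pressure(n_mol, T_kelvin, V_m3):
            """Ideal gas law: p = n R T / V."""
            return n_mol * R * T_kelvin / V_m3

        # 1 mol of an ideal gas at 300 K in a 0.025 m^3 vessel:
        p = ideal_gas_pressure(1.0, 300.0, 0.025)
        print(f"p = {p / 1000:.1f} kPa")   # ~99.8 kPa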

  12. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that "high performance computing is the backbone of the nation's science and technology enterprise". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "feet and hands" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  13. Enhancing Tele-robotics with Immersive Virtual Reality

    DTIC Science & Technology

    2017-11-03

    graduate and undergraduate students within the Digital Gaming and Simulation, Computer Science, and psychology programs have actively collaborated...investigates the use of artificial intelligence and visual computing. Numerous fields across the human-computer interaction and gaming research areas...invested in digital gaming and simulation to cognitively stimulate humans by computers, forming a $10.5B industry [1]. On the other hand, cognitive

  14. How Science Students Can Learn about Unobservable Phenomena Using Computer-Based Analogies

    ERIC Educational Resources Information Center

    Trey, L.; Khan, S.

    2008-01-01

    A novel instructional computer simulation that incorporates a dynamic analogy to represent Le Chatelier's Principle was designed to investigate the contribution of this feature to students' understanding. Two groups of 12th grade Chemistry students (n=15) interacted with the computer simulation during the study. Both groups did the same…

  15. Optimising the Parallelisation of OpenFOAM Simulations

    DTIC Science & Technology

    2014-06-01

    Optimising the Parallelisation of OpenFOAM Simulations. Shannon Keough, Maritime Division, Defence Science and Technology Organisation, DSTO-TR-2987. Abstract (fragment): The OpenFOAM computational fluid dynamics toolbox allows parallel computation of ... performance of a given high performance computing cluster with several OpenFOAM cases, running using a combination of MPI libraries and corresponding MPI ...

  16. Assessing Practical Skills in Physics Using Computer Simulations

    ERIC Educational Resources Information Center

    Walsh, Kevin

    2018-01-01

    Computer simulations have been used very effectively for many years in the teaching of science but the focus has been on cognitive development. This study, however, is an investigation into the possibility that a student's experimental skills in the real-world environment can be judged via the undertaking of a suitably chosen computer simulation…

  17. Investigating the Effectiveness of Computer Simulations for Chemistry Learning

    ERIC Educational Resources Information Center

    Plass, Jan L.; Milne, Catherine; Homer, Bruce D.; Schwartz, Ruth N.; Hayward, Elizabeth O.; Jordan, Trace; Verkuilen, Jay; Ng, Florrie; Wang, Yan; Barrientos, Juan

    2012-01-01

    Are well-designed computer simulations an effective tool to support student understanding of complex concepts in chemistry when integrated into high school science classrooms? We investigated scaling up the use of a sequence of simulations of kinetic molecular theory and associated topics of diffusion, gas laws, and phase change, which we designed…

  18. Using Computer Simulations to Integrate Learning.

    ERIC Educational Resources Information Center

    Liao, Thomas T.

    1983-01-01

    Describes the primary design criteria and the classroom activities involved in "The Yellow Light Problem," a minicourse on decision making in the secondary school Mathematics, Engineering and Science Achievement (MESA) program in California. Activities include lectures, discussions, science and math labs, computer labs, and development…

  19. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  20. The Effect of Mode of CAI and Individual Learning Differences on the Understanding of Concept Relationships.

    ERIC Educational Resources Information Center

    Rowland, Paul McD.

    The effect of mode of computer-assisted instruction (CAI) and individual learning differences on the learning of science concepts was investigated. University elementary education majors learned about home energy use from either a computer simulation or a computer tutorial. Learning of science concepts was measured using achievement and…

  1. eScience for molecular-scale simulations and the eMinerals project.

    PubMed

    Salje, E K H; Artacho, E; Austen, K F; Bruin, R P; Calleja, M; Chappell, H F; Chiang, G-T; Dove, M T; Frame, I; Goodwin, A L; Kleese van Dam, K; Marmier, A; Parker, S C; Pruneda, J M; Todorov, I T; Trachenko, K; Tyer, R P; Walker, A M; White, T O H

    2009-03-13

    We review the work carried out within the eMinerals project to develop eScience solutions that facilitate a new generation of molecular-scale simulation work. Technological developments include integration of compute and data systems, the development of collaborative frameworks and new researcher-friendly tools for grid job submission, XML data representation, information delivery, metadata harvesting and metadata management. A number of diverse science applications will illustrate how these tools are being used for large parameter-sweep studies, an emerging type of study for which the integration of computing, data and collaboration is essential.

  2. Simulations, Games, and Virtual Labs for Science Education: a Compendium and Some Examples

    NASA Astrophysics Data System (ADS)

    Russell, R. M.

    2011-12-01

    We have assembled a list of computer-based simulations, games, and virtual labs for science education. This list, with links to the sources of these resources, is available online. The entries span a broad range of science, math, and engineering topics. They also span a range of target student ages, from elementary school to university students. We will provide a brief overview of this web site and the resources found on it. We will also briefly demonstrate some of our own educational simulations, including the "Very, Very Simple Climate Model", and report on formative evaluations of these resources. Computer-based simulations and virtual labs are valuable resources for science educators in various settings, allowing learners to experiment and explore "what if" scenarios. Educational computer games can motivate learners in both formal and informal settings, encouraging them to spend much more time exploring a topic than they might otherwise be inclined to do. Part of this presentation is effectively a "literature review" of numerous sources of simulations, games, and virtual labs. Although we have encountered several nice collections of such resources, those collections seem to be restricted in scope. They either represent materials developed by a specific group or agency (e.g. NOAA's games web site) or are restricted to a specific discipline (e.g. geology simulations and virtual labs). This presentation directs viewers to games, simulations, and virtual labs from many different sources and spanning a broad range of STEM disciplines.

  3. Design and Application of Interactive Simulations in Problem-Solving in University-Level Physics Education

    ERIC Educational Resources Information Center

    Ceberio, Mikel; Almudí, José Manuel; Franco, Ángel

    2016-01-01

    In recent years, interactive computer simulations have been progressively integrated in the teaching of the sciences and have contributed significant improvements in the teaching-learning process. Practicing problem-solving is a key factor in science and engineering education. The aim of this study was to design simulation-based problem-solving…

  4. ASCR Workshop on Quantum Computing for Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aspuru-Guzik, Alan; Van Dam, Wim; Farhi, Edward

    This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.

  5. Social Studies: Application Units. Course II, Teachers. Computer-Oriented Curriculum. REACT (Relevant Educational Applications of Computer Technology).

    ERIC Educational Resources Information Center

    Tecnica Education Corp., San Carlos, CA.

    This book is one of a series in Course II of the Relevant Educational Applications of Computer Technology (REACT) Project. It is designed to point out to teachers two of the major applications of computers in the social sciences: simulation and data analysis. The first section contains a variety of simulation units organized under the following…

  6. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, Panagiotis; /Fermilab; Cary, John

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  7. A Digital Simulation Program for Health Science Students to Follow Drug Levels in the Body

    ERIC Educational Resources Information Center

    Stavchansky, Salomon; And Others

    1977-01-01

    The Raytheon Scientific Simulation Language (RSSL) program, an easily used simulation on the CDC/6600 computer at the University of Texas at Austin, offers a simple method of solving differential equations on a digital computer. It is used by undergraduate biopharmaceutics-pharmacokinetics students and graduate students in all areas. (Author/LBH)
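
    The kind of model such a program solves can be illustrated with a minimal one-compartment pharmacokinetic sketch (first-order absorption and elimination) integrated with SciPy. This is only an illustration of the differential-equation workflow, not the RSSL program; the rate constants, volume, and dose are hypothetical.

        # One-compartment PK model: d(gut)/dt = -ka*gut,
        # d(central)/dt = ka*gut - ke*central. All parameters hypothetical.

        import numpy as np
        from scipy.integrate import solve_ivp

        ka, ke, V = 1.2, 0.25, 30.0   # 1/h, 1/h, L
        dose = 500.0                  # mg, given orally at t = 0

        def model(t, y):
            gut, central = y
            return [-ka * gut, ka * gut - ke * central]

        sol = solve_ivp(model, (0.0, 24.0), [dose, 0.0], dense_output=True)
        t = np.linspace(0.0, 24.0, 7)
        conc = sol.sol(t)[1] / V      # plasma concentration, mg/L
        print(np.round(conc, 2))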

  8. ISCR FY2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D E; McGraw, J R

    2006-02-02

    Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that "computational science has become critical to scientific leadership, economic competitiveness, and national security". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "hands and feet" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.

  9. NASA Tech Briefs, August 1996. Volume 20, No. 8

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics covered include: Graphics and Simulation; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports

  10. Computational Experiments for Science and Engineering Education

    NASA Technical Reports Server (NTRS)

    Xie, Charles

    2011-01-01

    How to integrate simulation-based engineering and science (SBES) into the science curriculum smoothly is a challenging question. For the importance of SBES to be appreciated, the core value of simulations-that they help people understand natural phenomena and solve engineering problems-must be taught. A strategy to achieve this goal is to introduce computational experiments to the science curriculum to replace or supplement textbook illustrations and exercises and to complement or frame hands-on or wet lab experiments. In this way, students will have an opportunity to learn about SBES without compromising other learning goals required by the standards and teachers will welcome these tools as they strengthen what they are already teaching. This paper demonstrates this idea using a number of examples in physics, chemistry, and engineering. These exemplary computational experiments show that it is possible to create a curriculum that is both deeper and wider.

  11. Monte Carlo Analysis of the Commissioning Phase Maneuvers of the Soil Moisture Active Passive (SMAP) Mission

    NASA Technical Reports Server (NTRS)

    Williams, Jessica L.; Bhat, Ramachandra S.; You, Tung-Han

    2012-01-01

    The Soil Moisture Active Passive (SMAP) mission will perform soil moisture content and freeze/thaw state observations from a low-Earth orbit. The observatory is scheduled to launch in October 2014 and will perform observations from a near-polar, frozen, and sun-synchronous Science Orbit for a 3-year data collection mission. At launch, the observatory is delivered to an Injection Orbit that is biased below the Science Orbit; the spacecraft will maneuver to the Science Orbit during the mission Commissioning Phase. The delta V needed to maneuver from the Injection Orbit to the Science Orbit is computed statistically via a Monte Carlo simulation; the 99th percentile delta V (delta V99) is carried as a line item in the mission delta V budget. This paper details the simulation and analysis performed to compute this figure and the delta V99 computed per current mission parameters.
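
    The statistical step described above amounts to taking a high percentile of Monte Carlo delta-V samples. A minimal sketch of that step follows; the dispersion model and all numbers are hypothetical illustrations, not SMAP mission values.

        # Compute delta-V99 (99th percentile) from Monte Carlo samples of the
        # total commissioning-phase delta-V. Hypothetical dispersion model.

        import numpy as np

        rng = np.random.default_rng(0)
        n_trials = 10_000

        nominal = 20.0                                   # m/s, hypothetical
        injection_disp = rng.normal(0.0, 3.0, n_trials)  # injection-error effect
        execution_disp = rng.normal(0.0, 1.0, n_trials)  # maneuver execution error
        total_dv = nominal + np.abs(injection_disp) + np.abs(execution_disp)

        dv99 = np.percentile(total_dv, 99)
        print(f"delta-V99 = {dv99:.1f} m/s")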

  12. Workflow Management Systems for Molecular Dynamics on Leadership Computers

    NASA Astrophysics Data System (ADS)

    Wells, Jack; Panitkin, Sergey; Oleynik, Danila; Jha, Shantenu

    Molecular Dynamics (MD) simulations play an important role in a range of disciplines from Material Science to Biophysical systems and account for a large fraction of cycles consumed on computing resources. Increasingly science problems require the successful execution of ''many'' MD simulations as opposed to a single MD simulation. There is a need to provide scalable and flexible approaches to the execution of the workload. We present preliminary results on the Titan computer at the Oak Ridge Leadership Computing Facility that demonstrate a general capability to manage workload execution agnostic of a specific MD simulation kernel or execution pattern, and in a manner that integrates disparate grid-based and supercomputing resources. Our results build upon our extensive experience of distributed workload management in the high-energy physics ATLAS project using PanDA (Production and Distributed Analysis System), coupled with recent conceptual advances in our understanding of workload management on heterogeneous resources. We will discuss how we will generalize these initial capabilities towards a more production level service on DOE leadership resources. This research is sponsored by US DOE/ASCR and used resources of the OLCF computing facility.
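
    The workload pattern, many independent MD runs managed as tasks whose results are gathered centrally, can be sketched locally with Python's standard library. This is only an illustration of the pattern; it is not PanDA, and the "MD task" below is a hypothetical stand-in.

        # Toy many-task workload: run independent "simulations" and collect results.

        from concurrent.futures import ProcessPoolExecutor
        import random

        def run_md_task(params):
            """Stand-in for one MD simulation; returns a hypothetical mean energy."""
            random.seed(params["seed"])
            energy = sum(random.gauss(-1.0, 0.1) for _ in range(1000)) / 1000
            return params["seed"], energy

        if __name__ == "__main__":
            tasks = [{"seed": s} for s in range(8)]
            with ProcessPoolExecutor(max_workers=4) as pool:
                for seed, energy in pool.map(run_md_task, tasks):
                    print(f"task {seed}: mean energy {energy:.3f}")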

  13. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
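
    The workflow described (sample an expensive simulation, fit a regression surrogate, quantify its accuracy) can be sketched in a few lines. This is an illustration of the idea under simplified assumptions, not the REVEAL toolset or its interfaces; the "expensive simulation" is a hypothetical analytic stand-in.

        # Build a toy reduced-order model by sampling + polynomial regression,
        # then quantify its accuracy on held-out samples.

        import numpy as np

        def expensive_simulation(x):          # stand-in for an HPC simulation
            return np.sin(3 * x) + 0.5 * x**2

        rng = np.random.default_rng(1)
        x_train = rng.uniform(0, 2, 40)       # sampling step
        y_train = expensive_simulation(x_train)

        coeffs = np.polyfit(x_train, y_train, 5)   # regression step
        rom = np.poly1d(coeffs)                    # the reduced-order model

        x_test = rng.uniform(0, 2, 20)             # accuracy quantification
        rmse = np.sqrt(np.mean((rom(x_test) - expensive_simulation(x_test)) ** 2))
        print(f"ROM RMSE on held-out samples: {rmse:.4f}")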

  14. Defining Computational Thinking for Mathematics and Science Classrooms

    NASA Astrophysics Data System (ADS)

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-02-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.

  15. Computational Methods for Predictive Simulation of Stochastic Turbulence Systems

    DTIC Science & Technology

    2015-11-05

    Science and Engineering, Venice, Italy, May 18-20, 2015, pp. 1261-1272. [21] Yong Li and P.D. Williams, Analysis of the RAW Filter in Composite-Tendency...leapfrog scheme, Proceedings of the VI Conference on Computational Methods for Coupled Problems in Science and Engineering, Venice, Italy, May 18-20

  16. Simulated and Virtual Science Laboratory Experiments: Improving Critical Thinking and Higher-Order Learning Skills

    ERIC Educational Resources Information Center

    Simon, Nicole A.

    2013-01-01

    Virtual laboratory experiments using interactive computer simulations are not being employed as viable alternatives to laboratory science curriculum at extensive enough rates within higher education. Rote traditional lab experiments are currently the norm and are not addressing inquiry, Critical Thinking, and cognition throughout the laboratory…

  17. Social Choice in a Computer-Assisted Simulation

    ERIC Educational Resources Information Center

    Thavikulwat, Precha

    2009-01-01

    Pursuing a line of inquiry suggested by Crookall, Martin, Saunders, and Coote, the author applied, within the framework of design science, an optimal-design approach to incorporate into a computer-assisted simulation two innovative social choice processes: the multiple period double auction and continuous voting. Expectations that the…

  18. Effect of Inquiry-Based Computer Simulation Modeling on Pre-Service Teachers' Understanding of Homeostasis and Their Perceptions of Design Features

    ERIC Educational Resources Information Center

    Chabalengula, Vivien; Fateen, Rasheta; Mumba, Frackson; Ochs, Laura Kathryn

    2016-01-01

    This study investigated the effect of an inquiry-based computer simulation modeling (ICoSM) instructional approach on pre-service science teachers' understanding of homeostasis and its related concepts, and their perceived design features of the ICoSM and simulation that enhanced their conceptual understanding of these concepts. Fifty pre-service…

  19. Simulated Sustainable Societies: Students' Reflections on Creating Future Cities in Computer Games

    ERIC Educational Resources Information Center

    Nilsson, Elisabet M.; Jakobsson, Anders

    2011-01-01

    The empirical study, in this article, involved 42 students (ages 14-15), who used the urban simulation computer game SimCity 4 to create models of sustainable future cities. The aim was to explore in what ways the simulated "real" worlds provided by this game could be a potential facilitator for science learning contexts. The topic investigated is…

  20. Today's Business Simulation Industry

    ERIC Educational Resources Information Center

    Summers, Gary J.

    2004-01-01

    New technologies are transforming the business simulation industry. The technologies come from research in computational fields of science, and they endow simulations with new capabilities and qualities. These capabilities and qualities include computerized behavioral simulations, online feedback and coaching, advanced interfaces, learning on…

  1. Computers in Undergraduate Science Education. Conference Proceedings.

    ERIC Educational Resources Information Center

    Blum, Ronald, Ed.

    Six areas of computer use in undergraduate education, particularly in the fields of mathematics and physics, are discussed in these proceedings. The areas included are: the computational mode; computer graphics; the simulation mode; analog computing; computer-assisted instruction; and the current politics and management of college level computer…

  2. Games and Simulations for Climate, Weather and Earth Science Education

    NASA Astrophysics Data System (ADS)

    Russell, R. M.

    2014-12-01

    We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by the UCAR Center for Science Education. These materials have been disseminated via our web site (SciEd.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility in Boulder, Colorado. Our group has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory. More info available at: scied.ucar.edu/events/agu-2014-games-simulations-sessions

  3. Generic Software for Emulating Multiprocessor Architectures.

    DTIC Science & Technology

    1985-05-01

    AD-A157 662: Generic Software for Emulating Multiprocessor Architectures. MIT Laboratory for Computer Science, 545 Technology Square, Cambridge, MA 02139. Keywords: computer architecture, emulation, simulation, dataflow.

  4. Factors Promoting Engaged Exploration with Computer Simulations

    ERIC Educational Resources Information Center

    Podolefsky, Noah S.; Perkins, Katherine K.; Adams, Wendy K.

    2010-01-01

    This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration; a process that involves students actively interacting with educational materials, sense making, and exploring primarily via their own questioning. We analyze…

  5. Recovery Act: Web-based CO2 Subsurface Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paolini, Christopher; Castillo, Jose

    2012-11-30

    The Web-based CO2 Subsurface Modeling project focused primarily on extending an existing text-only, command-line driven, isothermal and isobaric, geochemical reaction-transport simulation code, developed and donated by Sienna Geodynamics, into an easier-to-use Web-based application for simulating long-term storage of CO2 in geologic reservoirs. The Web-based interface developed through this project, publicly accessible via URL http://symc.sdsu.edu/, enables rapid prototyping of CO2 injection scenarios and allows students without advanced knowledge of geochemistry to set up a typical sequestration scenario, invoke a simulation, analyze results, and then vary one or more problem parameters and quickly re-run a simulation to answer what-if questions. symc.sdsu.edu has 2x12 core AMD Opteron™ 6174 2.20GHz processors and 16GB RAM. The Web-based application was used to develop a new computational science course at San Diego State University, COMP 670: Numerical Simulation of CO2 Sequestration, which was taught during the fall semester of 2012. The purpose of the class was to introduce graduate students to Carbon Capture, Use and Storage (CCUS) through numerical modeling and simulation, and to teach students how to interpret simulation results to make predictions about long-term CO2 storage capacity in deep brine reservoirs. In addition to the training and education component of the project, significant software development efforts took place. Two computational science doctoral and one geological science master's student, under the direction of the PIs, extended the original code developed by Sienna Geodynamics, named Sym.8. New capabilities were added to Sym.8 to simulate non-isothermal and non-isobaric flows of charged aqueous solutes in porous media, in addition to incorporating HPC support into the code for execution on many-core XSEDE clusters. A successful outcome of this project was the funding and training of three new computational science students and one geological science student in technologies relevant to carbon sequestration and problems involving flow in subsurface media. The three computational science students are currently finishing their doctoral studies on different aspects of modeling CO2 sequestration, while the geological science student completed his master’s thesis in modeling the thermal response of CO2 injection in brine and, as a direct result of participation in this project, is now employed at ExxonMobil as a full-time staff geologist.
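
    A toy version of the "what-if" exercise described above, varying an injection rate and re-running a check, can be sketched as follows. This is a grossly simplified, hypothetical capacity estimate for illustration only; it is not Sym.8, its web interface, or a geochemically meaningful model.

        # Hypothetical what-if sweep: does the injected mass fit a crude
        # capacity estimate? All parameters are invented for illustration.

        years = 30
        trapping_efficiency = 0.05            # fraction of pore volume usable
        reservoir_pore_volume_mt = 2000.0     # megatonnes CO2 equivalent

        capacity = trapping_efficiency * reservoir_pore_volume_mt
        for rate_mt_per_yr in (1.0, 2.0, 5.0):
            injected = rate_mt_per_yr * years
            status = "fits" if injected <= capacity else "exceeds capacity"
            print(f"{rate_mt_per_yr:>4.1f} Mt/yr for {years} yr: "
                  f"{injected:6.1f} Mt injected ({status}, capacity {capacity:.0f} Mt)")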

  6. Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit Infantry Leaders

    DTIC Science & Technology

    2007-04-01

    ... judgmental self-doubt, depression, and causal uncertainty, tend to take fewer risks, and have lower self-esteem. Results from two studies (Nygren, 2000 ...). U.S. Army Research Institute for the Behavioral and Social Sciences, Research Report 1869: Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit Infantry Leaders.

  7. Inquiry Based-Computational Experiment, Acquisition of Threshold Concepts and Argumentation in Science and Mathematics Education

    ERIC Educational Resources Information Center

    Psycharis, Sarantos

    2016-01-01

    The computational experiment approach considers models as the fundamental instructional units of Inquiry-Based Science and Mathematics Education (IBSE) and STEM Education, where the model takes the place of the "classical" experimental set-up and simulation replaces the experiment. Argumentation in IBSE and STEM education is related to the…

  8. The Australian Computational Earth Systems Simulator

    NASA Astrophysics Data System (ADS)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.

  9. Aneesur Rahman Prize Talk

    NASA Astrophysics Data System (ADS)

    Frenkel, Daan

    2007-03-01

    During the past decade there has been a unique synergy between theory, experiment and simulation in Soft Matter Physics. In colloid science, computer simulations that started out as studies of highly simplified model systems, have acquired direct experimental relevance because experimental realizations of these simple models can now be synthesized. Whilst many numerical predictions concerning the phase behavior of colloidal systems have been vindicated by experiments, the jury is still out on others. In my talk I will discuss some of the recent technical developments, new findings and open questions in computational soft-matter science.

  10. The science of rotator cuff tears: translating animal models to clinical recommendations using simulation analysis.

    PubMed

    Mannava, Sandeep; Plate, Johannes F; Tuohy, Christopher J; Seyler, Thorsten M; Whitlock, Patrick W; Curl, Walton W; Smith, Thomas L; Saul, Katherine R

    2013-07-01

    The purpose of this article is to review basic science studies using various animal models for rotator cuff research and to describe structural, biomechanical, and functional changes to muscle following rotator cuff tears. The use of computational simulations to translate the findings from animal models to human scale is further detailed. A comprehensive review was performed of the basic science literature describing the use of animal models and simulation analysis to examine muscle function following rotator cuff injury and repair in the ageing population. The findings from various studies of rotator cuff pathology emphasize the importance of preventing permanent muscular changes with detrimental results. In vivo muscle function, electromyography, and passive muscle-tendon unit properties were studied before and after supraspinatus tenotomy in a rodent rotator cuff injury model (acute vs chronic). Then, a series of simulation experiments were conducted using a validated computational human musculoskeletal shoulder model to assess both passive and active tension of rotator cuff repairs based on surgical positioning. Outcomes of rotator cuff repair may be improved by earlier surgical intervention, with lower surgical repair tensions and fewer electromyographic neuromuscular changes. An integrated approach of animal experiments, computer simulation analyses, and clinical studies may allow us to gain a fundamental understanding of the underlying pathology and interpret the results for clinical translation.

  11. NASA Center for Climate Simulation (NCCS) Advanced Technology AT5 Virtualized Infiniband Report

    NASA Technical Reports Server (NTRS)

    Thompson, John H.; Bledsoe, Benjamin C.; Wagner, Mark; Shakshober, John; Fromkin, Russ

    2013-01-01

    The NCCS is part of the Computational and Information Sciences and Technology Office (CISTO) of Goddard Space Flight Center's (GSFC) Sciences and Exploration Directorate. The NCCS's mission is to enable scientists to increase their understanding of the Earth, the solar system, and the universe by supplying state-of-the-art high performance computing (HPC) solutions. To accomplish this mission, the NCCS (https://www.nccs.nasa.gov) provides high performance compute engines, mass storage, and network solutions to meet the specialized needs of the Earth and space science user communities.

  12. Role of High-End Computing in Meeting NASA's Science and Engineering Challenges

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak

    2006-01-01

    High-End Computing (HEC) has always played a major role in meeting the modeling and simulation needs of various NASA missions. With NASA's newest 62 teraflops Columbia supercomputer, HEC is having an even greater impact within the Agency and beyond. Significant cutting-edge science and engineering simulations in the areas of space exploration, Shuttle operations, Earth sciences, and aeronautics research are already occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. The talk will describe how the integrated supercomputing production environment is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions.

  13. Games and Simulations for Climate, Weather and Earth Science Education

    NASA Astrophysics Data System (ADS)

    Russell, R. M.; Clark, S.

    2015-12-01

    We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by the UCAR Center for Science Education. These materials have been disseminated via our web site (SciEd.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility in Boulder, Colorado. Our group has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory.

  14. Programming Video Games and Simulations in Science Education: Exploring Computational Thinking through Code Analysis

    ERIC Educational Resources Information Center

    Garneli, Varvara; Chorianopoulos, Konstantinos

    2018-01-01

    Various aspects of computational thinking (CT) could be supported by educational contexts such as simulations and video-games construction. In this field study, potential differences in student motivation and learning were empirically examined through students' code. For this purpose, we performed a teaching intervention that took place over five…

  15. Application of Psychological Theories in Agent-Based Modeling: The Case of the Theory of Planned Behavior.

    PubMed

    Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo

    2018-01-01

    It is likely that computer simulations will assume a greater role in the near future to investigate and understand reality (Rand & Rust, 2011). Particularly, agent-based models (ABMs) represent a method of investigation of social phenomena that blend the knowledge of social sciences with the advantages of virtual simulations. Within this context, the development of algorithms able to recreate the reasoning engine of autonomous virtual agents represents one of the most fragile aspects, and it is indeed crucial to establish such models on well-supported psychological theoretical frameworks. For this reason, the present work discusses the application case of the theory of planned behavior (TPB; Ajzen, 1991) in the context of agent-based modeling: It is argued that this framework might be helpful more than others to develop a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to the application of the model proposed by the TPB inside computer simulations and suggests potential solutions, with the hope of helping to shorten the distance between the fields of psychology and computer science.
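
    As a sketch of how the TPB constructs might be wired into an agent's reasoning engine, consider the following minimal agent; the weights, threshold, and social-influence update are illustrative assumptions rather than the authors' specification.

        # Minimal TPB-style agent (illustrative; weights and threshold are assumptions).
        from dataclasses import dataclass
        import random

        @dataclass
        class TPBAgent:
            attitude: float            # evaluation of the behavior, in [0, 1]
            subjective_norm: float     # perceived social pressure, in [0, 1]
            perceived_control: float   # perceived ease of performing it, in [0, 1]
            w_att: float = 0.4         # weights would normally be calibrated on survey data
            w_norm: float = 0.3
            w_pbc: float = 0.3

            def intention(self) -> float:
                return (self.w_att * self.attitude
                        + self.w_norm * self.subjective_norm
                        + self.w_pbc * self.perceived_control)

            def acts(self, threshold: float = 0.5) -> bool:
                # Perceived control also moderates whether intention becomes behavior.
                return self.intention() > threshold and random.random() < self.perceived_control

        # Toy population loop: agents update their subjective norm from the share of peers acting.
        agents = [TPBAgent(random.random(), random.random(), random.random()) for _ in range(100)]
        for step in range(10):
            share_acting = sum(a.acts() for a in agents) / len(agents)
            for a in agents:
                a.subjective_norm = 0.8 * a.subjective_norm + 0.2 * share_acting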

  16. Progress in fast, accurate multi-scale climate simulations

    DOE PAGES

    Collins, W. D.; Johansen, H.; Evans, K. J.; ...

    2015-06-01

    We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated in more depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales, are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
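
    As a minimal illustration of why implicit treatment of fast physical time scales matters, the sketch below integrates a single stiff decay term with forward and backward Euler; the rate constant and time step are illustrative, not taken from any climate model.

        # Stiff "fast physics" term dy/dt = -k*y with a time step sized for the slow dynamics.
        # Forward Euler is unstable here (k*dt = 5 > 2); backward Euler remains stable.
        import numpy as np

        k = 100.0       # fast-process rate (1/s), illustrative
        dt = 0.05       # time step chosen for the slow dynamics
        y_explicit = y_implicit = 1.0
        for n in range(20):
            y_explicit = y_explicit + dt * (-k * y_explicit)   # grows without bound
            y_implicit = y_implicit / (1.0 + k * dt)           # y_{n+1} = y_n / (1 + k*dt)

        print(f"explicit: {y_explicit:.3e}  implicit: {y_implicit:.3e}  exact: {np.exp(-k * 20 * dt):.3e}")

    Stability, not accuracy, is the point of the comparison; production schemes pair implicit treatment of the fast terms with higher-order integration of the remaining dynamics.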

  17. The Viability of Distance Education Science Laboratories.

    ERIC Educational Resources Information Center

    Forinash, Kyle; Wisman, Raymond

    2001-01-01

    Discusses the effectiveness of offering science laboratories via distance education. Explains current delivery technologies, including computer simulations, videos, and laboratory kits sent to students; pros and cons of distance labs; the use of spreadsheets; and possibilities for new science education models. (LRW)

  18. A Science Cloud: OneSpaceNet

    NASA Astrophysics Data System (ADS)

    Morikawa, Y.; Murata, K. T.; Watari, S.; Kato, H.; Yamamoto, K.; Inoue, S.; Tsubouchi, K.; Fukazawa, K.; Kimura, E.; Tatebe, O.; Shimojo, S.

    2010-12-01

    The main methodologies of Solar-Terrestrial Physics (STP) so far have been theoretical, experimental and observational, and computer simulation approaches. Recently, "informatics" has been anticipated as a new (fourth) approach to STP studies. Informatics is a methodology for analyzing large-scale data (observation data and computer simulation data) to obtain new findings using a variety of data processing techniques. At NICT (National Institute of Information and Communications Technology, Japan) we are now developing a new research environment named "OneSpaceNet". OneSpaceNet is a cloud-computing environment specialized for scientific work, which connects many researchers through a high-speed network (JGN: Japan Gigabit Network). The JGN is a wide-area backbone network operated by NICT; it provides a 10G network and many access points (AP) across Japan. OneSpaceNet also provides rich computer resources for research, such as supercomputers, large-scale data storage, licensed applications, visualization devices (like a tiled display wall: TDW), databases/DBMS, cluster computers (4-8 nodes) for data processing, and communication devices. What is remarkable about using the science cloud is that a user needs only a terminal (a low-cost PC). Once the PC is connected to JGN2plus, the user can make full use of the rich resources of the science cloud. Using communication devices such as a video-conference system, streaming and reflector servers, and media players, users on OneSpaceNet can carry out research communications as if they belonged to the same laboratory: they are members of a virtual laboratory. The specification of the computer resources on OneSpaceNet is as follows. The data storage we have developed so far is almost 1 PB in size. The number of data files managed on the cloud storage continues to grow and now exceeds 40,000,000. Notably, the disks forming the large-scale storage are distributed across 5 data centers over Japan (but the storage system performs as one disk). Three supercomputers are allocated on the cloud, one in Tokyo, one in Osaka and one in Nagoya. Simulation job data from any of the supercomputers are saved on the cloud data storage (in the same directory); it is a kind of virtual computing environment. The tiled display wall has 36 panels acting as one display; its pixel (resolution) size is as large as 18000x4300. This size is sufficient to preview or analyze large-scale computer simulation data. It also allows many researchers together to view multiple images (e.g., 100 pictures) on one screen. In our talk we also present a brief report of initial results of using OneSpaceNet for Global MHD simulations as an example of successful use of our science cloud: (i) ultra-high time resolution visualization of Global MHD simulations on the large-scale storage and parallel processing system on the cloud, (ii) a database of real-time Global MHD simulations and statistical analyses of the data, and (iii) a 3D Web service for Global MHD simulations.

  19. DOE Advanced Scientific Advisory Committee (ASCAC): Workforce Subcommittee Letter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Barbara; Calandra, Henri; Crivelli, Silvia

    2014-07-23

    Simulation and computing are essential to much of the research conducted at the DOE national laboratories. Experts in the ASCR-relevant Computing Sciences, which encompass a range of disciplines including Computer Science, Applied Mathematics, Statistics and domain Computational Sciences, are an essential element of the workforce in nearly all of the DOE national laboratories. This report seeks to identify the gaps and challenges facing DOE with respect to this workforce. This letter is ASCAC’s response to the charge of February 19, 2014 to identify disciplines in which significantly greater emphasis in workforce training at the graduate or postdoctoral levels is necessary to address workforce gaps in current and future Office of Science mission needs.

  20. NASA Tech Briefs, August 1995. Volume 19, No. 8

    NASA Technical Reports Server (NTRS)

    1995-01-01

    There is a special focus on computer graphics and simulation in this issue. Topics covered include: Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; and Mathematics and Information Sciences. There is also a section on Laser Technology, which includes a feature on moving closer to the sun's power.

  1. Complex Systems Simulation and Optimization | Computational Science | NREL

    Science.gov Websites

    Stochastic Optimization and Control: formulation and implementation of advanced optimization and control methods that account for uncertainty. Contact: Wesley Jones, Group Manager, Complex Systems Simulation and Optimization.

  2. NASA Tech Briefs, August 1997. Volume 21, No. 8

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Topics:Graphics and Simulation; Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.

  3. An Atom Is Known by the Company It Keeps: A Constructionist Learning Environment for Materials Science Using Agent-Based Modeling

    ERIC Educational Resources Information Center

    Blikstein, Paulo; Wilensky, Uri

    2009-01-01

    This article reports on "MaterialSim", an undergraduate-level set of constructionist computational materials science activities which we have developed and tested in classrooms. We investigate: (a) the cognition of students engaging in scientific inquiry through interacting with simulations; (b) the effects of students programming simulations as…

  4. Development of Science Simulations for Mildly Mentally Retarded or Learning Disabled Students. Final Report.

    ERIC Educational Resources Information Center

    Macro Systems, Inc., Silver Spring, MD.

    This final report describes the development of eight computer-based science simulations designed for use with middle school mainstreamed students having learning disabilities or mild mental retardation. The total program includes software, a teacher's manual, 3 videos, and a set of 30 activity worksheets. Special features of the software for…

  5. Examination of the Effects of Dimensionality on Cognitive Processing in Science: A Computational Modeling Experiment Comparing Online Laboratory Simulations and Serious Educational Games

    ERIC Educational Resources Information Center

    Lamb, Richard L.

    2016-01-01

    Within the last 10 years, new tools for assisting in the teaching and learning of academic skills and content within the context of science have arisen. These new tools include multiple types of computer software and hardware, including (video) games. The purpose of this study was to examine and compare the effect of computer learning games in the…

  6. Ethical sensitivity intervention in science teacher education: Using computer simulations and professional codes of ethics

    NASA Astrophysics Data System (ADS)

    Holmes, Shawn Yvette

    A simulation was created to emulate two Racial Ethical Sensitivity Test (REST) videos (Brabeck et al., 2000). The REST is a reliable assessment for ethical sensitivity to racial and gender intolerant behaviors in educational settings. Quantitative and qualitative analysis of the REST was performed using the Quick-REST survey and an interview protocol. The purpose of this study was to improve science educators' ability to recognize instances of racial and gender intolerant behaviors by leveraging the immersive qualities of simulations. The fictitious Hazelton High School virtual environment was created by the researcher and compared with the traditional REST. The study investigated whether computer simulations can influence the ethical sensitivity of preservice and inservice science teachers to racial and gender intolerant behaviors in school settings. The post-test only research design involved 32 third-year science education students enrolled in science education classes at several southeastern universities and 31 science teachers from the same locale, some of whom were part of an NSF project. Participant samples were assigned to the video control group or the simulation experimental group. This resulted in four comparison groups: preservice video, preservice simulation, inservice video, and inservice simulation. Participants experienced two REST scenarios in the appropriate format and then responded to Quick-REST survey questions for both scenarios. Additionally, the simulation groups answered in-simulation and post-simulation questions. Nonparametric analysis of the Quick-REST ascertained differences between comparison groups. Cronbach's alpha was calculated for internal consistency. The REST interview protocol was used to analyze recognition of intolerant behaviors in the in-simulation prompts. Post-simulation prompts were analyzed for emergent themes concerning the effect of the simulation on responses. The preservice video group had a significantly higher mean rank score than the other comparison groups. There were no significant differences across the remaining groups. Qualitative analyses of in-simulation prompts suggest both preservice and inservice participants are unlikely to take action in an intolerant environment. Themes that emerged in the post-simulation responses indicated participants viewed the simulation as a reflective, interactive, personal, and organic environment.

  7. Large-Scale NASA Science Applications on the Columbia Supercluster

    NASA Technical Reports Server (NTRS)

    Brooks, Walter

    2005-01-01

    Columbia, NASA's newest 61 teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold, and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already occurring on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, computer industry, and academia, to create a national resource in large-scale modeling and simulation.

  8. Snowmass Computing Frontier: Computing for the Cosmic Frontier, Astrophysics, and Cosmology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connolly, A.; Habib, S.; Szalay, A.

    2013-11-12

    This document presents (off-line) computing requirements and challenges for Cosmic Frontier science, covering the areas of data management, analysis, and simulations. We invite contributions to extend the range of covered topics and to enhance the current descriptions.

  9. Computer literacy for life sciences: helping the digital-era biology undergraduates face today's research.

    PubMed

    Smolinski, Tomasz G

    2010-01-01

    Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of computers in their lives, seem to be largely unfamiliar with how computers are being used to pursue and answer such questions. This article describes an innovative undergraduate-level course, titled Computer Literacy for Life Sciences, that aims to teach students the basics of a computerized scientific research pursuit. The purpose of the course is for students to develop a hands-on working experience in using standard computer software tools as well as computer techniques and methodologies used in life sciences research. This paper provides a detailed description of the didactical tools and assessment methods used in and outside of the classroom as well as a discussion of the lessons learned during the first installment of the course taught at Emory University in fall semester 2009.

  10. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever-larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
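
    For readers unfamiliar with the underlying numerics, the sketch below is a textbook one-dimensional, second-order finite-difference wave propagation loop; it illustrates the class of scheme involved, not the TS-AWP code, and all parameter values are illustrative.

        # 1-D finite-difference wave propagation (second order in space and time).
        import numpy as np

        nx, nt = 400, 800
        dx, c = 100.0, 3000.0            # grid spacing (m) and wave speed (m/s), illustrative
        dt = 0.5 * dx / c                # satisfies the CFL stability condition
        u_prev = np.zeros(nx)
        u = np.zeros(nx)
        u[nx // 2] = 1.0                 # impulsive source at the center of the grid

        r2 = (c * dt / dx) ** 2
        for n in range(nt):
            u_next = np.zeros(nx)
            u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                            + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
            u_prev, u = u, u_next        # advance the time levels

        print("peak displacement after propagation:", np.abs(u).max())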

  11. Simulating Serious Games: A Discrete-Time Computational Model Based on Cognitive Flow Theory

    ERIC Educational Resources Information Center

    Westera, Wim

    2018-01-01

    This paper presents a computational model for simulating how people learn from serious games. While avoiding the combinatorial explosion of a game's micro-states, the model offers a meso-level pathfinding approach, which is guided by cognitive flow theory and various concepts from learning sciences. It extends a basic, existing model by exposing…

  12. NASA Advanced Supercomputing Facility Expansion

    NASA Technical Reports Server (NTRS)

    Thigpen, William W.

    2017-01-01

    The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.

  13. Research approaches to mass casualty incidents response: development from routine perspectives to complexity science.

    PubMed

    Shen, Weifeng; Jiang, Libing; Zhang, Mao; Ma, Yuefeng; Jiang, Guanyu; He, Xiaojun

    2014-01-01

    To systematically review the research methods for mass casualty incidents (MCI) and to introduce the concept and characteristics of complexity science and the artificial systems, computational experiments and parallel execution (ACP) method. We searched PubMed, Web of Knowledge, China Wanfang and China Biology Medicine (CBM) databases for relevant studies. Searches were performed without year or language restrictions and used combinations of the following key words: "mass casualty incident", "MCI", "research method", "complexity science", "ACP", "approach", "science", "model", "system" and "response". Articles were searched using the above keywords, and only those involving the research methods of MCI were included. Research methods for MCI have increased markedly over the past few decades. At present, the dominant research methods for MCI are the theory-based approach, the empirical approach, evidence-based science, mathematical modeling and computer simulation, simulation experiments, experimental methods, the scenario approach and complexity science. This article provides an overview of the development of research methodology for MCI. The progress of routine research approaches and complexity science is briefly presented in this paper. Furthermore, the authors conclude that the reductionism underlying the exact sciences is not suitable for MCI complex systems, and that the only feasible alternative is complexity science. Finally, the review concludes that the ACP method, combining artificial systems, computational experiments and parallel execution, provides a new approach to research on complex MCI.

  14. Perspectives on the Future of CFD

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2000-01-01

    This viewgraph presentation gives an overview of the future of computational fluid dynamics (CFD), which in the past has pioneered the field of flow simulation. Over time, CFD has progressed along with computing power. Numerical methods have advanced as CPU and memory capacity have increased. Complex configurations are routinely computed now, and direct numerical simulations (DNS) and large eddy simulations (LES) are used to study turbulence. As computing resources have shifted to parallel and distributed platforms, computer science aspects such as scalability (algorithmic and implementation), portability, and transparent coding have advanced. Examples of potential future (or current) challenges include risk assessment, limitations of heuristic models, and the development of CFD and information technology (IT) tools.

  15. Learning Science through Computer Games and Simulations

    ERIC Educational Resources Information Center

    Honey, Margaret A., Ed.; Hilton, Margaret, Ed.

    2011-01-01

    At a time when scientific and technological competence is vital to the nation's future, the weak performance of U.S. students in science reflects the uneven quality of current science education. Although young children come to school with innate curiosity and intuitive ideas about the world around them, science classes rarely tap this potential.…

  16. Using Educational Games and Simulation Software in a Computer Science Course: Learning Achievements and Student Flow Experiences

    ERIC Educational Resources Information Center

    Liu, Tsung-Yu

    2016-01-01

    This study investigates how educational games impact students' academic performance and multimedia flow experiences in a computer science course. A curriculum consisting of five basic learning units (the stack, queue, sort, tree traversal, and binary search tree) was conducted with 110 university students during one semester. Two groups…

  17. BeeSign: A Computationally-Mediated Intervention to Examine K-1 Students' Representational Activities in the Context of Teaching Complex Systems Concepts

    ERIC Educational Resources Information Center

    Danish, Joshua Adam

    2009-01-01

    Representations such as drawings, graphs, and computer simulations, are central to learning and doing science. Furthermore, ongoing success in science learning requires students to build on the representations and associated practices that they are presumed to have learned throughout their schooling career. Without these practices, students have…

  18. Using the Power of Media to Communicate Science: A Question of Style?

    ERIC Educational Resources Information Center

    Imhof, Heidi

    1991-01-01

    Discusses educational effects of the style, content, and quality inherent in several multimedia and desktop-publishing products available to science teachers, including books, interactive software, videos, and computer simulations. (JJK)

  19. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  20. Games and Simulations for Climate, Weather and Earth Science Education

    NASA Astrophysics Data System (ADS)

    Russell, R. M.

    2013-12-01

    We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by education groups at NCAR/UCAR in Boulder, primarily Spark and the COMET Program. These materials have been disseminated via Spark's web site (spark.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility. Spark has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory.

  1. Ultracold-atom quantum simulator for attosecond science

    NASA Astrophysics Data System (ADS)

    Sala, Simon; Förster, Johann; Saenz, Alejandro

    2017-01-01

    A quantum simulator based on ultracold optically trapped atoms for simulating the physics of atoms and molecules in ultrashort intense laser fields is introduced. The slowing down by about 13 orders of magnitude allows one to watch in slow motion the tunneling and recollision processes that form the heart of attosecond science. The extreme flexibility of the simulator promises a deeper understanding of strong-field physics, especially for many-body systems beyond the reach of classical computers. The quantum simulator can be realized experimentally in a straightforward manner and is shown to recover the ionization characteristics of atoms in the different regimes of laser-matter interaction.

  2. Crosscut report: Exascale Requirements Reviews, March 9–10, 2017 – Tysons Corner, Virginia. An Office of Science review sponsored by: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics, Nuclear Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Hack, James; Riley, Katherine

    The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today’s world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR’s mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today’s tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.

  3. Design and Use Online Platforms to Learn Mathematics and the Use of Them in Simulations of Problems in Applied Sciences

    ERIC Educational Resources Information Center

    Méndez-Fragoso, Ricardo; Villavicencio-Torres, Mirna; Martínez-Moreno, Josué

    2017-01-01

    In this contribution, we show the practical use of the computer to visualise simple computational simulations to show phenomena that occur in everyday life, or require an abstract understanding for being unintuitive phenomena. The relationship of the mathematics to different scientific disciplines motivates us to devise different treatments to…

  4. The Effect of Interactive, Three Dimensional, High Speed Simulations on High School Science Students' Conceptions of the Molecular Structure of Water.

    ERIC Educational Resources Information Center

    Hakerem, Gita; And Others

    The Water and Molecular Networks (WAMNet) Project uses graduate-student-written Reduced Instruction Set Computing (RISC) computer simulations of the molecular structure of water to help high school students learn about the nature of water. This study examined: (1) preconceptions concerning the molecular structure of water common among high…

  5. Using and Evaluating Resampling Simulations in SPSS and Excel.

    ERIC Educational Resources Information Center

    Smith, Brad

    2003-01-01

    Describes and evaluates three computer-assisted simulations used with the Statistical Package for the Social Sciences (SPSS) and Microsoft Excel. The simulations were designed to reinforce and enhance student understanding of sampling distributions, confidence intervals, and significance tests. Reports that evaluations revealed improved student comprehension of…
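
    The same resampling idea can be expressed in a few lines of code; the sketch below (in Python rather than SPSS or Excel, with made-up data) bootstraps the sampling distribution of the mean and a 95% percentile confidence interval.

        # Bootstrap sketch: approximate the sampling distribution of the mean by resampling.
        import numpy as np

        rng = np.random.default_rng(0)
        sample = rng.normal(loc=100, scale=15, size=30)    # illustrative observed sample

        boot_means = np.array([rng.choice(sample, size=sample.size, replace=True).mean()
                               for _ in range(10_000)])
        ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
        print(f"bootstrap 95% CI for the mean: ({ci_low:.1f}, {ci_high:.1f})")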

  6. Toward a first-principles integrated simulation of tokamak edge plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, C S; Klasky, Scott A; Cummings, Julian

    2008-01-01

    Performance of ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD, for study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles.

  7. Chaos in the Classroom.

    ERIC Educational Resources Information Center

    Jackett, Dwane

    1990-01-01

    Described is a science activity which illustrates the principle of uncertainty using a computer simulation of bacterial reproduction. Procedures and results are discussed. Several illustrations of results are provided. The availability of a computer program is noted. (CW)
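
    The activity's program is not specified in the abstract; a standard stand-in for bounded population growth is the logistic map, which reproduces the same point about uncertainty, namely that nearly identical starting populations diverge after a few dozen generations.

        # Logistic map: two populations differing by one part in a million soon diverge.
        r = 3.9                        # growth parameter in the chaotic regime
        x, y = 0.500000, 0.500001
        for generation in range(40):
            x = r * x * (1 - x)
            y = r * y * (1 - y)

        print(f"after 40 generations: {x:.4f} vs {y:.4f}")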

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roy, Santanu; Dang, Liem X.

    In this paper, we present the first computer simulation of methanol exchange dynamics between the first and second solvation shells around different cations and anions. After water, methanol is the most frequently used solvent for ions. Methanol has different structural and dynamical properties than water, so its ion solvation process is different. To this end, we performed molecular dynamics simulations using polarizable potential models to describe methanol-methanol and ion-methanol interactions. In particular, we computed methanol exchange rates by employing the transition state theory, the Impey-Madden-McDonald method, the reactive flux approach, and the Grote-Hynes theory. We observed that methanol exchange occurs at a nanosecond time scale for Na+ and at a picosecond time scale for other ions. We also observed a trend in which, for like charges, the exchange rate is slower for smaller ions because they are more strongly bound to methanol. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
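
    As a rough illustration of the transition-state-theory step (the functional form of the potential of mean force, the reduced mass, and the barrier height below are illustrative assumptions, not the paper's results), a one-dimensional TST estimate of an exchange rate can be computed directly from a PMF:

        # 1-D TST estimate of an exchange rate from a potential of mean force W(r):
        #   k_TST = sqrt(1 / (2*pi*beta*mu)) * exp(-beta*W(r_ts)) / integral_bound exp(-beta*W(r)) dr
        import numpy as np

        kB, T = 1.380649e-23, 298.15           # J/K, K
        beta = 1.0 / (kB * T)
        mu = 30.0 * 1.66054e-27                # reduced mass of an ion-methanol pair (kg), illustrative

        r = np.linspace(2.0e-10, 3.5e-10, 2000)        # separation (m)
        r0, r_ts = 2.4e-10, 3.5e-10                    # well minimum and barrier top (m), illustrative
        barrier = 20e3 / 6.022e23                      # 20 kJ/mol barrier, in J per molecule
        W = barrier * ((r - r0) / (r_ts - r0)) ** 2    # toy quadratic PMF

        prefactor = np.sqrt(1.0 / (2.0 * np.pi * beta * mu))        # mean-speed factor (m/s)
        bound_integral = np.sum(np.exp(-beta * W)) * (r[1] - r[0])  # m
        k_tst = prefactor * np.exp(-beta * barrier) / bound_integral

        print(f"TST estimate of the exchange rate: {k_tst:.2e} s^-1")

    Dynamical corrections such as the reactive flux transmission coefficient or the Grote-Hynes friction treatment would then scale this TST value downward.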

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dang, Liem X.; Vo, Quynh N.; Nilsson, Mikael

    We report one of the first simulations using a classical rate theory approach to predict the mechanism of the exchange process between water and aqueous uranyl ions. Using our water and ion-water polarizable force fields and molecular dynamics techniques, we computed the potentials of mean force for the uranyl ion-water pair as a function of pressure at ambient temperature. Subsequently, these simulated potentials of mean force were used to calculate rate constants using transition rate theory; the time-dependent transmission coefficients were also examined using the reactive flux method and Grote-Hynes treatments of the dynamic response of the solvent. The computed activation volumes using transition rate theory and the corrected rate constants are positive, thus the mechanism of this particular water exchange is a dissociative process. We discuss our rate theory results and compare them with previous studies in which non-polarizable force fields were used. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
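
    The link between the pressure-dependent rate constants and the dissociative assignment is the activation volume, obtained from the slope of ln k versus pressure (dV_act = -R*T * d(ln k)/dP); a positive value, meaning the rate decreases with pressure, indicates a dissociative mechanism. The sketch below fits that slope to hypothetical rate constants purely to show the arithmetic.

        # Activation volume from the pressure dependence of the rate constant.
        # The pressures and rate constants here are hypothetical, not the paper's data.
        import numpy as np

        R, T = 8.314, 298.15                                # J/(mol K), K
        P = np.array([0.1, 50.0, 100.0, 150.0]) * 1e6       # Pa
        k = np.array([2.0e6, 1.8e6, 1.63e6, 1.47e6])        # 1/s

        slope = np.polyfit(P, np.log(k), 1)[0]              # d(ln k)/dP in 1/Pa
        dV_act = -R * T * slope                             # m^3/mol
        print(f"activation volume: {dV_act * 1e6:+.1f} cm^3/mol")   # positive => dissociative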

  10. A View of the Science Education Research Literature: Scientific Discovery Learning with Computer Simulations.

    ERIC Educational Resources Information Center

    Robinson, William R.

    2000-01-01

    Describes a review of research that addresses the effectiveness of simulations in promoting scientific discovery learning and the problems that learners may encounter when using discovery learning. (WRM)

  11. BASIC Simulation Programs; Volumes I and II. Biology, Earth Science, Chemistry.

    ERIC Educational Resources Information Center

    Digital Equipment Corp., Maynard, MA.

    Computer programs which teach concepts and processes related to biology, earth science, and chemistry are presented. The seven biology problems deal with aspects of genetics, evolution and natural selection, gametogenesis, enzymes, photosynthesis, and the transport of material across a membrane. Four earth science problems concern climates, the…

  12. Integrating an Intelligent Tutoring System for TAOs with Second Life

    DTIC Science & Technology

    2010-12-01

    (SL) and interacts with a number of computer-controlled objects that take on the roles of the TAO’s teammates. TAOs rely on the same mechanism to...projects that utilize both game and simulation technology for training. He joined Stottler Henke in the fall of 2000 and holds a Ph.D. in computer science...including implementing tutors in multiuser worlds. He has been at Stottler Henke since 2005 and has an MS in computer science from Stanford University

  13. Expanding capacity and promoting inclusion in introductory computer science: a focus on near-peer mentor preparation and code review

    NASA Astrophysics Data System (ADS)

    Pon-Barry, Heather; Packard, Becky Wai-Ling; St. John, Audrey

    2017-01-01

    A dilemma within computer science departments is developing sustainable ways to expand capacity within introductory computer science courses while remaining committed to inclusive practices. Training near-peer mentors for peer code review is one solution. This paper describes the preparation of near-peer mentors for their role, with a focus on regular, consistent feedback via peer code review and inclusive pedagogy. Introductory computer science students provided consistently high ratings of the peer mentors' knowledge, approachability, and flexibility, and credited peer mentor meetings for their strengthened self-efficacy and understanding. Peer mentors noted the value of videotaped simulations with reflection, discussions of inclusion, and the cohort's weekly practicum for improving practice. Adaptations of peer mentoring for different types of institutions are discussed. Computer science educators, with hopes of improving the recruitment and retention of underrepresented groups, can benefit from expanding their peer support infrastructure and improving the quality of peer mentor preparation.

  14. A Case Study for Comparing the Effectiveness of a Computer Simulation and a Hands-on Activity on Learning Electric Circuits

    ERIC Educational Resources Information Center

    Ekmekci, Adem; Gulacar, Ozcan

    2015-01-01

    Science education reform emphasizes innovative and constructivist views of science teaching and learning that promotes active learning environments, dynamic instructions, and authentic science experiments. Technology-based and hands-on instructional designs are among innovative science teaching and learning methods. Research shows that these two…

  15. The Effect of Simulation-Assisted Laboratory Applications on Pre-Service Teachers' Attitudes towards Science Teaching

    ERIC Educational Resources Information Center

    Ulukök, Seyma; Sari, Ugur

    2016-01-01

    In this study, the effects of computer-assisted laboratory applications on pre-service science teachers' attitudes towards science teaching were investigated, and the opinions of the pre-service teachers about the application were also determined. The study sample consisted of 46 students studying science teaching at a Faculty of Education. The study…

  16. Understanding Islamist political violence through computational social simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
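
    To make the agent-based half of the comparison concrete, the following is a minimal sketch of the general technique only; the variables, weights, and threshold are illustrative assumptions and do not reproduce the authors' model of political violence.

        # Minimal agent-based sketch: agents hold a grievance and a risk aversion, and
        # become "active" when net grievance (plus social influence) exceeds a threshold.
        import random

        class Agent:
            def __init__(self):
                self.grievance = random.random()
                self.risk_aversion = random.random()
                self.active = False

            def update(self, fraction_active, deterrence, threshold=0.2):
                social_pull = 0.3 * fraction_active          # peers acting lowers the bar
                net = self.grievance + social_pull - self.risk_aversion * deterrence
                self.active = net > threshold

        population = [Agent() for _ in range(500)]
        deterrence = 0.8                                      # assumed state-capacity parameter
        for step in range(50):
            frac = sum(a.active for a in population) / len(population)
            for a in population:
                a.update(frac, deterrence)

        frac = sum(a.active for a in population) / len(population)
        print(f"share active after 50 steps: {frac:.2f}")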

  17. A Determination of the Minimum Frequency Requirements for a PATRIOT Battalion UHF Communication System.

    DTIC Science & Technology

    1982-12-01

    Describes a computer program which simulates the PATRIOT battalion UHF communication system, including a detailed description of how the model performs this simulation. Thesis submitted for the Degree of Master of Science by Gregory H. Swanson, Captain, USA (Graduate Computer Science). Contents include: Model Application; Thesis Overview; Previous Studies.

  18. Interactive visualization of Earth and Space Science computations

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise

    1994-01-01

    Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.

  19. Knowledge Discovery from Climate Data using Graph-Based Methods

    NASA Astrophysics Data System (ADS)

    Steinhaeuser, K.

    2012-04-01

    Climate and Earth sciences have recently experienced a rapid transformation from a historically data-poor to a data-rich environment, thus bringing them into the realm of the Fourth Paradigm of scientific discovery - a term coined by the late Jim Gray (Hey et al. 2009), the other three being theory, experimentation and computer simulation. In particular, climate-related observations from remote sensors on satellites and weather radars, in situ sensors and sensor networks, as well as outputs of climate or Earth system models from large-scale simulations, provide terabytes of spatio-temporal data. These massive and information-rich datasets offer a significant opportunity for advancing climate science and our understanding of the global climate system, yet current analysis techniques are not able to fully realize their potential benefits. We describe a class of computational approaches, specifically from the data mining and machine learning domains, which may be novel to the climate science domain and can assist in the analysis process. Computer scientists have developed spatial and spatio-temporal analysis techniques for a number of years now, and many of them may be applicable and/or adaptable to problems in climate science. We describe a large-scale, NSF-funded project aimed at addressing climate science question using computational analysis methods; team members include computer scientists, statisticians, and climate scientists from various backgrounds. One of the major thrusts is in the development of graph-based methods, and several illustrative examples of recent work in this area will be presented.
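
    One common graph-based construction in this literature, sketched below with synthetic data, links grid cells whose anomaly time series are strongly correlated; the threshold and array sizes are illustrative, not the project's actual pipeline.

        # Correlation-network sketch: nodes are grid cells, edges join strongly correlated cells.
        import numpy as np

        rng = np.random.default_rng(1)
        n_cells, n_months = 50, 240
        anomalies = rng.standard_normal((n_cells, n_months))   # stand-in for gridded observations

        corr = np.corrcoef(anomalies)                          # n_cells x n_cells correlations
        threshold = 0.5
        adjacency = (np.abs(corr) > threshold) & ~np.eye(n_cells, dtype=bool)

        degree = adjacency.sum(axis=1)                         # hubs suggest teleconnection-like structure
        print("edges:", int(adjacency.sum() // 2), "max degree:", int(degree.max()))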

  20. Alexander Meets Michotte: A Simulation Tool Based on Pattern Programming and Phenomenology

    ERIC Educational Resources Information Center

    Basawapatna, Ashok

    2016-01-01

    Simulation and modeling activities, a key point of computational thinking, are currently not being integrated into the science classroom. This paper describes a new visual programming tool entitled the Simulation Creation Toolkit. The Simulation Creation Toolkit is a high level pattern-based phenomenological approach to bringing rapid simulation…

  1. Warriors Edge Simulation and Gaming System: The Squad Simulation

    DTIC Science & Technology

    2005-08-01

    Warriors Edge Simulation and Gaming System: The Squad Simulation, by Mark Thomas and Gary Moss. ARL-TR-3564, August 2005. Computational and Information Sciences Directorate, ARL. Report period: 2004–30 September 2004.

  2. Verifying a computational method for predicting extreme ground motion

    USGS Publications Warehouse

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.
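
    The comparison step in such an exercise can be as simple as a normalized misfit between synthetic seismograms from two codes run on the same rupture problem; the waveforms and metric below are illustrative stand-ins, not the SCEC-USGS benchmark definitions.

        # Normalized RMS misfit between two codes' synthetic seismograms (stand-in waveforms).
        import numpy as np

        t = np.linspace(0.0, 10.0, 1001)                          # time (s)
        code_a = np.sin(2 * np.pi * 0.5 * t) * np.exp(-0.2 * t)   # output of code A (toy)
        code_b = code_a + 0.02 * np.random.default_rng(3).standard_normal(t.size)  # code B (toy)

        misfit = np.sqrt(np.mean((code_a - code_b) ** 2)) / np.sqrt(np.mean(code_a ** 2))
        print(f"normalized RMS misfit: {misfit:.3f}")             # small values indicate agreement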

  3. Scientific Discovery through Advanced Computing in Plasma Science

    NASA Astrophysics Data System (ADS)

    Tang, William

    2005-03-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's ``Scientific Discovery through Advanced Computing'' (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.

  4. Architectural Aspects of Grid Computing and its Global Prospects for E-Science Community

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

    2008-05-01

    The paper reviews the architectural aspects of Grid Computing for the e-Science community, supporting scientific research and business/commercial collaboration beyond physical boundaries. Grid Computing provides all the needed facilities: hardware, software, communication interfaces, high-speed internet, safe authentication, and a secure environment for collaboration on research projects around the globe. It provides a very fast compute engine for scientific and engineering research projects and business/commercial applications that are heavily compute intensive and/or require enormous amounts of data. It also makes possible the use of very advanced methodologies, simulation models, expert systems and the wealth of knowledge available around the globe under the umbrella of knowledge sharing. It thus brings one of the dreams of the global village within reach of the e-Science community across the globe.

  5. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors also present Synergia's design principles and its performance on HPC platforms.

  6. 78 FR 20666 - Food and Drug Administration/National Institutes of Health/National Science Foundation Public...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-05

    ... (CDRH) believes that computer modeling and simulation (M&S) has the potential to substantially augment... simulate multiple use conditions and to visualize and display complex processes and data can revolutionize...

  7. Using Physical and Computer Simulations of Collective Behaviour as an Introduction to Modelling Concepts for Applied Biologists

    ERIC Educational Resources Information Center

    Rands, Sean A.

    2012-01-01

    Models are an important tool in science: not only do they act as a convenient device for describing a system or problem, but they also act as a conceptual tool for framing and exploring hypotheses. Models, and in particular computer simulations, are also an important education tool for training scientists, but it is difficult to teach students the…

  8. 2011 Computation Directorate Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L

    2012-04-11

    From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling for missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence. Specifically, ASCI/ASC accelerated the development of simulation capabilities necessary to ensure confidence in the nuclear stockpile, far exceeding what might have been achieved in the absence of a focused initiative. While stockpile stewardship research pushed LLNL scientists to develop new computer codes, better simulation methods, and improved visualization technologies, this work also stimulated the exploration of HPC applications beyond the standard sponsor base. As LLNL advances to a petascale platform and pursues exascale computing (1,000 times faster than Sequoia), ASC will be paramount to achieving predictive simulation and uncertainty quantification. Predictive simulation and quantifying the uncertainty of numerical predictions where little-to-no data exists demand exascale computing and represent an expanding area of scientific research important not only to nuclear weapons, but to nuclear attribution, nuclear reactor design, and understanding global climate issues, among other fields. Aside from these lofty goals and challenges, computing at LLNL is anything but 'business as usual.' International competition in supercomputing is nothing new, but the HPC community is now operating in an expanded, more aggressive climate of global competitiveness. More countries understand how science and technology research and development are inextricably linked to economic prosperity, and they are aggressively pursuing ways to integrate HPC technologies into their native industrial and consumer products.
In the interest of the nation's economic security and the science and technology that underpins it, LLNL is expanding its portfolio and forging new collaborations. We must ensure that HPC remains an asymmetric engine of innovation for the Laboratory and for the U.S. and, in doing so, protect our research and development dynamism and the prosperity it makes possible. One untapped area of opportunity LLNL is pursuing is to help U.S. industry understand how supercomputing can benefit its business. Industrial investment in HPC applications has historically been limited by the prohibitive cost of entry, the inaccessibility of software to run the powerful systems, and the years it takes to grow the expertise to develop codes and run them in an optimal way. LLNL is helping industry better compete in the global marketplace by providing access to some of the world's most powerful computing systems, the tools to run them, and the experts who are adept at using them. Our scientists are collaborating side by side with industrial partners to develop solutions to some of industry's toughest problems. The goal of the Livermore Valley Open Campus High Performance Computing Innovation Center is to allow American industry the opportunity to harness the power of supercomputing by leveraging the scientific and computational expertise at LLNL in order to gain a competitive advantage in the global economy.

  9. Modeling Mendel's Laws on Inheritance in Computational Biology and Medical Sciences

    ERIC Educational Resources Information Center

    Singh, Gurmukh; Siddiqui, Khalid; Singh, Mankiran; Singh, Satpal

    2011-01-01

    The current research article is based on a simple and practical way of employing the computational power of widely available, versatile software MS Excel 2007 to perform interactive computer simulations for undergraduate/graduate students in biology, biochemistry, biophysics, microbiology, medicine in college and university classroom setting. To…

  10. Building Cognition: The Construction of Computational Representations for Scientific Discovery

    ERIC Educational Resources Information Center

    Chandrasekharan, Sanjay; Nersessian, Nancy J.

    2015-01-01

    Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a…

  11. Sustaining and Extending the Open Science Grid: Science Innovation on a PetaScale Nationwide Facility (DE-FC02-06ER41436) SciDAC-2 Closeout Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron; Shank, James; Ernst, Michael

    Under this SciDAC-2 grant the project's goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid), the European Grids for E-sciencE (EGEE), as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.

  12. Introducing Computational Thinking through Hands-on Projects Using R with Applications to Calculus, Probability and Data Analysis

    ERIC Educational Resources Information Center

    Benakli, Nadia; Kostadinov, Boyan; Satyanarayana, Ashwin; Singh, Satyanand

    2017-01-01

    The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students, through hands-on computer experiments. These activities have the potential to empower students to learn, create and invent with technology, and they engage computational thinking through simulations, visualizations and data…

  13. Using Computer Simulations for Promoting Model-Based Reasoning: Epistemological and Educational Dimensions

    ERIC Educational Resources Information Center

    Develaki, Maria

    2017-01-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and…

  14. Army Maneuver Center of Excellence

    DTIC Science & Technology

    2012-10-18

    Strategic partnership agreements across DoD, DARPA, JIEDDO, DHS, FAA, DoE, NSA, NASA, SMDC, and others benefit the Army materiel enterprise. Emerging science areas include neuroscience, network sciences, hierarchical computing, extreme energy science, autonomous systems technology, and meso-scale (grain-level) materials, with improvements in overall Soldier-system performance sought through operational neuroscience and advanced simulation and training technologies.

  15. The advanced role of computational mechanics and visualization in science and technology: analysis of the Germanwings Flight 9525 crash

    NASA Astrophysics Data System (ADS)

    Chen, Goong; Wang, Yi-Ching; Perronnet, Alain; Gu, Cong; Yao, Pengfei; Bin-Mohsin, Bandar; Hajaiej, Hichem; Scully, Marlan O.

    2017-03-01

    Computational mathematics, physics and engineering form a major constituent of modern computational science, which now stands on an equal footing with the established branches of theoretical and experimental sciences. Computational mechanics solves problems in science and engineering based upon mathematical modeling and computing, bypassing the need for expensive and time-consuming laboratory setups and experimental measurements. Furthermore, it allows the numerical simulations of large scale systems, such as the formation of galaxies that could not be done in any earth bound laboratories. This article is written as part of the 21st Century Frontiers Series to illustrate some state-of-the-art computational science. We emphasize how to do numerical modeling and visualization in the study of a contemporary event, the pulverizing crash of the Germanwings Flight 9525 on March 24, 2015, as a showcase. Such numerical modeling and the ensuing simulation of aircraft crashes into land or mountain are complex tasks as they involve both theoretical study and supercomputing of a complex physical system. The most tragic type of crash involves ‘pulverization’ such as the one suffered by this Germanwings flight. Here, we show pulverizing airliner crashes by visualization through video animations from supercomputer applications of the numerical modeling tool LS-DYNA. A sound validation process is challenging but essential for any sophisticated calculations. We achieve this by validation against the experimental data from a crash test done in 1993 of an F4 Phantom II fighter jet into a wall. We have developed a method by hybridizing two primary methods: finite element analysis and smoothed particle hydrodynamics. This hybrid method also enhances visualization by showing a ‘debris cloud’. Based on our supercomputer simulations and the visualization, we point out that prior works on this topic based on ‘hollow interior’ modeling can be quite problematic and, thus, not likely to be correct. We discuss the effects of terrain on pulverization using the information from the recovered flight-data-recorder and show our forensics and assessments of what may have happened during the final moments of the crash. Finally, we point out that our study has potential for being made into real-time flight crash simulators to help the study of crashworthiness and survivability for future aviation safety. Some forward-looking statements are also made.
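
    For readers unfamiliar with the second ingredient of the hybrid method, the defining idea of smoothed particle hydrodynamics is the kernel estimate of a continuum field from particle-carried quantities (the standard textbook formulation, not the specific discretization used in the article):

        A(\mathbf{r}) \approx \sum_j \frac{m_j}{\rho_j}\, A_j\, W(|\mathbf{r}-\mathbf{r}_j|, h),
        \qquad
        \rho_i = \sum_j m_j\, W(|\mathbf{r}_i-\mathbf{r}_j|, h),

    where m_j, \rho_j and A_j are the mass, density and field value carried by particle j and W is a smoothing kernel of width h. Because the material is represented by particles rather than a connected mesh, fragmentation into a 'debris cloud' emerges naturally, which is one reason SPH is often combined with finite element analysis for impact and fragmentation problems.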

  16. Science Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1985

    1985-01-01

    Presents 23 experiments, demonstrations, activities, and computer programs in biology, chemistry, and physics. Topics include lead in petrol, production of organic chemicals, reduction of water, enthalpy, X-ray diffraction model, nuclear magnetic resonance spectroscopy, computer simulation for additive mixing of colors, Archimedes Principle, and…

  17. Learning with STEM Simulations in the Classroom: Findings and Trends from a Meta-Analysis

    ERIC Educational Resources Information Center

    D'Angelo, Cynthia M.; Rutstein, Daisy; Harris, Christopher J.

    2016-01-01

    This article presents a summary of the findings of a systematic review and meta-analysis of the literature on computer-based interactive simulations for K-12 science, technology, engineering, and mathematics (STEM) learning topics. For achievement outcomes, simulations had a moderate to strong effect on student learning. Overall, simulations have…

  18. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.

  19. Ab initio density-functional calculations in materials science: from quasicrystals over microporous catalysts to spintronics.

    PubMed

    Hafner, Jürgen

    2010-09-29

    During the last 20 years computer simulations based on a quantum-mechanical description of the interactions between electrons and atomic nuclei have developed an increasingly important impact on materials science, not only in promoting a deeper understanding of the fundamental physical phenomena, but also enabling the computer-assisted design of materials for future technologies. The backbone of atomic-scale computational materials science is density-functional theory (DFT) which allows us to cast the intractable complexity of electron-electron interactions into the form of an effective single-particle equation determined by the exchange-correlation functional. Progress in DFT-based calculations of the properties of materials and of simulations of processes in materials depends on: (1) the development of improved exchange-correlation functionals and advanced post-DFT methods and their implementation in highly efficient computer codes, (2) the development of methods allowing us to bridge the gaps in the temperature, pressure, time and length scales between the ab initio calculations and real-world experiments and (3) the extension of the functionality of these codes, permitting us to treat additional properties and new processes. In this paper we discuss the current status of techniques for performing quantum-based simulations on materials and present some illustrative examples of applications to complex quasiperiodic alloys, cluster-support interactions in microporous acid catalysts and magnetic nanostructures.
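
    The "effective single-particle equation" referred to here is the Kohn-Sham equation of DFT, which in atomic units reads

        \Bigl[-\tfrac{1}{2}\nabla^{2} + v_{\mathrm{ext}}(\mathbf{r}) + \int \frac{n(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\,d\mathbf{r}' + v_{\mathrm{xc}}[n](\mathbf{r})\Bigr]\psi_{i}(\mathbf{r}) = \varepsilon_{i}\,\psi_{i}(\mathbf{r}),
        \qquad
        n(\mathbf{r}) = \sum_{i}^{\mathrm{occ}} |\psi_{i}(\mathbf{r})|^{2},

    where v_{\mathrm{xc}} = \delta E_{\mathrm{xc}}[n]/\delta n is the exchange-correlation potential; approximations of the kind discussed in the paper enter through the functional E_{\mathrm{xc}}[n].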

  20. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, P.; /Fermilab; Cary, J.

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for design or performance optimization to all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.

  1. SCEC Earthquake System Science Using High Performance Computing

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Archuleta, R.; Beroza, G.; Bielak, J.; Chen, P.; Cui, Y.; Day, S.; Deelman, E.; Graves, R. W.; Minster, J. B.; Olsen, K. B.

    2008-12-01

    The SCEC Community Modeling Environment (SCEC/CME) collaboration performs basic scientific research using high performance computing with the goal of developing a predictive understanding of earthquake processes and seismic hazards in California. SCEC/CME research areas include dynamic rupture modeling, wave propagation modeling, probabilistic seismic hazard analysis (PSHA), and full 3D tomography. SCEC/CME computational capabilities are organized around the development and application of robust, reusable, well-validated simulation systems we call computational platforms. The SCEC earthquake system science research program includes a wide range of numerical modeling efforts and we continue to extend our numerical modeling codes to include more realistic physics and to run at higher and higher resolution. During this year, the SCEC/USGS OpenSHA PSHA computational platform was used to calculate PSHA hazard curves and hazard maps using the new UCERF2.0 ERF and new 2008 attenuation relationships. Three SCEC/CME modeling groups ran 1Hz ShakeOut simulations using different codes and computer systems and carefully compared the results. The DynaShake Platform was used to calculate several dynamic rupture-based source descriptions equivalent in magnitude and final surface slip to the ShakeOut 1.2 kinematic source description. A SCEC/CME modeler produced 10Hz synthetic seismograms for the ShakeOut 1.2 scenario rupture by combining 1Hz deterministic simulation results with 10Hz stochastic seismograms. SCEC/CME modelers ran an ensemble of seven ShakeOut-D simulations to investigate the variability of ground motions produced by dynamic rupture-based source descriptions. The CyberShake Platform was used to calculate more than 15 new probabilistic seismic hazard analysis (PSHA) hazard curves using full 3D waveform modeling and the new UCERF2.0 ERF. The SCEC/CME group has also produced significant computer science results this year. Large-scale SCEC/CME high performance codes were run on NSF TeraGrid sites, including simulations that used the full PSC Big Ben supercomputer (4096 cores) and simulations that ran on more than 10K cores at TACC Ranger. The SCEC/CME group used scientific workflow tools and grid-computing to run more than 1.5 million jobs at NCSA for the CyberShake project. Visualizations produced by a SCEC/CME researcher of the 10Hz ShakeOut 1.2 scenario simulation data were used by USGS in ShakeOut publications and public outreach efforts. OpenSHA was ported onto an NSF supercomputer and was used to produce very high resolution PSHA hazard maps that contained more than 1.6 million hazard curves.
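
    For context, a PSHA hazard curve of the kind produced by OpenSHA and CyberShake gives the mean annual rate at which a ground-motion intensity measure IM exceeds a level x; in its standard textbook form (not the specific SCEC implementation),

        \lambda(IM > x) \;=\; \sum_{i} \nu_{i} \int\!\!\int P(IM > x \mid m, r)\, f_{M_i}(m)\, f_{R_i}(r)\, dm\, dr,

    where the sum runs over earthquake sources i with activity rate \nu_i, f_{M_i} and f_{R_i} are the magnitude and distance distributions supplied by the earthquake rupture forecast (here UCERF2.0), and P(IM > x \mid m, r) comes from an attenuation relationship or, in CyberShake's case, from full 3D waveform simulations.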

  2. Advances in Cross-Cutting Ideas for Computational Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, Esmond; Evans, Katherine J.; Caldwell, Peter

    This report presents results from the DOE-sponsored workshop titled, ``Advancing X-Cutting Ideas for Computational Climate Science Workshop,'' known as AXICCS, held on September 12--13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities that the group felt could advance climate science significantly emerged from discussions at the workshop. These include (1) process-resolving models to provide insight into important processes and features of interest and inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity but also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for enabling breakthrough climate simulation advancements also need the "glue" of outreach and learning across the scientific domains to be successful. The workshop identified several strategies to allow productive, continuous engagement across those who have a broad knowledge of the various angles of the problem. Specific ideas to foster education and tools to make material progress were discussed. Examples include follow-on cross-cutting meetings that enable unstructured discussions of the type this workshop fostered. A concerted effort to recruit undergraduate and graduate students from all relevant domains and provide them with experience, training, and networking across their immediate expertise is needed. This will broaden and expand their exposure to the future needs and solutions, and provide a pipeline of scientists with a diversity of knowledge and know-how. Providing real-world experience with subject matter experts from multiple angles may also motivate the students to attack these problems and even come up with the missing solutions.

  3. Advances in Cross-Cutting Ideas for Computational Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, E.; Evans, K.; Caldwell, P.

    This report presents results from the DOE-sponsored workshop titled, ``Advancing X-Cutting Ideas for Computational Climate Science Workshop,'' known as AXICCS, held on September 12--13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities that the group felt could advance climate science significantly emerged from discussions at the workshop. These include (1) process-resolving models to provide insight into important processes and features of interest and inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity but also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for enabling breakthrough climate simulation advancements also need the "glue" of outreach and learning across the scientific domains to be successful. The workshop identified several strategies to allow productive, continuous engagement across those who have a broad knowledge of the various angles of the problem. Specific ideas to foster education and tools to make material progress were discussed. Examples include follow-on cross-cutting meetings that enable unstructured discussions of the type this workshop fostered. A concerted effort to recruit undergraduate and graduate students from all relevant domains and provide them with experience, training, and networking across their immediate expertise is needed. This will broaden and expand their exposure to the future needs and solutions, and provide a pipeline of scientists with a diversity of knowledge and know-how. Providing real-world experience with subject matter experts from multiple angles may also motivate the students to attack these problems and even come up with the missing solutions.

  4. Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostadin, Damevski

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  5. Employing inquiry-based computer simulations and embedded scientist videos to teach challenging climate change and nature of science concepts

    NASA Astrophysics Data System (ADS)

    Cohen, Edward Charles

    Design based research was utilized to investigate how students use a greenhouse effect simulation in order to derive best learning practices. During this process, students recognized the authentic scientific process involving computer simulations. The simulation used is embedded within an inquiry-based technology-mediated science curriculum known as Web-based Inquiry Science Environment (WISE). For this research, students from a suburban, diverse, middle school setting use the simulations as part of a two week-long class unit on climate change. A pilot study was conducted during phase one of the research that informed phase two, which encompasses the dissertation. During the pilot study, as students worked through the simulation, evidence of shifts in student motivation, understanding of science content, and ideas about the nature of science became present using a combination of student interviews, focus groups, and students' conversations. Outcomes of the pilot study included improvements to the pedagogical approach. Allowing students to do "Extreme Testing" (e.g., making the world as hot or cold as possible) and increasing the time for free exploration of the simulation are improvements made as a result of the findings of the pilot study. In the dissertation (phase two of the research design) these findings were implemented in a new curriculum scaled for 85 new students from the same school during the next school year. The modifications included new components implementing simulations as an assessment tool for all students and embedded modeling tools. All students were asked to build pre and post models, however due to technological constraints these were not an effective tool. A non-video group of 44 students was established and another group of 41 video students had a WISE curriculum which included twelve minutes of scientists' conversational videos referencing explicit aspects on the nature of science, specifically the use of models and simulations in science. The students in the video group had marked improvement compared to the non-video group on questions regarding modeling as a tool for representing objects and processes of science modeling aspects as evident by multiple data sources. The findings from the dissertation have potential impacts on improving Nature of Science (NOS) concepts around modeling by efficiently embedding short authentic scientific videos that can be easily used by many educators. Compared to published assessments by the American Association for the Advancement of Science (AAAS), due to the curriculum interventions both groups scored higher than the average United States middle school student on many NOS and climate content constructs.

  6. Employing Inquiry-Based Computer Simulations and Embedded Scientist Videos To Teach Challenging Climate Change and Nature of Science Concepts

    NASA Astrophysics Data System (ADS)

    Cohen, E.

    2013-12-01

    Design based research was utilized to investigate how students use a greenhouse effect simulation in order to derive best learning practices. During this process, students recognized the authentic scientific process involving computer simulations. The simulation used is embedded within an inquiry-based technology-mediated science curriculum known as Web-based Inquiry Science Environment (WISE). For this research, students from a suburban, diverse, middle school setting use the simulations as part of a two week-long class unit on climate change. A pilot study was conducted during phase one of the research that informed phase two, which encompasses the dissertation. During the pilot study, as students worked through the simulation, evidence of shifts in student motivation, understanding of science content, and ideas about the nature of science became present using a combination of student interviews, focus groups, and students' conversations. Outcomes of the pilot study included improvements to the pedagogical approach. Allowing students to do 'Extreme Testing' (e.g., making the world as hot or cold as possible) and increasing the time for free exploration of the simulation are improvements made as a result of the findings of the pilot study. In the dissertation (phase two of the research design) these findings were implemented in a new curriculum scaled for 85 new students from the same school during the next school year. The modifications included new components implementing simulations as an assessment tool for all students and embedded modeling tools. All students were asked to build pre and post models, however due to technological constraints these were not an effective tool. A non-video group of 44 students was established and another group of 41 video students had a WISE curriculum which included twelve minutes of scientists' conversational videos referencing explicit aspects on the nature of science, specifically the use of models and simulations in science. The students in the video group had marked improvement compared to the non-video group on questions regarding modeling as a tool for representing objects and processes of science modeling aspects as evident by multiple data sources. The findings from the dissertation have potential impacts on improving Nature of Science (NOS) concepts around modeling by efficiently embedding short authentic scientific videos that can be easily used by many educators. Compared to published assessments by the American Association for the Advancement of Science (AAAS), due to the curriculum interventions both groups scored higher than the average United States middle school student on many NOS and climate content constructs.

  7. Teaching Real Science with a Microcomputer.

    ERIC Educational Resources Information Center

    Naiman, Adeline

    1983-01-01

    Discusses various ways science can be taught using microcomputers, including simulations/games which allow large-scale or historic experiments to be replicated on a manageable scale in a brief time. Examples of several computer programs are also presented, including "Experiments in Human Physiology,""Health Awareness…

  8. Educational aspects of molecular simulation

    NASA Astrophysics Data System (ADS)

    Allen, Michael P.

    This article addresses some aspects of teaching simulation methods to undergraduates and graduate students. Simulation is increasingly a cross-disciplinary activity, which means that the students who need to learn about simulation methods may have widely differing backgrounds. Also, they may have a wide range of views on what constitutes an interesting application of simulation methods. Almost always, a successful simulation course includes an element of practical, hands-on activity: a balance always needs to be struck between treating the simulation software as a 'black box', and becoming bogged down in programming issues. With notebook computers becoming widely available, students often wish to take away the programs to run themselves, and access to raw computer power is not the limiting factor that it once was; on the other hand, the software should be portable and, if possible, free. Examples will be drawn from the author's experience in three different contexts. (1) An annual simulation summer school for graduate students, run by the UK CCP5 organization, in which practical sessions are combined with an intensive programme of lectures describing the methodology. (2) A molecular modelling module, given as part of a doctoral training centre in the Life Sciences at Warwick, for students who might not have a first degree in the physical sciences. (3) An undergraduate module in Physics at Warwick, also taken by students from other disciplines, teaching high performance computing, visualization, and scripting in the context of a physical application such as Monte Carlo simulation.
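
    As an example of the kind of compact hands-on exercise such a course can be built around, the sketch below (a standard textbook exercise, not material from the article) runs a Metropolis Monte Carlo simulation of a single particle in a one-dimensional harmonic well and checks the sampled mean-square displacement against the equipartition value kT/k.

        import math
        import random

        def metropolis_harmonic(n_steps=200_000, kT=1.0, spring_k=1.0, step=0.5, seed=0):
            """Metropolis sampling of a 1D harmonic oscillator; returns the sampled <x^2>."""
            rng = random.Random(seed)
            x, sum_x2 = 0.0, 0.0
            energy = 0.5 * spring_k * x * x
            for _ in range(n_steps):
                x_new = x + rng.uniform(-step, step)              # trial move
                e_new = 0.5 * spring_k * x_new * x_new
                if e_new <= energy or rng.random() < math.exp(-(e_new - energy) / kT):
                    x, energy = x_new, e_new                      # accept the move
                sum_x2 += x * x                                   # rejected moves also count
            return sum_x2 / n_steps

        print(f"<x^2> = {metropolis_harmonic():.3f}   (equipartition predicts kT/k = 1.000)")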

  9. Computer Simulation in the Social Sciences/Social Studies.

    ERIC Educational Resources Information Center

    Klassen, Daniel L.

    Computers are beginning to be used more frequently as instructional tools in secondary school social studies. This is especially true of "new social studies" programs; i.e., programs which subordinate mere mastery of factual content to the recognition of and ability to deal with the social imperatives of the future. Computer-assisted…

  10. A Computer-Assisted Instruction in Teaching Abstract Statistics to Public Affairs Undergraduates

    ERIC Educational Resources Information Center

    Ozturk, Ali Osman

    2012-01-01

    This article attempts to demonstrate the applicability of a computer-assisted instruction supported with simulated data in teaching abstract statistical concepts to political science and public affairs students in an introductory research methods course. The software is called the Elaboration Model Computer Exercise (EMCE) in that it takes a great…

  11. Learning the Rules of the Game: The Nature of Game and Classroom Supports When Using a Concept-Integrated Digital Physics Game in the Middle School Science Classroom

    ERIC Educational Resources Information Center

    Stewart, Phillip Michael, Jr.

    2013-01-01

    Games in science education is emerging as a popular topic of scholarly inquiry. The National Research Council recently published a report detailing a research agenda for games and science education entitled "Learning Science Through Computer Games and Simulations" (2011). The report recommends moving beyond typical proof-of-concept…

  12. Increasing complexity with quantum physics.

    PubMed

    Anders, Janet; Wiesner, Karoline

    2011-09-01

    We argue that complex systems science and the rules of quantum physics are intricately related. We discuss a range of quantum phenomena, such as cryptography, computation and quantum phases, and the rules responsible for their complexity. We identify correlations as a central concept connecting quantum information and complex systems science. We present two examples for the power of correlations: using quantum resources to simulate the correlations of a stochastic process and to implement a classically impossible computational task.

  13. Scaling up to address data science challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne R.

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
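
    One concrete instance of the sampling and data reduction methods mentioned here is reservoir sampling, which maintains a fixed-size uniform random sample from a stream too large to hold in memory. The sketch below is a generic illustration, not code from the article.

        import random
        from typing import Iterable, List, TypeVar

        T = TypeVar("T")

        def reservoir_sample(stream: Iterable[T], k: int, seed: int = 0) -> List[T]:
            """Return k items drawn uniformly at random from a stream of unknown length."""
            rng = random.Random(seed)
            reservoir: List[T] = []
            for i, item in enumerate(stream):
                if i < k:
                    reservoir.append(item)            # fill the reservoir first
                else:
                    j = rng.randint(0, i)             # keep item with probability k/(i+1)
                    if j < k:
                        reservoir[j] = item
            return reservoir

        # Example: a uniform sample of 5 values from ten million simulated outputs.
        sample = reservoir_sample((x * x % 97 for x in range(10_000_000)), k=5)
        print(sample)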

  14. Scaling up to address data science challenges

    DOE PAGES

    Wendelberger, Joanne R.

    2017-04-27

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  15. Design Analysis Kit for Optimization and Terascale Applications 6.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-19

    Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.
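
    In the simplest mode described above, the workflow being automated looks roughly like the loop below: vary inputs, run the black-box simulation, collect responses. This is a generic Python illustration, not Dakota's input syntax; the executable name, parameters, and output file are hypothetical stand-ins for a user's model.

        import itertools
        import subprocess

        def run_simulation(thickness: float, load: float) -> float:
            """Hypothetical black-box model: run the code, then parse a single response value."""
            subprocess.run(["./my_model", f"--thickness={thickness}", f"--load={load}"], check=True)
            with open("response.out") as f:          # assumed output file with one number
                return float(f.read())

        # Full-factorial parameter study over two design variables.
        thicknesses = [1.0, 1.5, 2.0]
        loads = [10.0, 20.0]
        results = {}
        for t, p in itertools.product(thicknesses, loads):
            results[(t, p)] = run_simulation(t, p)

        for (t, p), r in sorted(results.items()):
            print(f"thickness={t:4.1f}  load={p:5.1f}  response={r:.4g}")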

  16. Situating Computer Simulation Professional Development: Does It Promote Inquiry-Based Simulation Use?

    ERIC Educational Resources Information Center

    Gonczi, Amanda L.; Maeng, Jennifer L.; Bell, Randy L.; Whitworth, Brooke A.

    2016-01-01

    This mixed-methods study sought to identify professional development implementation variables that may influence participant (a) adoption of simulations, and (b) use for inquiry-based science instruction. Two groups (Cohort 1, N = 52; Cohort 2, N = 104) received different professional development. Cohort 1 was focused on Web site use mechanics.…

  17. Students as Simulation Designers and Developers--Using Computer Simulations for Teaching Boundary Layer Processes.

    ERIC Educational Resources Information Center

    Johnson, Tristan E.; Clayson, Carol Anne

    As technology developments seek to improve learning, researchers, developers, and educators seek to understand how technological properties impact performance. This paper delineates how a traditional science course is enhanced through the use of simulation projects directed by the students themselves as a means to increase their level of knowledge…

  18. Advanced Training Techniques Using Computer Generated Imagery.

    DTIC Science & Technology

    1981-09-15

    Annual technical report for the period 16 May 1980 to 15 July 1981, prepared for the Air Force Office of Scientific Research, Directorate of Life Sciences. The remainder of the indexed excerpt consists of fragmentary reference-list entries: Simulation Management Branch, ATC, Randolph AFB, TX 78148, November 1977; Allbee, K. F. and Semple, C. A., Aircrew Training Devices Life Cycle Cost and Worth ... in Simulator Design and Application, Life Sciences, Inc., Hurst, Texas 76053, AFOSR-TR-77-0965, 30 September 1976; McDonnell Aircraft.

  19. Optimized Materials From First Principles Simulations: Are We There Yet?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galli, G; Gygi, F

    2005-07-26

    In the past thirty years, the use of scientific computing has become pervasive in all disciplines: collection and interpretation of most experimental data is carried out using computers, and physical models in computable form, with various degrees of complexity and sophistication, are utilized in all fields of science. However, full prediction of physical and chemical phenomena based on the basic laws of Nature, using computer simulations, is a revolution still in the making, and it involves some formidable theoretical and computational challenges. We illustrate the progress and successes obtained in recent years in predicting fundamental properties of materials in condensed phases and at the nanoscale, using ab-initio, quantum simulations. We also discuss open issues related to the validation of the approximate, first principles theories used in large scale simulations, and the resulting complex interplay between computation and experiment. Finally, we describe some applications, with focus on nanostructures and liquids, both at ambient and under extreme conditions.

  20. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.

  1. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.

  2. A Generic Simulation Framework for Non-Entangled based Experimental Quantum Cryptography and Communication: Quantum Cryptography and Communication Simulator (QuCCs)

    NASA Astrophysics Data System (ADS)

    Buhari, Abudhahir; Zukarnain, Zuriati Ahmad; Khalid, Roszelinda; Zakir Dato', Wira Jaafar Ahmad

    2016-11-01

    Applications of quantum information science are moving toward ever greater heights for next-generation technology. In the fields of quantum cryptography and quantum computation in particular, the world has already witnessed ground-breaking tangible products and promising results. Quantum cryptography is one of the more mature fields arising from quantum mechanics and is already available on the market, although research continues as it works to match the maturity of digital cryptography. The complexity of quantum cryptography is high because it combines hardware and software. The lack of an effective simulation tool for designing and analyzing quantum cryptography experiments delays progress toward that goal. In this paper, we propose a framework for an effective non-entanglement-based quantum cryptography simulation tool. We apply a hybrid simulation technique, i.e. discrete event, continuous event and system dynamics. We also highlight the limitations of experiments based on a commercial photonic simulation tool. Finally, we discuss ideas for achieving a one-stop simulation package for quantum-based secure key distribution experiments. All the modules of the simulation framework are viewed from the computer science perspective.
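
    As a toy illustration of the kind of non-entanglement-based protocol such a simulator targets, the sketch below simulates BB84 state preparation, measurement in random bases, and key sifting in the absence of noise or eavesdropping. It is an idealized textbook model, not the QuCCs framework itself.

        import random

        def bb84_sift(n_bits: int, seed: int = 1):
            """Idealized BB84: Alice sends random bits in random bases; Bob measures in random bases."""
            rng = random.Random(seed)
            alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
            alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]   # 0 = rectilinear, 1 = diagonal
            bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]

            # With no channel noise, Bob's outcome equals Alice's bit when the bases match,
            # and is random otherwise.
            bob_bits = [a if ab == bb else rng.randint(0, 1)
                        for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

            # Sifting: keep only the positions where the bases agreed (about half of them).
            key_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
            key_bob   = [b for b, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]
            return key_alice, key_bob

        ka, kb = bb84_sift(32)
        print(len(ka), ka == kb)   # roughly 16 sifted bits, identical keys in the ideal case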

  3. To simulate or not to simulate: what are the questions?

    PubMed

    Dudai, Yadin; Evers, Kathinka

    2014-10-22

    Simulation is a powerful method in science and engineering. However, simulation is an umbrella term, and its meaning and goals differ among disciplines. Rapid advances in neuroscience and computing draw increasing attention to large-scale brain simulations. What is the meaning of simulation, and what should the method expect to achieve? We discuss the concept of simulation from an integrated scientific and philosophical vantage point and pinpoint selected issues that are specific to brain simulation.

  4. The Real-World Connection.

    ERIC Educational Resources Information Center

    Estes, Charles R.

    1994-01-01

    Discusses theoretical versus applied science and the use of the scientific method for analysis of social issues. Topics addressed include the use of simulation and modeling; the growth in computer power, including nanotechnology; distributed computing; self-evolving programs; spiritual matters; human engineering, i.e., molding individuals;…

  5. Science. [SITE 2001 Section].

    ERIC Educational Resources Information Center

    Roach, Linda E., Ed.

    This document contains the following papers on science from the SITE (Society for Information Technology & Teacher Education) 2001 conference: (1) "Using a Computer Simulation before Dissection To Help Students Learn Anatomy" (Joseph Paul Akpan and Thomas Andre); (2) "EARTH2CLASS: A Unique Workshop/On-Line/Distance-Learning…

  6. A Queue Simulation Tool for a High Performance Scientific Computing Center

    NASA Technical Reports Server (NTRS)

    Spear, Carrie; McGalliard, James

    2007-01-01

    The NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center provides high performance highly parallel processors, mass storage, and supporting infrastructure to a community of computational Earth and space scientists. Long running (days) and highly parallel (hundreds of CPUs) jobs are common in the workload. NCCS management structures batch queues and allocates resources to optimize system use and prioritize workloads. NCCS technical staff use a locally developed discrete event simulation tool to model the impacts of evolving workloads, potential system upgrades, alternative queue structures and resource allocation policies.
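
    The NCCS tool itself is locally developed and not described in detail here; as a generic illustration of the discrete-event idea, the sketch below models a batch queue in which jobs request CPUs, wait until enough are free, and then run to completion. All parameters and the simple first-fit start policy are assumptions for the example.

    ```python
    import heapq, random

    def simulate_batch_queue(n_jobs=200, total_cpus=1024, mean_arrival=0.5,
                             mean_runtime=6.0, seed=1):
        """Minimal discrete-event sketch of a batch queue on a parallel system.

        Times are in hours. Events are (time, kind, id, job) tuples; jobs request
        a CPU count and a runtime, wait until enough CPUs are free, then run to
        completion. Returns the mean wait time (illustrative only).
        """
        rng = random.Random(seed)
        free = total_cpus
        events, waiting, waits = [], [], []
        arrival = 0.0
        for j in range(n_jobs):                      # generate job arrivals
            arrival += rng.expovariate(1.0 / mean_arrival)
            job = {"id": j, "cpus": rng.choice([16, 64, 256]),
                   "runtime": rng.expovariate(1.0 / mean_runtime),
                   "submit": arrival}
            heapq.heappush(events, (arrival, "submit", j, job))
        while events:
            t, kind, _, job = heapq.heappop(events)
            if kind == "submit":
                waiting.append(job)
            else:                                    # "finish": release CPUs
                free += job["cpus"]
            still_waiting = []
            for job in waiting:                      # start any job that now fits (first-fit)
                if job["cpus"] <= free:
                    free -= job["cpus"]
                    waits.append(t - job["submit"])
                    heapq.heappush(events, (t + job["runtime"], "finish", job["id"], job))
                else:
                    still_waiting.append(job)
            waiting = still_waiting
        return sum(waits) / len(waits)

    print(f"mean wait: {simulate_batch_queue():.2f} hours")
    ```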

  7. First principles calculations of thermal conductivity with out of equilibrium molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Puligheddu, Marcello; Gygi, Francois; Galli, Giulia

    The prediction of the thermal properties of solids and liquids is central to numerous problems in condensed matter physics and materials science, including the study of thermal management of opto-electronic and energy conversion devices. We present a method to compute the thermal conductivity of solids by performing ab initio molecular dynamics under non-equilibrium conditions. Our formulation is based on a generalization of the approach-to-equilibrium technique, using sinusoidal temperature gradients, and it only requires calculations of first principles trajectories and atomic forces. We discuss results and computational requirements for a representative, simple oxide, MgO, and compare with experiments and data obtained with classical potentials. This work was supported by MICCoM as part of the Computational Materials Science Program funded by the U.S. Department of Energy (DOE), Office of Science, Basic Energy Sciences (BES), Materials Sciences and Engineering Division under Grant DOE/BES 5J-30.
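
    To make the approach-to-equilibrium idea concrete: an imposed sinusoidal temperature perturbation of wavevector k relaxes with amplitude proportional to exp(-k^2 * alpha * t), where alpha = kappa / (rho * c_p) is the thermal diffusivity, so fitting the decay yields the conductivity. The sketch below performs that fit on synthetic data; the material constants are rough, assumed values and the ab initio trajectory generation is not represented.

    ```python
    import numpy as np

    # Illustrative fit of a sinusoidal-perturbation (approach-to-equilibrium) decay.
    # The constants below are rough, order-of-magnitude values assumed for MgO;
    # they are not results from the paper.
    rho = 3580.0          # density, kg/m^3
    cp = 920.0            # specific heat, J/(kg K)
    L = 1.0e-8            # supercell length, m
    k = 2.0 * np.pi / L   # wavevector of the imposed temperature profile, 1/m

    # Synthetic amplitude data A(t) = A0 * exp(-k^2 * alpha * t) with a little noise.
    alpha_true = 1.5e-5                  # thermal diffusivity, m^2/s
    t = np.linspace(0.0, 5.0e-13, 50)    # time, s
    rng = np.random.default_rng(0)
    A = 10.0 * np.exp(-k**2 * alpha_true * t) * (1 + 0.02 * rng.standard_normal(t.size))

    # Linear least-squares fit of ln A(t): the slope is -k^2 * alpha.
    slope, _ = np.polyfit(t, np.log(A), 1)
    alpha_fit = -slope / k**2
    kappa = alpha_fit * rho * cp         # thermal conductivity, W/(m K)
    print(f"fitted diffusivity: {alpha_fit:.3e} m^2/s, conductivity: {kappa:.1f} W/(m K)")
    ```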

  8. Biological and Environmental Research Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Biological and Environmental Research, March 28-31, 2016, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arkin, Adam; Bader, David C.; Coffey, Richard

    Understanding the fundamentals of genomic systems and the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER’s computing needs and their importance for solving some of the toughest problems in BER’s portfolio. BER’s impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes onto high-performance computers. See http://exascaleage.org/ber/ for more information.

  9. New Editions for the Apple II of the Chelsea Science Simulations.

    ERIC Educational Resources Information Center

    Pipeline, 1983

    1983-01-01

    Ten computer simulations for the Apple II are described. Subject areas of programs include: population dynamics, plant competition, enzyme kinetics, evolution and natural selection, genetic mapping, ammonia synthesis, reaction kinetics, wave interference/diffraction, satellite orbits, and particle scattering. (JN)

  10. Evolving Storage and Cyber Infrastructure at the NASA Center for Climate Simulation

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen; Duffy, Daniel; Spear, Carrie; Sinno, Scott; Vaughan, Garrison; Bowen, Michael

    2018-01-01

    This talk will describe recent developments at the NASA Center for Climate Simulation, which is funded by NASA's Science Mission Directorate and supports the specialized data storage and computational needs of weather, ocean, and climate researchers, as well as astrophysicists, heliophysicists, and planetary scientists. To meet requirements for higher-resolution, higher-fidelity simulations, the NCCS augments its High Performance Computing (HPC) and storage and retrieval environment. As the petabytes of model and observational data grow, the NCCS is broadening its data services offerings and deploying and expanding virtualization resources for high performance analytics.

  11. Advanced Computation in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Tang, William

    2001-10-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs). A good example is the effective usage of the full power of multi-teraflop MPPs to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.

  12. Fusion Energy Sciences Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Fusion Energy Sciences, January 27-29, 2016, Gaithersburg, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Choong-Seock; Greenwald, Martin; Riley, Katherine

    The additional computing power offered by the planned exascale facilities could be transformational across the spectrum of plasma and fusion research — provided that the new architectures can be efficiently applied to our problem space. The collaboration that will be required to succeed should be viewed as an opportunity to identify and exploit cross-disciplinary synergies. To assess the opportunities and requirements as part of the development of an overall strategy for computing in the exascale era, the Exascale Requirements Review meeting of the Fusion Energy Sciences (FES) community was convened January 27–29, 2016, with participation from a broad range of fusion and plasma scientists, specialists in applied mathematics and computer science, and representatives from the U.S. Department of Energy (DOE) and its major computing facilities. This report is a summary of that meeting and the preparatory activities for it and includes a wealth of detail to support the findings. Technical opportunities, requirements, and challenges are detailed in this report (and in the recent report on the Workshop on Integrated Simulation). Science applications are described, along with mathematical and computational enabling technologies. Also see http://exascaleage.org/fes/ for more information.

  13. A novel medical image data-based multi-physics simulation platform for computational life sciences.

    PubMed

    Neufeld, Esra; Szczerba, Dominik; Chavannes, Nicolas; Kuster, Niels

    2013-04-06

    Simulating and modelling complex biological systems in computational life sciences requires specialized software tools that can perform medical image data-based modelling, jointly visualize the data and computational results, and handle large, complex, realistic and often noisy anatomical models. The required novel solvers must provide the power to model the physics, biology and physiology of living tissue within the full complexity of the human anatomy (e.g. neuronal activity, perfusion and ultrasound propagation). A multi-physics simulation platform satisfying these requirements has been developed for applications including device development and optimization, safety assessment, basic research, and treatment planning. This simulation platform consists of detailed, parametrized anatomical models, a segmentation and meshing tool, a wide range of solvers and optimizers, a framework for the rapid development of specialized and parallelized finite element method solvers, a visualization toolkit-based visualization engine, a Python scripting interface for customized applications, a coupling framework, and more. Core components are cross-platform compatible and use open formats. Several examples of applications are presented: hyperthermia cancer treatment planning, tumour growth modelling, evaluating the magneto-haemodynamic effect as a biomarker and physics-based morphing of anatomical models.

  14. A Visual Tool for Computer Supported Learning: The Robot Motion Planning Example

    ERIC Educational Resources Information Center

    Elnagar, Ashraf; Lulu, Leena

    2007-01-01

    We introduce an effective computer aided learning visual tool (CALVT) to teach graph-based applications. We present the robot motion planning problem as an example of such applications. The proposed tool can be used to simulate and/or further to implement practical systems in different areas of computer science such as graphics, computational…

  15. Mars Science Laboratory Workstation Test Set

    NASA Technical Reports Server (NTRS)

    Henriquez, David A.; Canham, Timothy K.; Chang, Johnny T.; Villaume, Nathaniel

    2009-01-01

    The Mars Science Laboratory-developed Workstation Test Set (WSTS) is a computer program that enables flight software development on virtual MSL avionics. The WSTS is a non-real-time flight avionics simulator that is designed to be completely software-based and to run on a workstation-class Linux PC.

  16. Dynamic Interactions for Network Visualization and Simulation

    DTIC Science & Technology

    2009-03-01

    projects.htm, Site accessed January 5, 2009. 12. John S. Weir, Major, USAF, Mediated User-Simulator Interactive Command with Visualization (MUSIC-V). Master’s...Computing Sciences in Colleges, December 2005). 14. Enrique Campos-Nanez, “nscript user manual,” Department of System Engineering, University of

  17. Interactive Heat Transfer Simulations for Everyone

    ERIC Educational Resources Information Center

    Xie, Charles

    2012-01-01

    Heat transfer is widely taught in secondary Earth science and physics. Researchers have identified many misconceptions related to heat and temperature. These misconceptions primarily stem from hunches developed in everyday life (though the confusions in terminology often worsen them). Interactive computer simulations that visualize thermal energy,…

  18. One of My Favorite Assignments: Automated Teller Machine Simulation.

    ERIC Educational Resources Information Center

    Oberman, Paul S.

    2001-01-01

    Describes an assignment for an introductory computer science class that requires the student to write a software program that simulates an automated teller machine. Highlights include an algorithm for the assignment; sample file contents; language features used; assignment variations; and discussion points. (LRW)
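
    A minimal sketch of such an assignment, assuming a console-driven design with a single hard-coded account (the menu options, PIN, and function name are invented for illustration, not taken from the article):

    ```python
    def atm_session(balance=500.0, pin="1234"):
        """Tiny console ATM simulation: PIN check, then a menu loop."""
        if input("Enter PIN: ") != pin:
            print("Incorrect PIN. Card retained.")
            return balance
        while True:
            choice = input("[B]alance, [D]eposit, [W]ithdraw, [Q]uit: ").strip().upper()
            if choice == "B":
                print(f"Balance: ${balance:.2f}")
            elif choice == "D":
                amount = float(input("Deposit amount: "))
                if amount > 0:
                    balance += amount
            elif choice == "W":
                amount = float(input("Withdraw amount: "))
                if 0 < amount <= balance:
                    balance -= amount
                else:
                    print("Insufficient funds.")
            elif choice == "Q":
                print("Goodbye.")
                return balance
            else:
                print("Unknown option.")

    if __name__ == "__main__":
        atm_session()
    ```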

  19. Computer Science Techniques Applied to Parallel Atomistic Simulation

    NASA Astrophysics Data System (ADS)

    Nakano, Aiichiro

    1998-03-01

    Recent developments in parallel processing technology and multiresolution numerical algorithms have established large-scale molecular dynamics (MD) simulations as a new research mode for studying materials phenomena such as fracture. However, this requires large system sizes and long simulated times. We have developed: i) Space-time multiresolution schemes; ii) fuzzy-clustering approach to hierarchical dynamics; iii) wavelet-based adaptive curvilinear-coordinate load balancing; iv) multilevel preconditioned conjugate gradient method; and v) spacefilling-curve-based data compression for parallel I/O. Using these techniques, million-atom parallel MD simulations are performed for the oxidation dynamics of nanocrystalline Al. The simulations take into account the effect of dynamic charge transfer between Al and O using the electronegativity equalization scheme. The resulting long-range Coulomb interaction is calculated efficiently with the fast multipole method. Results for temperature and charge distributions, residual stresses, bond lengths and bond angles, and diffusivities of Al and O will be presented. The oxidation of nanocrystalline Al is elucidated through immersive visualization in virtual environments. A unique dual-degree education program at Louisiana State University will also be discussed in which students can obtain a Ph.D. in Physics & Astronomy and a M.S. from the Department of Computer Science in five years. This program fosters interdisciplinary research activities for interfacing High Performance Computing and Communications with large-scale atomistic simulations of advanced materials. This work was supported by NSF (CAREER Program), ARO, PRF, and Louisiana LEQSF.
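
    One of the listed ingredients, space-filling-curve ordering, is easy to illustrate: sorting particles by their Morton (Z-order) key keeps spatially nearby atoms nearby in memory, so contiguous index ranges make reasonable per-processor domains and compress well. The sketch below is a generic illustration, not the authors' implementation.

    ```python
    import random

    def morton_key(ix, iy, iz, bits=10):
        """Interleave the bits of integer cell coordinates into a Morton (Z-order) key."""
        key = 0
        for b in range(bits):
            key |= ((ix >> b) & 1) << (3 * b)
            key |= ((iy >> b) & 1) << (3 * b + 1)
            key |= ((iz >> b) & 1) << (3 * b + 2)
        return key

    def order_particles(positions, box=1.0, cells=64):
        """Sort particle positions along a Z-order curve over a cells^3 grid."""
        def key(p):
            ix, iy, iz = (min(int(c / box * cells), cells - 1) for c in p)
            return morton_key(ix, iy, iz)
        return sorted(positions, key=key)

    rng = random.Random(0)
    pts = [(rng.random(), rng.random(), rng.random()) for _ in range(8)]
    for p in order_particles(pts):
        print(tuple(round(c, 3) for c in p))
    ```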

  20. Advances in free-energy-based simulations of protein folding and ligand binding.

    PubMed

    Perez, Alberto; Morrone, Joseph A; Simmerling, Carlos; Dill, Ken A

    2016-02-01

    Free-energy-based simulations are increasingly providing the narratives about the structures, dynamics and biological mechanisms that constitute the fabric of protein science. Here, we review two recent successes. It is becoming practical: first, to fold small proteins with free-energy methods without knowing substructures and second, to compute ligand-protein binding affinities, not just their binding poses. Over the past 40 years, the timescales that can be simulated by atomistic MD are doubling every 1.3 years--which is faster than Moore's law. Thus, these advances are not simply due to the availability of faster computers. Force fields, solvation models and simulation methodology have kept pace with computing advancements, and are now quite good. At the tip of the spear recently are GPU-based computing, improved fast-solvation methods, continued advances in force fields, and conformational sampling methods that harness external information. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. A Survey of Immersive Technology For Maintenance Evaluations

    DTIC Science & Technology

    1998-04-01

    image display system. Based on original work performed at the German National Computer Science and Mathematics Research Institute (GMD), and further...simulations, architectural walk-throughs, medical simulations, general research, entertainment applications and location based entertainment use...simulations. This study was conducted as part of a logistics research and development program Design Evaluation for Personnel, Training, and Human Factors

  2. Fostering Elementary School Students' Understanding of Simple Electricity by Combining Simulation and Laboratory Activities

    ERIC Educational Resources Information Center

    Jaakkola, T.; Nurmi, S.

    2008-01-01

    Computer simulations and laboratory activities have been traditionally treated as substitute or competing methods in science teaching. The aim of this experimental study was to investigate if it would be more beneficial to combine simulation and laboratory activities than to use them separately in teaching the concepts of simple electricity. Based…

  3. Games, Simulations, and Visual Metaphors in Education: Antagonism between Enjoyment and Learning

    ERIC Educational Resources Information Center

    Rieber, Lloyd P.; Noah, David

    2008-01-01

    The purpose of this study was to investigate the influence of game-like activities on adult learning during a computer-based simulation. This research also studied the use of visual metaphors as graphic organizers to help make the underlying science principles explicit without interfering with the interactive nature of the simulation. A total of…

  4. Multimillion Atom Simulations and Visualization of Hypervelocity Impact Damage and Oxidation

    DTIC Science & Technology

    2004-01-01

    MULTIMILLION ATOM SIMULATIONS AND VISUALIZATION OF HYPERVELOCITY IMPACT DAMAGE AND OXIDATION. Priya Vashishta*, Rajiv K. Kalia, and Aiichiro Nakano... Report date: December 2004. Collaboratory for Advanced Computing and Simulations; Department of Materials Science & Engineering, Department of Physics & Astronomy, Department of

  5. StarLogo TNG

    NASA Astrophysics Data System (ADS)

    Klopfer, Eric; Scheintaub, Hal; Huang, Wendy; Wendel, Daniel

    Computational approaches to science are radically altering the nature of scientific investigation. Yet these computer programs and simulations are sparsely used in science education, and when they are used, they are typically “canned” simulations which are black boxes to students. StarLogo The Next Generation (TNG) was developed to make programming of simulations more accessible for students and teachers. StarLogo TNG builds on the StarLogo tradition of agent-based modeling for students and teachers, with the added features of a graphical programming environment and a three-dimensional (3D) world. The graphical programming environment reduces the learning curve of programming, especially syntax. The 3D graphics make for a more immersive and engaging experience for students, including making it easy to design and program their own video games. Another change to StarLogo TNG is a fundamental restructuring of the virtual machine to make it more transparent. As a result of these changes, classroom use of TNG is expanding to new areas. The chapter concludes with a description of field tests conducted in middle and high school science classes.
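
    StarLogo TNG itself is programmed with graphical blocks rather than text, but the agent-based idea it teaches can be sketched in ordinary code: many simple agents follow local rules and aggregate behaviour emerges from their interactions. The toy epidemic model below is invented for illustration and is not a StarLogo program.

    ```python
    import random

    def epidemic_abm(n_agents=200, size=30, steps=100, infect_radius=1, p_infect=0.5, seed=7):
        """Toy agent-based epidemic: agents random-walk on a grid and infect neighbours."""
        rng = random.Random(seed)
        agents = [{"x": rng.randrange(size), "y": rng.randrange(size), "sick": i == 0}
                  for i in range(n_agents)]
        for _ in range(steps):
            for a in agents:                      # every agent takes one random step
                a["x"] = (a["x"] + rng.choice((-1, 0, 1))) % size
                a["y"] = (a["y"] + rng.choice((-1, 0, 1))) % size
            for a in agents:                      # sick agents may infect nearby agents
                if a["sick"]:
                    for b in agents:
                        close = (abs(a["x"] - b["x"]) <= infect_radius and
                                 abs(a["y"] - b["y"]) <= infect_radius)
                        if close and not b["sick"] and rng.random() < p_infect:
                            b["sick"] = True
        return sum(a["sick"] for a in agents)

    print(f"infected after 100 steps: {epidemic_abm()} of 200 agents")
    ```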

  6. FOREWORD: Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology and Mathematics

    NASA Astrophysics Data System (ADS)

    Kaski, K.; Salomaa, M.

    1990-01-01

    These are Proceedings of the Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology, and Mathematics, held August 25-26, 1989, at Lahti (Finland). The Symposium belongs to an annual series of Meetings, the first one of which was arranged in 1987 at Lund (Sweden) and the second one in 1988 at Kolle-Kolle near Copenhagen (Denmark). Although these Symposia have thus far been essentially Nordic events, their international character has increased significantly; the trend is vividly reflected through contributions in the present Topical Issue. The interdisciplinary nature of Computational Science is central to the activity; this fundamental aspect is also responsible, in an essential way, for its rapidly increasing impact. Crucially important to a wide spectrum of superficially disparate fields is the common need for extensive - and often quite demanding - computational modelling. For such theoretical models, no closed-form (analytical) solutions are available or they would be extremely difficult to find; hence one must rather resort to the Art of performing computational investigations. Among the unifying features in the computational research are the methods of simulation employed; methods which frequently are quite closely related with each other even for faculties of science that are quite unrelated. Computer simulation in Natural Sciences is presently apprehended as a discipline on its own right, occupying a broad region somewhere between the experimental and theoretical methods, but also partially overlapping with and complementing them. - Whichever its proper definition may be, the computational approach serves as a novel and an extremely versatile tool with which one can equally well perform "pure" experimental modelling and conduct "computational theory". Computational studies that have earlier been made possible only through supercomputers have opened unexpected, as well as exciting, novel frontiers equally in mathematics (e.g., fractals), physics (fluid-dynamical and quantum-mechanical calculations; extensive numerical simulations of various condensed-matter systems; the development of stellar constellations, even the early Universe), chemistry (quantum-chemical calculations on the structures of new chemical compounds; chemical reactions and reaction dynamics), and biology (various models, for example, in population dynamics). We succeeded in our effort to assemble several internationally recognized researchers of Computational Science to deliver invited talks on a couple of exceptionally beautiful late-summer days in the modern premises of the Adult Education Center at Lahti. Among the plenary speakers, Per Bak described his highly original work on self-organized criticality. David Ceperley discussed pioneering numerical simulations of superfluid helium in which, for the first time, Feynman's path-integral formulation of quantum mechanics has been implemented on a computer. Jim Gunton presented his comprehensive studies of the Cahn-Hilliard equation for the dynamics of ordering in a condensed-matter system far from equilibrium, while Alex Hansen explained those on nonlinear breakdown in disordered materials. Representing the important field of computational chemistry, Bo Jönsson dealt with attractive forces between polyelectrolytes. Kurt Kremer gave an interesting account on computer-simulation studies of complex polymer systems, while Ole Mouritsen reviewed studies of interfacial fluctuations in lipid membranes. 
Pekka Pyykkö introduced his pioneering work which has led to predictions of completely novel chemical species. Annette Zippelius gave an expert introduction to the highly active field of neural networks. It is evident from each of these intriguing plenary contributions that, indeed, the computational approach is a frontier field of science, possibly providing the most versatile research method available today. We also arranged a competition for the best Posters presented at the Symposium; the Prizes were some of the newest books on the beauty of fractals. The First Prize was won by Hanna Viertio, the Second Prize by Miguel Zendejas and the Third Prize was shared by Leo Kärkkäinen and Kari Rummukainen. As for the future of Computational Science, we identify two principal avenues: (a) big science - large centers with ultrafast supercomputers, and (b) small science - active groups utilizing personal minisupercomputers or superworkstations. At present, it appears that the latter already compete extremely favourably in their performance with the massive supercomputers - at least in their throughput and, especially, in tasks where a broad range of diverse software support is not absolutely necessary. In view of this important emergence of "personal supercomputing", we envisage that the role and the development of large computer centers will have to be reviewed critically and modified accordingly. Furthermore, a promise for some radically new approaches to Computational Science could be provided by massively parallel computers; among them, maybe solutions based on ideas of neural computing could be utilized, especially for restricted applications. Therefore, in order not to overlook any important advances within such a forefront field, one should rather choose the strategy of actively following each and every one of these routes. In perspective of the large variety of simultaneous developments, we want to emphasize the importance of Nordic collaboration in sharing expertise and experience in the rapidly progressing research - it ought to be cultivated and could be expanded. Therefore, we think that it is vitally important to continue with and to further promote the kind of Nordic Symposia that have been held at Lund, Kolle-Kolle, and Lahti. We want to thank most cordially the plenary and invited speakers, contributors, students, and in particular the Conference Secretary, Ms Ulla Ahlfors and Dr Milja Mäkelä, who was responsible for the local arrangements. The work that they did served to make this Symposium a scientific success and a useful and pleasant experience for all the well over 100 participants. We also thank the City of Lahti for kindly arranging a refreshing reception at the Town Hall. We wish to express our gratitude to Nordiska Kulturfonden, NORDITA, the Research Institute for Theoretical Physics at the University of Helsinki, the Finnish Ministry of Education and the Academy of Finland for their financial support. March 1990

  7. Modelling the spread of innovation in wild birds.

    PubMed

    Shultz, Thomas R; Montrey, Marcel; Aplin, Lucy M

    2017-06-01

    We apply three plausible algorithms in agent-based computer simulations to recent experiments on social learning in wild birds. Although some of the phenomena are simulated by all three learning algorithms, several manifestations of social conformity bias are simulated by only the approximate majority (AM) algorithm, which has roots in chemistry, molecular biology and theoretical computer science. The simulations generate testable predictions and provide several explanatory insights into the diffusion of innovation through a population. The AM algorithm's success raises the possibility of its usefulness in studying group dynamics more generally, in several different scientific domains. Our differential-equation model matches simulation results and provides mathematical insights into the dynamics of these algorithms. © 2017 The Author(s).
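
    The approximate majority (AM) algorithm referred to here has a standard population-protocol form: agents hold state X, Y, or blank (B); when an X meets a Y the responder is blanked, and a decided agent converts a blank responder to its own state, so the initial majority is usually amplified to consensus. The sketch below simulates that dynamic on random pairwise encounters; it is a generic AM illustration, not the authors' bird-specific model.

    ```python
    import random

    def approximate_majority(n_x=60, n_y=40, n_blank=0, steps=20000, seed=3):
        """Simulate the approximate-majority population protocol.

        States: 'X', 'Y', 'B' (blank/undecided). On each step two random agents
        interact: an X-Y pair blanks the responder, and a decided initiator
        converts a blank responder to its own state.
        """
        rng = random.Random(seed)
        agents = ["X"] * n_x + ["Y"] * n_y + ["B"] * n_blank
        for _ in range(steps):
            i, j = rng.sample(range(len(agents)), 2)   # initiator i, responder j
            a, b = agents[i], agents[j]
            if {a, b} == {"X", "Y"}:
                agents[j] = "B"
            elif a in ("X", "Y") and b == "B":
                agents[j] = a
        return {s: agents.count(s) for s in ("X", "Y", "B")}

    print(approximate_majority())   # the initial 60:40 X majority usually wins
    ```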

  8. Integrating interactive computational modeling in biology curricula.

    PubMed

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
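
    Cell Collective models are rule-based dynamical models built and run through a web interface, so no code is required of students; still, a toy synchronous Boolean network like the one below conveys the kind of dynamics being simulated. The genes and rules are invented for illustration and are not taken from Cell Collective.

    ```python
    def step(state):
        """One synchronous update of a toy three-gene Boolean network (illustrative)."""
        a, b, c = state["A"], state["B"], state["C"]
        return {
            "A": not c,       # C represses A
            "B": a,           # A activates B
            "C": a and b,     # A and B jointly activate C
        }

    state = {"A": True, "B": False, "C": False}
    for t in range(6):
        print(t, {k: int(v) for k, v in state.items()})
        state = step(state)
    ```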

  9. Recent achievements in real-time computational seismology in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, S.; Liang, W.; Huang, B.

    2012-12-01

    Real-time computational seismology is now achievable, but it requires tight coupling between seismic databases and high-performance computing. We have developed a real-time moment tensor monitoring system (RMT) using continuous BATS records and the moment tensor inversion (CMT) technique. A real-time online earthquake simulation service (ROS) is also ready to open to researchers and to public earthquake-science education. By combining RMT with ROS, an earthquake report based on computational seismology can be provided within 5 minutes of an earthquake's occurrence (RMT obtains point-source information in < 120 s; ROS completes a 3D simulation in < 3 minutes). All of these computational results are now posted on the internet in real time. For more information, please visit the real-time computational seismology earthquake report webpage (RCS).

  10. Complex systems and health behavior change: insights from cognitive science.

    PubMed

    Orr, Mark G; Plaut, David C

    2014-05-01

    To provide proof-of-concept that quantum health behavior can be instantiated as a computational model that is informed by cognitive science, the Theory of Reasoned Action, and quantum health behavior theory. We conducted a synthetic review of the intersection of quantum health behavior change and cognitive science. We conducted simulations, using a computational model of quantum health behavior (a constraint satisfaction artificial neural network) and tested whether the model exhibited quantum-like behavior. The model exhibited clear signs of quantum-like behavior. Quantum health behavior can be conceptualized as constraint satisfaction: a mitigation between current behavioral state and the social contexts in which it operates. We outlined implications for moving forward with computational models of both quantum health behavior and health behavior in general.
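
    The model class named here, a constraint satisfaction artificial neural network, can be illustrated with a small Hopfield-style network: units take values of +1 or -1, symmetric weighted links encode constraints, and asynchronous updates settle into a state that satisfies as many constraints as possible. The units and weights below are invented for illustration and are not the authors' model.

    ```python
    import random

    # Toy constraint-satisfaction network: units prefer to agree with positively
    # linked neighbours and to disagree with negatively linked ones.
    weights = {
        ("attitude", "intention"): 1.0,     # positive constraint
        ("norms", "intention"): 1.0,
        ("barriers", "intention"): -1.0,    # negative constraint
    }

    def settle(state, weights, sweeps=20, seed=0):
        """Asynchronously update +/-1 units until the network settles."""
        rng = random.Random(seed)
        units = list(state)
        for _ in range(sweeps):
            rng.shuffle(units)
            for u in units:
                net = sum(w * state[b if a == u else a]
                          for (a, b), w in weights.items() if u in (a, b))
                state[u] = 1 if net >= 0 else -1
        return state

    state = {"attitude": 1, "norms": 1, "barriers": 1, "intention": -1}
    print(settle(state, weights))
    ```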

  11. Petascale Kinetic Simulations in Space Sciences: New Simulations and Data Discovery Techniques and Physics Results

    NASA Astrophysics Data System (ADS)

    Karimabadi, Homa

    2012-03-01

    Recent advances in simulation technology and hardware are enabling breakthrough science where many longstanding problems can now be addressed for the first time. In this talk, we focus on kinetic simulations of the Earth's magnetosphere and magnetic reconnection process which is the key mechanism that breaks the protective shield of the Earth's dipole field, allowing the solar wind to enter the Earth's magnetosphere. This leads to the so-called space weather where storms on the Sun can affect space-borne and ground-based technological systems on Earth. The talk will consist of three parts: (a) overview of a new multi-scale simulation technique where each computational grid is updated based on its own unique timestep, (b) Presentation of a new approach to data analysis that we refer to as Physics Mining which entails combining data mining and computer vision algorithms with scientific visualization to extract physics from the resulting massive data sets. (c) Presentation of several recent discoveries in studies of space plasmas including the role of vortex formation and resulting turbulence in magnetized plasmas.

  12. Report from the MPP Working Group to the NASA Associate Administrator for Space Science and Applications

    NASA Technical Reports Server (NTRS)

    Fischer, James R.; Grosch, Chester; Mcanulty, Michael; Odonnell, John; Storey, Owen

    1987-01-01

    NASA's Office of Space Science and Applications (OSSA) gave a select group of scientists the opportunity to test and implement their computational algorithms on the Massively Parallel Processor (MPP) located at Goddard Space Flight Center, beginning in late 1985. One year later, the Working Group presented its report, which addressed the following: algorithms, programming languages, architecture, programming environments, the way theory relates, and performance measured. The findings point to a number of demonstrated computational techniques for which the MPP architecture is ideally suited. For example, besides executing much faster on the MPP than on conventional computers, systolic VLSI simulation (where distances are short), lattice simulation, neural network simulation, and image problems were found to be easier to program on the MPP's architecture than on a CYBER 205 or even a VAX. The report also makes technical recommendations covering all aspects of MPP use, and recommendations concerning the future of the MPP and machines based on similar architectures, expansion of the Working Group, and study of the role of future parallel processors for space station, EOS, and the Great Observatories era.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, William D; Johansen, Hans; Evans, Katherine J

    We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales are enabling improved accuracy and fidelity in simulation of dynamics and allow more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures, such as many-core processors and GPUs, so that these approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.

  14. Parallel Computing for Brain Simulation.

    PubMed

    Pastur-Romay, L A; Porto-Pazos, A B; Cedron, F; Pazos, A

    2017-01-01

    The human brain is the most complex system in the known universe and therefore one of the greatest mysteries. It provides human beings with extraordinary abilities. However, it is not yet understood how and why most of these abilities are produced. For decades, researchers have been trying to make computers reproduce these abilities, focusing both on understanding the nervous system and on processing data more efficiently than before. Their aim is to make computers process information similarly to the brain. Important technological developments and vast multidisciplinary projects have made it possible to create the first simulation with a number of neurons similar to that of a human brain. This paper presents an up-to-date review of the main research projects that are trying to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital models, analog models and hybrid models. This review includes the current applications of these works, as well as future trends. It is focused on various works that look for advanced progress in Neuroscience and still others which seek new discoveries in Computer Science (neuromorphic hardware, machine learning techniques). Their most outstanding characteristics are summarized, and the latest advances and future plans are presented. In addition, this review points out the importance of considering not only neurons: computational models of the brain should also include glial cells, given the proven importance of astrocytes in information processing. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  15. Computer Series, 60: Bits and Pieces, 23.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1985-01-01

    Describes: (1) an interactive computer simulation for a science fair display of chromatography inks; (2) analytical chemistry programs; (3) microcomputer-assisted drills in organic synthesis; (4) programs for conformation analysis of ethane and butane; (5) MOLPIX--a program for generating and displaying molecular structures; and (6) chemical…

  16. Physics Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1983

    1983-01-01

    Discusses the Rugby clock as a source of project material, use of ZX81 for experimental science, computer dice analog, oil recovery from reservoirs, and computer simulation of Thompson's experiment for determining e/m for an electron. Activities/procedures are provided when applicable. Also presents questions (and answers) related to time-coded…

  17. Bernoulli's Principle: Science as a Human Endeavor

    ERIC Educational Resources Information Center

    McCarthy, Deborah

    2008-01-01

    What do the ideas of Daniel Bernoulli--an 18th-century Swiss mathematician, physicist, natural scientist, and professor--and your students' next landing of the space shuttle via computer simulation have in common? Because of his contribution, referred to in physical science as Bernoulli's principle, modern flight is possible. The mini learning-cycle…

  18. Simulation and Collaborative Learning in Political Science and Sociology Classrooms.

    ERIC Educational Resources Information Center

    Peters, Sandra; Saxon, Deborah

    The program described here used cooperative, content-based computer writing projects to teach Japanese students at an intermediate level of English proficiency enrolled in first-year, English-language courses in political science/environmental issues and sociology/environmental issues in an international college program. The approach was taken to…

  19. NREL Receives Editors' Choice Awards for Supercomputer Research

    Science.gov Websites

    function," Beckham said. "We followed up these molecular simulations with experimental work to Award. The awards recognize outstanding research in computational molecular science and engineering Mechanisms of Cellulose-Active Enzymes Using Molecular Simulation" at the AIChE 2014 Annual Meeting

  20. 2005 White Paper on Institutional Capability Computing Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, B; McCoy, M; Seager, M

    This paper documents the need for a significant increase in the computing infrastructure provided to scientists working in the unclassified domains at Lawrence Livermore National Laboratory (LLNL). This need could be viewed as the next step in a broad strategy outlined in the January 2002 White Paper (UCRL-ID-147449) that bears essentially the same name as this document. Therein we wrote: 'This proposed increase could be viewed as a step in a broader strategy linking hardware evolution to applications development that would take LLNL unclassified computational science to a position of distinction if not preeminence by 2006.' This position of distinction has certainly been achieved. This paper provides a strategy for sustaining this success but will diverge from its 2002 predecessor in that it will: (1) Amplify the scientific and external success LLNL has enjoyed because of the investments made in 2002 (MCR, 11 TF) and 2004 (Thunder, 23 TF). (2) Describe in detail the nature of additional investments that are important to meet both the institutional objectives of advanced capability for breakthrough science and the scientists' clearly stated request for adequate capacity and more rapid access to moderate-sized resources. (3) Put these requirements in the context of an overall strategy for simulation science and external collaboration. While our strategy for Multiprogrammatic and Institutional Computing (M&IC) has worked well, three challenges must be addressed to assure and enhance our position. The first is that while we now have over 50 important classified and unclassified simulation codes available for use by our computational scientists, we find ourselves coping with high demand for access and long queue wait times. This point was driven home in the 2005 Institutional Computing Executive Group (ICEG) 'Report Card' to the Deputy Director for Science and Technology (DDST) Office and Computation Directorate management. The second challenge is related to the balance that should be maintained in the simulation environment. With the advent of Thunder, the institution directed a change in course from past practice. Instead of making Thunder available to the large body of scientists, as was MCR, and effectively using it as a capacity system, the intent was to make it available to perhaps ten projects so that these teams could run very aggressive problems for breakthrough science. This usage model established Thunder as a capability system. The challenge this strategy raises is that the majority of scientists have not seen an improvement in capacity computing resources since MCR, thus creating significant tension in the system. The question then is: 'How do we address the institution's desire to maintain the potential for breakthrough science and also meet the legitimate requests from the ICEG to achieve balance?' Both the capability and the capacity environments must be addressed through this one procurement. The third challenge is to reach out more aggressively to the national science community to encourage access to LLNL resources as part of a strategy for sharpening our science through collaboration. Related to this, LLNL has been unable in the past to provide access for sensitive foreign nationals (SFNs) to the Livermore Computing (LC) unclassified 'yellow' network.
Identifying some mechanism for data sharing between LLNL computational scientists and SFNs would be a first practical step in fostering cooperative, collaborative relationships with an important and growing sector of the American science community.

  1. Proposal for grid computing for nuclear applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Idris, Faridah Mohamad; Ismail, Saaidi; Haris, Mohd Fauzi B.

    2014-02-12

    The use of computer clusters for the computational sciences, including computational physics, is vital because it provides the computing power to crunch large numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has become a necessity. In this paper, we describe how clusters running a specific application can use resources within the grid to speed up the computing process.
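
    Monte Carlo workloads map naturally onto clusters and grids because samples are independent and the partial tallies simply add. The sketch below is a single-machine analogue using Python's multiprocessing (estimating pi from random points); it stands in for, and is not, the grid middleware discussed in the paper.

    ```python
    import random
    from multiprocessing import Pool

    def count_hits(args):
        """Count random points that fall inside the unit quarter-circle."""
        n_samples, seed = args
        rng = random.Random(seed)
        return sum(1 for _ in range(n_samples)
                   if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

    if __name__ == "__main__":
        n_workers, per_worker = 4, 250_000
        with Pool(n_workers) as pool:
            # Each worker handles an independent batch of samples.
            hits = sum(pool.map(count_hits, [(per_worker, s) for s in range(n_workers)]))
        pi_estimate = 4.0 * hits / (n_workers * per_worker)
        print(f"pi ~ {pi_estimate:.4f}")
    ```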

  2. Final Report: A Broad Research Project on the Sciences of Complexity, September 15, 1994 - November 15, 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2000-02-01

    DOE support for a broad research program in the sciences of complexity permitted the Santa Fe Institute to initiate new collaborative research within its integrative core activities as well as to host visitors to participate in research on specific topics that serve as motivation and testing ground for the study of the general principles of complex systems. Results are presented on computational biology, biodiversity and ecosystem research, and advanced computing and simulation.

  3. Effects of Computer-Based Simulations Teaching Approach on Students' Achievement in the Learning of Chemistry among Secondary School Students in Nakuru Sub County, Kenya

    ERIC Educational Resources Information Center

    Mihindo, W. Jane; Wachanga, S.W.; Anditi, Z. O.

    2017-01-01

    Science education should help develop student's interest in science as today's society depends largely on output of science and technology. Chemistry is one of the branches of science. Chemistry education helps to expand the pupil's knowledge of the universe and of his/her position in it. It helps in the appreciation and enjoyment of nature and…

  4. Toward an in-situ analytics and diagnostics framework for earth system models

    NASA Astrophysics Data System (ADS)

    Anantharaj, Valentine; Wolf, Matthew; Rasch, Philip; Klasky, Scott; Williams, Dean; Jacob, Rob; Ma, Po-Lun; Kuo, Kwo-Sen

    2017-04-01

    The development roadmaps for many earth system models (ESMs) aim for a globally cloud-resolving model targeting the pre-exascale and exascale systems of the future. The ESMs will also incorporate more complex physics, chemistry and biology, thereby vastly increasing the fidelity of the information content simulated by the model. We will then be faced with an unprecedented volume of simulation output that would need to be processed and analyzed concurrently in order to derive the valuable scientific results. We are already at this threshold with our current generation of ESMs at higher resolution simulations. Currently, the nominal I/O throughput in the Community Earth System Model (CESM) via the Parallel IO (PIO) library is around 100 MB/s. High-frequency I/O requirements would add roughly 1 GB per simulated hour, translating to roughly 4 minutes of wallclock time per simulated day => 24.33 wallclock hours per simulated model year => 1,752,000 core-hours of charge per simulated model year on the Titan supercomputer at the Oak Ridge Leadership Computing Facility. There is also a pending need for 3X more volume of simulation output. Meanwhile, many ESMs use instrument simulators to run forward models to compare model simulations against satellite and ground-based instruments, such as radars and radiometers. The CFMIP Observation Simulator Package (COSP) is used in CESM as well as the Accelerated Climate Model for Energy (ACME), one of the ESMs specifically targeting current and emerging leadership-class computing platforms. These simulators can be computationally expensive, accounting for as much as 30% of the computational cost. Hence the data are often written to output files that are then used for offline calculations. Again, the I/O bottleneck becomes a limitation. Detection and attribution studies also use large volumes of data for pattern recognition and feature extraction to analyze weather and climate phenomena such as tropical cyclones, atmospheric rivers, blizzards, etc. It is evident that ESMs need an in-situ framework to decouple the diagnostics and analytics from the prognostics and physics computations of the models so that the diagnostic computations can be performed concurrently without limiting model throughput. We are designing a science-driven online analytics framework for earth system models. Our approach is to adopt several data workflow technologies, such as the Adaptable IO System (ADIOS) being developed under the U.S. Exascale Computing Project (ECP), and to integrate these to allow for extreme-performance I/O, in situ workflow integration, and science-driven analytics and visualization, all in an easy-to-use computational framework. This will allow science teams to write data 100-1000 times faster and seamlessly move from post-processing the output for validation and verification purposes to performing these calculations in situ. We can easily envision a near-term future where earth system models like ACME and CESM will have to address not only the challenges of the volume of data but also the velocity of the data. The earth system models of the future in the exascale era, as they incorporate more complex physics at higher resolutions, will be able to analyze more simulation content without having to compromise targeted model throughput.
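
    A quick back-of-the-envelope check of the I/O arithmetic quoted above (GB is treated as 1000 MB to match the quoted figures; the implied core count is an inference, not a number stated in the abstract):

    ```python
    # Reproduce the high-frequency I/O cost estimate quoted above.
    io_rate_mb_s = 100.0                 # nominal PIO throughput, MB/s
    extra_output_gb_per_sim_hour = 1.0   # additional high-frequency output per simulated hour

    seconds_per_sim_day = extra_output_gb_per_sim_hour * 1000 / io_rate_mb_s * 24
    minutes_per_sim_day = seconds_per_sim_day / 60
    hours_per_sim_year = minutes_per_sim_day * 365 / 60
    print(f"{minutes_per_sim_day:.1f} min/sim-day, {hours_per_sim_year:.2f} h/sim-year")

    # 1,752,000 core-hours per simulated year implies roughly this many cores
    # charged while the model sits in I/O (an assumed, derived figure):
    cores = 1_752_000 / hours_per_sim_year
    print(f"~{cores:,.0f} cores charged during the I/O time")
    ```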

  5. Ontological and Epistemological Issues Regarding Climate Models and Computer Experiments

    NASA Astrophysics Data System (ADS)

    Vezer, M. A.

    2010-12-01

    Recent philosophical discussions (Parker 2009; Frigg and Reiss 2009; Winsberg, 2009; Morgon 2002, 2003, 2005; Gula 2002) about the ontology of computer simulation experiments and the epistemology of inferences drawn from them are of particular relevance to climate science as computer modeling and analysis are instrumental in understanding climatic systems. How do computer simulation experiments compare with traditional experiments? Is there an ontological difference between these two methods of inquiry? Are there epistemological considerations that result in one type of inference being more reliable than the other? What are the implications of these questions with respect to climate studies that rely on computer simulation analysis? In this paper, I examine these philosophical questions within the context of climate science, instantiating concerns in the philosophical literature with examples found in analysis of global climate change. I concentrate on Wendy Parker’s (2009) account of computer simulation studies, which offers a treatment of these and other questions relevant to investigations of climate change involving such modelling. Two theses at the center of Parker’s account will be the focus of this paper. The first is that computer simulation experiments ought to be regarded as straightforward material experiments; which is to say, there is no significant ontological difference between computer and traditional experimentation. Parker’s second thesis is that some of the emphasis on the epistemological importance of materiality has been misplaced. I examine both of these claims. First, I inquire as to whether viewing computer and traditional experiments as ontologically similar in the way she does implies that there is no proper distinction between abstract experiments (such as ‘thought experiments’ as well as computer experiments) and traditional ‘concrete’ ones. Second, I examine the notion of materiality (i.e., the material commonality between object and target systems) and some arguments for the claim that materiality entails some inferential advantage to traditional experimentation. I maintain that Parker’s account of the ontology of computer simulations has some interesting though potentially problematic implications regarding conventional distinctions between abstract and concrete methods of inquiry. With respect to her account of materiality, I outline and defend an alternative account, posited by Mary Morgan (2002, 2003, 2005), which holds that ontological similarity between target and object systems confers some epistemological advantage to traditional forms of experimental inquiry.

  6. 1999 NCCS Highlights

    NASA Technical Reports Server (NTRS)

    Bennett, Jerome (Technical Monitor)

    2002-01-01

    The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.

  7. A Hybrid Cloud Computing Service for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Yang, C. P.

    2016-12-01

    Cloud Computing is becoming a norm for providing computing capabilities for advancing Earth sciences, including big Earth data management, processing, analytics, model simulations, and many other aspects. A hybrid spatiotemporal cloud computing service has been built at the George Mason NSF spatiotemporal innovation center to meet these demands. This paper will report on the service, covering several aspects: 1) the hardware includes 500 computing servers and close to 2 PB of storage, as well as connections to XSEDE Jetstream and the Caltech experimental cloud computing environment for sharing resources; 2) the cloud service is geographically distributed across the east coast, west coast, and central region; 3) the cloud includes private clouds managed using OpenStack and Eucalyptus, with DC2 used to bridge these and the public AWS cloud for interoperability and for sharing computing resources when demand surges; 4) the cloud service is used to support the NSF EarthCube program through the ECITE project, and ESIP through the ESIP cloud computing cluster, semantics testbed cluster, and other clusters; 5) the cloud service is also available to the earth science communities to conduct geoscience. A brief introduction on how to use the cloud service is included.

  8. Computational Science: A Research Methodology for the 21st Century

    NASA Astrophysics Data System (ADS)

    Orbach, Raymond L.

    2004-03-01

    Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100 - 1000 opens up new vistas for scientific discovery: long-term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, more fuel-efficient, cleaner cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.

  9. Biological Visualization, Imaging and Simulation(Bio-VIS) at NASA Ames Research Center: Developing New Software and Technology for Astronaut Training and Biology Research in Space

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey

    2003-01-01

    The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high-resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields, from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools range widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments, which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox, where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force-feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of Space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications, including CAD development, procedure design and simulation of human-system interactions in a desktop-sized work volume.

  10. Computer Modeling of the Effects of Atmospheric Conditions on Sound Signatures

    DTIC Science & Technology

    2016-02-01

    ARL-TR-7602, US Army Research Laboratory, February 2016, by Sarah Wagner (Science and Engineering Apprentice). Only fragments of the report text were extracted; the cited references include Attenborough K., "Sound propagation in the atmosphere," in Rossing TD (editor), Springer handbook of…

  11. Addressing the challenges of standalone multi-core simulations in molecular dynamics

    NASA Astrophysics Data System (ADS)

    Ocaya, R. O.; Terblans, J. J.

    2017-07-01

    Computational modelling in material science involves mathematical abstractions of force fields between particles with the aim to postulate, develop and understand materials by simulation. The aggregated pairwise interactions of the material's particles lead to a deduction of its macroscopic behaviours. For practically meaningful macroscopic scales, large amounts of data are generated, leading to vast execution times. Simulation times of hours, days or weeks for moderately sized problems are not uncommon. The reduction of simulation times, improved result accuracy and the associated software and hardware engineering challenges are the main motivations for much of the ongoing research in the computational sciences. This contribution is concerned mainly with simulations that can be done on a "standalone" computer using Message Passing Interface (MPI) parallel code running on hardware platforms with wide-ranging specifications, such as single/multi-processor, multi-core machines, with minimal reconfiguration for upward scaling of computational power. The widely available, documented and standardized MPI library provides this functionality through the MPI_Comm_size(), MPI_Comm_rank() and MPI_Reduce() functions. A survey of the literature shows that relatively little is written with respect to the efficient extraction of the inherent computational power in a cluster. In this work, we discuss the main avenues available to tap into this extra power without compromising computational accuracy. We also present methods to overcome the high inertia encountered in single-node-based computational molecular dynamics. We begin by surveying the current state of the art and discuss what it takes to achieve parallelism, efficiency and enhanced computational accuracy through program threads and message passing interfaces. Several code illustrations are given. The pros and cons of writing raw code as opposed to using heuristic, third-party code are also discussed. The growing trend towards graphical processor units and virtual computing clouds for high-performance computing is also discussed. Finally, we present the comparative results of vacancy formation energy calculations using our own parallelized standalone code called Verlet-Stormer velocity (VSV) operating on 30,000 copper atoms. The code is based on the Sutton-Chen implementation of the Finnis-Sinclair pairwise embedded atom potential. A link to the code is also given.
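
    The three MPI functions named in the abstract are the core of a rank-parallel decomposition. The sketch below is a rough illustration only, not the authors' VSV code: it uses mpi4py (a Python counterpart of the C API cited above) to split a toy pairwise energy sum across ranks and combine the partial results with a reduce; the Lennard-Jones potential, the cutoff and the random coordinates are stand-ins.

```python
# Minimal sketch, assuming mpi4py is installed and a toy Lennard-Jones
# potential stands in for the Sutton-Chen/Finnis-Sinclair potential used
# in the paper. Run with, e.g.: mpiexec -n 4 python pair_energy.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
size = comm.Get_size()   # counterpart of MPI_Comm_size()
rank = comm.Get_rank()   # counterpart of MPI_Comm_rank()

rng = np.random.default_rng(42)          # same seed on every rank
pos = rng.uniform(0.0, 10.0, (1000, 3))  # toy atom positions

def lj(r, eps=1.0, sigma=1.0):
    """Toy Lennard-Jones pair energy."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

# Each rank sums the energy of its own strided slice of the i-loop.
local_e = 0.0
for i in range(rank, len(pos), size):
    d = np.linalg.norm(pos[i + 1:] - pos[i], axis=1)
    local_e += lj(d[d < 2.5]).sum()       # simple distance cutoff

# Counterpart of MPI_Reduce(): combine partial sums on rank 0.
total_e = comm.reduce(local_e, op=MPI.SUM, root=0)
if rank == 0:
    print(f"total pair energy over {size} ranks: {total_e:.4f}")
```

    Each rank walks a strided subset of the outer loop, so the work scales with the number of ranks while the reduction collects the final energy on rank 0.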

  12. Particle-in-cell code library for numerical simulation of the ECR source plasma

    NASA Astrophysics Data System (ADS)

    Shirkov, G.; Alexandrov, V.; Preisendorf, V.; Shevtsov, V.; Filippov, A.; Komissarov, R.; Mironov, V.; Shirkova, E.; Strekalovsky, O.; Tokareva, N.; Tuzikov, A.; Vatulin, V.; Vasina, E.; Fomin, V.; Anisimov, A.; Veselov, R.; Golubev, A.; Grushin, S.; Povyshev, V.; Sadovoi, A.; Donskoi, E.; Nakagawa, T.; Yano, Y.

    2003-05-01

    The project "Numerical simulation and optimization of ion accumulation and production in multicharged ion sources" is funded by the International Science and Technology Center (ISTC). A summary of recent project development and the first version of a computer code library for simulation of electron-cyclotron resonance (ECR) source plasmas based on the particle-in-cell method are presented.

  13. A Summary of Proceedings for the Advanced Deployable Day/Night Simulation Symposium

    DTIC Science & Technology

    2009-07-01

    The Advanced Deployable Day/Night Simulation (ADDNS) Technology Demonstration Project was initiated to design, develop, and deliver transportable visual simulations that jointly provide night-vision and high-resolution daylight capability. Only fragments of the proceedings summary were extracted; contributors named include Dr. Richard Wildes (York University), Mr. Vitaly Zholudev (Department of Computer Science, York University), and Mr. X. Zhu (Neptec Design Group).

  14. Biennial Conference on Chemical Education: Abstracts of Papers (9th, Bozeman, Montana, July 27-August 2, 1986).

    ERIC Educational Resources Information Center

    1986

    This document includes summaries of conference presentations dealing with a wide variety of topics, including chemistry units for the elementary classroom, science experimentation in the secondary school, computer simulations, computer interfaces, videodisc technology, correspondence teaching of general chemistry, interdisciplinary energy courses,…

  15. How Computer-Assisted Teaching in Physics Can Enhance Student Learning

    ERIC Educational Resources Information Center

    Karamustafaoglu, O.

    2012-01-01

    Simple harmonic motion (SHM) is an important topic for physics or science students and has wide applications all over the world. Computer simulations are applications of special interest in physics teaching because they support powerful modeling environments involving physics concepts. This article is aimed to compare the effect of…

  16. First Steps in Computational Systems Biology: A Practical Session in Metabolic Modeling and Simulation

    ERIC Educational Resources Information Center

    Reyes-Palomares, Armando; Sanchez-Jimenez, Francisca; Medina, Miguel Angel

    2009-01-01

    A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever…

  17. Computer Graphics and Physics Teaching.

    ERIC Educational Resources Information Center

    Bork, Alfred M.; Ballard, Richard

    New, more versatile and inexpensive terminals will make computer graphics more feasible in science instruction than before. This paper describes the use of graphics in physics teaching at the University of California at Irvine. Commands and software are detailed in established programs, which include a lunar landing simulation and a program which…

  18. Comparative Analysis of Palm and Wearable Computers for Participatory Simulations

    ERIC Educational Resources Information Center

    Klopfer, Eric; Yoon, Susan; Rivas, Luz

    2004-01-01

    Recent educational computer-based technologies have offered promising lines of research that promote social constructivist learning goals, develop skills required to operate in a knowledge-based economy (Roschelle et al. 2000), and enable more authentic science-like problem-solving. In our research programme, we have been interested in combining…

  19. The impact of supercomputers on experimentation: A view from a national laboratory

    NASA Technical Reports Server (NTRS)

    Peterson, V. L.; Arnold, J. O.

    1985-01-01

    The relative roles of large scale scientific computers and physical experiments in several science and engineering disciplines are discussed. Increasing dependence on computers is shown to be motivated both by the rapid growth in computer speed and memory, which permits accurate numerical simulation of complex physical phenomena, and by the rapid reduction in the cost of performing a calculation, which makes computation an increasingly attractive complement to experimentation. Computer speed and memory requirements are presented for selected areas of such disciplines as fluid dynamics, aerodynamics, aerothermodynamics, chemistry, atmospheric sciences, astronomy, and astrophysics, together with some examples of the complementary nature of computation and experiment. Finally, the impact of the emerging role of computers in the technical disciplines is discussed in terms of both the requirements for experimentation and the attainment of previously inaccessible information on physical processes.

  20. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    NASA Astrophysics Data System (ADS)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations including the Madden-Julian oscillation (MJO), though only as a case-study approach. Thanks to the big leap in computational performance of the K computer, we could greatly increase the number of MJO events covered by numerical simulations, in addition to extending the integration time and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with the relatively coarse operational models currently in use. The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.

  1. Terascale direct numerical simulations of turbulent combustion using S3D

    NASA Astrophysics Data System (ADS)

    Chen, J. H.; Choudhary, A.; de Supinski, B.; DeVries, M.; Hawkes, E. R.; Klasky, S.; Liao, W. K.; Ma, K. L.; Mellor-Crummey, J.; Podhorszki, N.; Sankaran, R.; Shende, S.; Yoo, C. S.

    2009-01-01

    Computational science is paramount to the understanding of underlying processes in internal combustion engines of the future that will utilize non-petroleum-based alternative fuels, including carbon-neutral biofuels, and burn in new combustion regimes that will attain high efficiency while minimizing emissions of particulates and nitrogen oxides. Next-generation engines will likely operate at higher pressures, with greater amounts of dilution, and utilize alternative fuels that exhibit a wide range of chemical and physical properties. Therefore, there is a significant role for high-fidelity simulations, direct numerical simulations (DNS) specifically, designed to capture key turbulence-chemistry interactions in these relatively uncharted combustion regimes and, in particular, able to discriminate the effects of differences in fuel properties. In DNS, all of the relevant turbulence and flame scales are resolved numerically using high-order accurate numerical algorithms. As a consequence, terascale DNS are computationally intensive, require massive amounts of computing power and generate tens of terabytes of data. Recent results from terascale DNS of turbulent flames are presented here, illustrating their role in elucidating flame stabilization mechanisms in a lifted turbulent hydrogen/air jet flame in a hot air coflow, and the flame structure of a fuel-lean turbulent premixed jet flame. Computing at this scale requires close collaborations between computer and combustion scientists to provide optimized scalable algorithms and software for terascale simulations, efficient collective parallel I/O, tools for volume visualization of multiscale, multivariate data, and automation of the combustion workflow. The enabling computer science, applied here to combustion science, is also required in many other terascale physics and engineering simulations. In particular, performance monitoring is used to identify the performance of key kernels in the DNS code, S3D, and especially memory-intensive loops in the code. Through the careful application of loop transformations, data reuse in cache is exploited, thereby reducing memory bandwidth needs and hence improving S3D's nodal performance. To enhance collective parallel I/O in S3D, an MPI-I/O caching design is used to construct a two-stage write-behind method for improving the performance of write-only operations. The simulations generate tens of terabytes of data requiring analysis. Interactive exploration of the simulation data is enabled by multivariate time-varying volume visualization. The visualization highlights spatial and temporal correlations between multiple reactive scalar fields using an intuitive user interface based on parallel coordinates and time histograms. Finally, an automated combustion workflow is designed using Kepler to manage large-scale data movement, data morphing, and archival, and to provide a graphical display of run-time diagnostics.
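
    The abstract mentions a parallel-coordinates view for exposing correlations between multiple reactive scalar fields. The sketch below is not the S3D visualization tool; it only illustrates the parallel-coordinates idea on synthetic data using pandas and matplotlib, with invented column names ("temperature", "OH", "mixture_fraction") standing in for DNS scalars.

```python
# Minimal sketch, assuming pandas and matplotlib; all data are synthetic.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "temperature":      rng.normal(1500, 300, n),
    "OH":               rng.lognormal(-6, 0.5, n),
    "mixture_fraction": rng.uniform(0, 1, n),
})
# Label each sample just to colour the lines in the plot.
df["state"] = np.where(df["temperature"] > 1500, "burning", "non-burning")

# Normalise the numeric columns so the parallel axes share a comparable scale.
for col in ["temperature", "OH", "mixture_fraction"]:
    df[col] = (df[col] - df[col].min()) / (df[col].max() - df[col].min())

parallel_coordinates(df, class_column="state", alpha=0.3)
plt.title("Correlations between reactive scalars (synthetic data)")
plt.show()
```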

  2. Computer Simulations for Lab Experiences in Secondary Physics

    ERIC Educational Resources Information Center

    Murphy, David Shannon

    2012-01-01

    Physical science instruction often involves modeling natural systems, such as electricity, whose particles are invisible to the unaided eye. The effect of these particles' motion is observable, but the particles themselves are not directly observable by humans. Simulations have been developed in physics, chemistry and biology that, under certain…

  3. Design and Performance Frameworks for Constructing Problem-Solving Simulations

    ERIC Educational Resources Information Center

    Stevens, Rons; Palacio-Cayetano, Joycelin

    2003-01-01

    Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks…

  4. Long-time atomistic simulations with the Parallel Replica Dynamics method

    NASA Astrophysics Data System (ADS)

    Perez, Danny

    Molecular Dynamics (MD) -- the numerical integration of atomistic equations of motion -- is a workhorse of computational materials science. Indeed, MD can in principle be used to obtain any thermodynamic or kinetic quantity, without introducing any approximation or assumptions beyond the adequacy of the interaction potential. It is therefore an extremely powerful and flexible tool to study materials with atomistic spatio-temporal resolution. These enviable qualities however come at a steep computational price, hence limiting the system sizes and simulation times that can be achieved in practice. While the size limitation can be efficiently addressed with massively parallel implementations of MD based on spatial decomposition strategies, allowing for the simulation of trillions of atoms, the same approach usually cannot extend the timescales much beyond microseconds. In this article, we discuss an alternative parallel-in-time approach, the Parallel Replica Dynamics (ParRep) method, that aims at addressing the timescale limitation of MD for systems that evolve through rare state-to-state transitions. We review the formal underpinnings of the method and demonstrate that it can provide arbitrarily accurate results for any definition of the states. When an adequate definition of the states is available, ParRep can simulate trajectories with a parallel speedup approaching the number of replicas used. We demonstrate the usefulness of ParRep by presenting different examples of materials simulations where access to long timescales was essential to access the physical regime of interest and discuss practical considerations that must be addressed to carry out these simulations. Work supported by the United States Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences, Materials Sciences and Engineering Division.
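
    To make the speedup argument concrete, the toy sketch below caricatures the ParRep idea with a dummy escape process: each replica advances independently, the first escape defines the transition, and the accumulated simulated time grows with the number of replicas. The per-step escape probability is invented, and the dephasing and decorrelation stages of the real method are omitted for brevity.

```python
# Toy sketch of the Parallel Replica Dynamics idea; the "dynamics" is a
# dummy Markov escape process, not molecular dynamics.
import random

def run_replica(p_escape, rng):
    """Advance one replica until it leaves the current state.
    Returns the number of steps it spent in the state."""
    steps = 0
    while True:
        steps += 1
        if rng.random() < p_escape:
            return steps

def parrep_escape_time(n_replicas, p_escape, seed=0):
    rng = random.Random(seed)
    # All replicas evolve independently; the first to escape "wins".
    steps_per_replica = [run_replica(p_escape, rng) for _ in range(n_replicas)]
    winner_steps = min(steps_per_replica)
    # Accumulated physical time: every replica contributes the steps it ran
    # up to the winning escape (approximated here by winner_steps each).
    accumulated_time = n_replicas * winner_steps
    return winner_steps, accumulated_time

wall, phys = parrep_escape_time(n_replicas=8, p_escape=1e-4)
print(f"wall-clock steps on one replica: {wall}, accumulated time: {phys}")
# The speedup approaches n_replicas when escapes are rare, as the
# abstract notes.
```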

  5. Promoting Systems Thinking through Biology Lessons

    NASA Astrophysics Data System (ADS)

    Riess, Werner; Mischo, Christoph

    2010-04-01

    This study's goal was to analyze various teaching approaches within the context of natural science lessons, especially in biology. The main focus of the paper lies on the effectiveness of different teaching methods in promoting systems thinking in the field of Education for Sustainable Development. The following methods were incorporated into the study: special lessons designed to promote systems thinking, a computer-simulated scenario on the topic "ecosystem forest," and a combination of both special lessons and the computer simulation. These groups were then compared to a control group. A questionnaire was used to assess systems thinking skills of 424 sixth-grade students of secondary schools in Germany. The assessment differentiated between a conceptual understanding (measured as achievement score) and a reflexive justification (measured as justification score) of systems thinking. The following control variables were used: logical thinking, grades in school, memory span, and motivational goal orientation. Based on the pretest-posttest control group design, only those students who received both special instruction and worked with the computer simulation showed a significant increase in their achievement scores. The justification score increased in the computer simulation condition as well as in the combination of computer simulation and lesson condition. The possibilities and limits of promoting various forms of systems thinking by using realistic computer simulations are discussed.

  6. Numerical characteristics of quantum computer simulation

    NASA Astrophysics Data System (ADS)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

    The simulation of quantum circuits is of significant importance for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality, thus the use of modern high-performance parallel computing is essential. As is well known, arbitrary quantum computation in the circuit model can be performed using only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate how the unique properties of quantum systems determine the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while, on the other hand, quantum entanglement leads to the problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual informational graph) and experimental (locality and memory access, scalability and more specific dynamic characteristics) parts. The experimental part was carried out using the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for researching and testing development methods for data-intensive parallel software, and the considered analysis methodology can be successfully used for the improvement of algorithms in quantum information science.
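
    A single-qubit gate acts on only one tensor factor of the 2^n-dimensional state vector, which is the source of the parallelism discussed above. The sketch below is a minimal dense-state-vector illustration in NumPy (a Hadamard gate applied to a chosen qubit); it is an assumption-laden toy, not the implementation analysed in the paper.

```python
# Minimal sketch of applying a 2x2 gate to one qubit of an n-qubit state.
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 `gate` to qubit `target` of a dense state vector."""
    # Reshape the 2**n vector into a rank-n tensor, contract the gate
    # with the target axis, then restore the flat layout.
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    # tensordot moves the contracted axis to the front; move it back.
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                   # start in |000>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = apply_single_qubit_gate(state, H, target=1, n_qubits=n)
print(np.round(state, 3))    # equal amplitude on |000> and |010>
```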

  7. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009), and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current generation Petascale capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.
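
    The coarray technique described above amounts to overlapping halo communication with interior computation. The sketch below shows the same pattern with generic non-blocking MPI calls via mpi4py rather than Fortran 2008 coarrays; the array sizes and the "interior work" are placeholders, and nothing here is taken from the IFS code.

```python
# Sketch of overlapping computation with communication using non-blocking
# messages. Run with: mpiexec -n 2 python overlap.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
peer = 1 - rank if comm.Get_size() == 2 else rank

halo_out = np.full(1000, float(rank))
halo_in = np.empty(1000)

# Start the halo exchange without waiting for it to finish...
reqs = [comm.Isend(halo_out, dest=peer), comm.Irecv(halo_in, source=peer)]

# ...and do interior work that does not depend on the halo meanwhile.
interior = np.sin(np.linspace(0.0, 3.14, 100_000)).sum()

# Only now block until the halo data has actually arrived.
MPI.Request.Waitall(reqs)
boundary = halo_in.sum()
print(f"rank {rank}: interior={interior:.3f}, boundary={boundary:.1f}")
```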

  8. Insight Center | Computational Science | NREL

    Science.gov Websites

    Only fragments of this page were extracted. The Insight Center's visualizations help effectively convey information and illustrate research findings to stakeholders and visitors; the examples mentioned range from …-turbine array simulations to observational data spanning from the nanostructures of biomass pretreatments to the…

  9. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high-performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a Java-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
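
    The automatic creation of virtual clusters described above boils down to programmatic provisioning of cloud instances. As a hedged sketch of that single step (not the authors' SCC toolset), the snippet below uses boto3 to launch a small set of EC2 instances; the AMI ID, key pair name and instance type are hypothetical placeholders.

```python
# Minimal sketch, assuming boto3 with AWS credentials already configured.
import boto3

AMI_ID = "ami-0123456789abcdef0"   # hypothetical scientific VM image
KEY_NAME = "my-keypair"            # hypothetical key pair name

ec2 = boto3.resource("ec2")
nodes = ec2.create_instances(
    ImageId=AMI_ID,
    InstanceType="c5.xlarge",
    MinCount=4,                    # a 4-node virtual cluster
    MaxCount=4,
    KeyName=KEY_NAME,
)
for node in nodes:
    node.wait_until_running()
    node.reload()
    print(node.id, node.private_ip_address)
```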

  10. Basic Energy Sciences Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Basic Energy Sciences, November 3-5, 2015, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Windus, Theresa; Banda, Michael; Devereaux, Thomas

    Computers have revolutionized every aspect of our lives. Yet in science, the most tantalizing applications of computing lie just beyond our reach. The current quest to build an exascale computer with one thousand times the capability of today’s fastest machines (and more than a million times that of a laptop) will take researchers over the next horizon. The field of materials, chemical reactions, and compounds is inherently complex. Imagine millions of new materials with new functionalities waiting to be discovered, while researchers also seek to extend those materials that are known to a dizzying number of new forms. We could translate massive amounts of data from high precision experiments into new understanding through data mining and analysis. We could have at our disposal the ability to predict the properties of these materials, to follow their transformations during reactions on an atom-by-atom basis, and to discover completely new chemical pathways or physical states of matter. Extending these predictions from the nanoscale to the mesoscale, from the ultrafast world of reactions to long-time simulations to predict the lifetime performance of materials, and to the discovery of new materials and processes will have a profound impact on energy technology. In addition, discovery of new materials is vital to move computing beyond Moore’s law. To realize this vision, more than hardware is needed. New algorithms to take advantage of the increase in computing power, new programming paradigms, and new ways of mining massive data sets are needed as well. This report summarizes the opportunities and the requisite computing ecosystem needed to realize the potential before us. In addition to pursuing new and more complete physical models and theoretical frameworks, this review found that the following broadly grouped areas relevant to the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR) would directly affect the Basic Energy Sciences (BES) mission need. Simulation, visualization, and data analysis are crucial for advances in energy science and technology. Revolutionary mathematical, software, and algorithm developments are required in all areas of BES science to take advantage of exascale computing architectures and to meet data analysis, management, and workflow needs. In partnership with ASCR, BES has an emerging and pressing need to develop new and disruptive capabilities in data science. More capable and larger high-performance computing (HPC) and data ecosystems are required to support priority research in BES. Continued success in BES research requires developing the next-generation workforce through education and training and by providing sustained career opportunities.

  11. Enhancing Student Learning in Food Engineering Using Computational Fluid Dynamics Simulations

    ERIC Educational Resources Information Center

    Wong, Shin Y.; Connelly, Robin K.; Hartel, Richard W.

    2010-01-01

    The current generation of students coming into food science and engineering programs is very visually oriented from their early experiences. To increase their interest in learning, new and visually appealing teaching materials need to be developed. Two diverse groups of students may be identified based on their math skills. Food science students…

  12. Opening Comments: SciDAC 2008

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2008-07-01

    Welcome to Seattle and the 2008 SciDAC Conference. This conference, the fourth in the series, is a continuation of the PI meetings we first began under SciDAC-1. I would like to start by thanking the organizing committee, and Rick Stevens in particular, for organizing this year's meeting. This morning I would like to look briefly at SciDAC, to give you a brief history of SciDAC and also look ahead to see where we plan to go over the next few years. I think the best description of SciDAC, at least the simulation part, comes from a quote from Dr Ray Orbach, DOE's Under Secretary for Science and Director of the Office of Science. In an interview that appeared in the SciDAC Review magazine, Dr Orbach said, `SciDAC is unique in the world. There isn't any other program like it anywhere else, and it has the remarkable ability to do science by bringing together physical scientists, mathematicians, applied mathematicians, and computer scientists who recognize that computation is not something you do at the end, but rather it needs to be built into the solution of the very problem that one is addressing'. Of course, that is extended not just to physical scientists, but also to biological scientists. This is a theme of computational science, this partnership among disciplines, which goes all the way back to the early 1980s and Ken Wilson. It's a unique thread within the Department of Energy. SciDAC-1, launched around the turn of the millennium, created a new generation of scientific simulation codes. It advocated building out mathematical and computing system software in support of science and a new collaboratory software environment for data. The original concept for SciDAC-1 had topical centers for the execution of the various science codes, but several corrections and adjustments were needed. The ASCR scientific computing infrastructure was also upgraded, providing the hardware facilities for the program. The computing facility that we had at that time was the big 3 teraflop/s center at NERSC and that had to be shared with the programmatic side supporting research across DOE. At the time, ESnet was just slightly over half a gig per sec of bandwidth; and the science being addressed was accelerator science, climate, chemistry, fusion, astrophysics, materials science, and QCD. We built out the national collaboratories from the ASCR office, and in addition we built Integrated Software Infrastructure Centers (ISICs). Of these, three were in applied mathematics, four in computer science (including a performance evaluation research center), and four were collaboratories or Grid projects having to do with data management. For science, there were remarkable breakthroughs in simulation, such as full 3D laboratory scale flame simulation. There were also significant improvements in application codes - from factors of almost 3 to more than 100 - and code improvement as people began to realize they had to integrate mathematics tools and computer science tools into their codes to take advantage of the parallelism of the day. The SciDAC data-mining tool, Sapphire, received a 2006 R&D 100 award. And the community as a whole worked well together and began building a publication record that was substantial. In 2006, we recompeted the program with similar goals - SciDAC-1 was very successful, and we wanted to continue that success and extend what was happening under SciDAC to the broader science community. We opened up the partnership to all of the Offices of Science and the NSF and the NNSA. 
The goal was to create comprehensive scientific computing software and the infrastructure for the software to enable scientific discovery in the physical, biological, and environmental sciences and take the simulations to an extreme scale, in this case petascale. We would also build out a new generation of data management tools. What we observed during SciDAC-1 was that the data and the data communities - both experimental data from large experimental facilities and observational data, along with simulation data - were expanding at a rate significantly faster than Moore's law. In the past few weeks, the FastBit indexing technology software tool for data analyses and data mining developed under SciDAC's Scientific Data Management project was recognized with an R&D 100 Award, selected by an independent judging panel and editors of R&D Magazine as one of the 100 most technologically significant products introduced into the marketplace over the past year. For SciDAC-2 we had nearly 250 proposals requesting a total of slightly over 1 billion in funding. Of course, we had nowhere near 1 billion. The facilities and the science we ended up with were not significantly different from what we had in SciDAC-1. But we had put in place substantially increased facilities for science. When SciDAC-1 was originally executed with the facilities at NERSC, there was significant impact on the resources at NERSC, because not only did we have an expanding portfolio of programmatic science, but we had the SciDAC projects that also needed to run at NERSC. Suddenly, NERSC was incredibly oversubscribed. With SciDAC-2, we had in place leadership-class computing facilities at Argonne with slightly more than half a petaflop and at Oak Ridge with slightly more than a quarter petaflop with an upgrade planned at the end of this year for a petaflop. And we increased the production computing capacity at NERSC to 104 teraflop/s just so that we would not impact the programmatic research and so that we would have a startup facility for SciDAC. At the end of the summer, NERSC will be at 360 teraflop/s. Both the Oak Ridge system and the principal resource at NERSC are Cray systems; Argonne has a different architecture, an IBM Blue Gene/P. At the same time, ESnet has been built out, and we are on a path where we will have dual rings around the country, from 10 to 40 gigabits per second - a factor of 20 to 80 over what was available during SciDAC-1. The science areas include accelerator science and simulation, astrophysics, climate modeling and simulation, computational biology, fusion science, high-energy physics, petabyte high-energy/ nuclear physics, materials science and chemistry, nuclear physics, QCD, radiation transport, turbulence, and groundwater reactive transport modeling and simulation. They were supported by new enabling technology centers and university-based institutes to develop an educational thread for the SciDAC program. There were four mathematics projects and four computer science projects; and under data management, we see a significant difference in that we are bringing up new visualization projects to support and sustain data-intensive science. When we look at the budgets, we see growth in the budget from just under 60 million for SciDAC-1 to just over 80 for SciDAC-2. Part of the growth is due to bringing in NSF and NNSA as new partners, and some of the growth is due to some program offices increasing their investment in SciDAC, while other program offices are constant or have decreased their investment. 
This is not a reflection of their priorities per se but, rather, a reflection of the budget process and the difficult times in Washington during the past two years. New activities are under way in SciDAC - the annual PI meeting has turned into what I would describe as the premier interdisciplinary computational science meeting, one of the best in the world. Doing interdisciplinary meetings is difficult because people tend to develop a focus for their particular subject area. But this is the fourth in the series; and since the first meeting in San Francisco, these conferences have been remarkably successful. For SciDAC-2 we also created an outreach magazine, SciDAC Review, which highlights scientific discovery as well as high-performance computing. It's been very successful in telling the non-practitioners what SciDAC and computational science are all about. The other new instrument in SciDAC-2 is an outreach center. As we go from computing at the terascale to computing at the petascale, we face the problem of narrowing our research community. The number of people who are `literate' enough to compute at the terascale is more than the number of those who can compute at the petascale. To address this problem, we established the SciDAC Outreach Center to bring people into the fold and educate them as to how we do SciDAC, how the teams are composed, and what it really means to compute at scale. The resources I have mentioned don't come for free. As part of the HECRTF law of 2005, Congress mandated that the Secretary would ensure that leadership-class facilities would be open to everyone across all agencies. So we took Congress at its word, and INCITE is our instrument for making allocations at the leadership-class facilities at Argonne and Oak Ridge, as well as smaller allocations at NERSC. Therefore, the selected proposals are very large projects that are computationally intensive, that compute at scale, and that have a high science impact. An important feature is that INCITE is completely open to anyone - there is no requirement of DOE Office of Science funding, and proposals are rigorously reviewed for both the science and the computational readiness. In 2008, more than 100 proposals were received, requesting about 600 million processor-hours. We allocated just over a quarter of a billion processor-hours. Astrophysics, materials science, lattice gauge theory, and high energy and nuclear physics were the major areas. These were the teams that were computationally ready for the big machines and that had significant science they could identify. In 2009, there will be a significantly increased amount of time to be allocated, over half a billion processor-hours. The deadline is August 11 for new proposals and September 12 for renewals. We anticipate a significant increase in the number of requests this year. We expect you - as successful SciDAC centers, institutes, or partnerships - to compete for and win INCITE program allocation awards. If you have a successful SciDAC proposal, we believe it will make you successful in the INCITE review. We have the expectation that you will be among those most prepared and most ready to use the machines and to compute at scale. Over the past 18 months, we have assembled a team to look across our computational science portfolio and to judge what are the 10 most significant science accomplishments. The ASCR office, as it goes forward with OMB, the new administration, and Congress, will be judged by the science we have accomplished.
All of our proposals - such as for increasing SciDAC, increasing applied mathematics, and so on - are tied to what we have accomplished in science. And so these 10 big accomplishments are key to establishing credibility for new budget requests. Tony Mezzacappa, who chaired the committee, will also give a presentation on the ranking of these top 10, how they got there, and what the science is all about. Here is the list - numbers 2, 5, 6, 7, 9, and 10 are all SciDAC projects.
1. Modeling the Molecular Basis of Parkinson's Disease (Tsigelny)
2. Discovery of the Standing Accretion Shock Instability and Pulsar Birth Mechanism in a Core-Collapse Supernova Evolution and Explosion (Blondin)
3. Prediction and Design of Macromolecular Structures and Functions (Baker)
4. Understanding How Lifted Flame Stabilized in a Hot Coflow (Yoo)
5. New Insights from LCF-enabled Advanced Kinetic Simulations of Global Turbulence in Fusion Systems (Tang)
6. High Transition Temperature Superconductivity: A High-Temperature Superconductive State and a Pairing Mechanism in 2-D Hubbard Model (Scalapino)
7. PETSc: Providing the Solvers for DOE High-Performance Simulations (Smith)
8. Via Lactea II, A Billion Particle Simulation of the Dark Matter Halo of the Milky Way (Madau)
9. Probing the Properties of Water through Advanced Computing (Galli)
10. First Provably Scalable Maxwell Solver Enables Scalable Electromagnetic Simulations (Kovel)
So, what's the future going to look like for us? The office is putting together an initiative with the community, which we call the E3 Initiative. We're looking for a 10-year horizon for what's going to happen. Through the series of town hall meetings, which many of you participated in, we have produced a document on `Transforming Energy, the Environment and Science through simulations at the eXtreme Scale'; it can be found at http://www.science.doe.gov/ascr/ProgramDocuments/TownHall.pdf . We sometimes call it the Exascale initiative. Exascale computing is the gold-ring level of computing that seems just out of reach; but if we work hard and stretch, we just might be able to reach it. We envision that there will be a SciDAC-X, working at the extreme scale, with SciDAC teams that will perform and carry out science in the areas that will have a great societal impact, such as alternative fuels and transportation, combustion, climate, fusion science, high-energy physics, advanced fuel cycles, carbon management, and groundwater. We envision institutes for applied mathematics and computer science that probably will segue into algorithms because, at the extreme scale, we see the distinction between the applied math and the algorithm per se and its implementation in computer science as being inseparable. We envision an INCITE-X with multi-petaflop platforms, perhaps even exaflop computing resources. ESnet will be best in class - our 10-year plan calls for having 400 terabits per second capacity available in dual rings around the country, an enormously fast data communications network for moving large amounts of data. In looking at where we've been and where we are going, we can see that the gigaflops and teraflops era was a regime where we were following Moore's law through advances in clock speed. In the current regime, we're introducing massive parallelism, which I think is exemplified by Intel's announcement of their teraflop chip, where they envision more than a thousand cores on a chip.
But in order to reach exascale, extrapolations talk about machines that require 100 megawatts of power in terms of current architectures. It's clearly going to require novel architectures, things we have perhaps not yet envisioned. It is of course an era of challenge. There will be an unpredictable evolution of hardware if we are to reach the exascale; and there will clearly be multilevel heterogeneous parallelism, including multilevel memory hierarchies. We have no idea right now as to the programming models needed to execute at such an extreme scale. We have been incredibly successful at the petascale - we know that already. Managing data and just getting communications to scale is an enormous challenge. And it's not just the extreme scaling. It's the rapid increase in complexity that represents the challenge. Let me end with a metaphor. In previous meetings we have talked about the road to petascale. Indeed, we have seen in hindsight that it was a road well traveled. But perhaps the road to exascale is not a road at all. Perhaps the metaphor will be akin to scaling the south face of K2. That's clearly not something all of us will be able to do, and probably computing at the exascale is not something all of us will do. But if we achieve that goal, perhaps the words of Emily Dickinson will best summarize where we will be. Perhaps in her words, looking backward and down, you will say:
I climb the `Hill of Science'
I view the landscape o'er;
Such transcendental prospect
I ne'er beheld before!

  13. A Review of Computational Methods in Materials Science: Examples from Shock-Wave and Polymer Physics

    PubMed Central

    Steinhauser, Martin O.; Hiermaier, Stefan

    2009-01-01

    This review discusses several computational methods used on different length and time scales for the simulation of material behavior. First, the importance of physical modeling and its relation to computer simulation on multiple scales is discussed. Then, computational methods used on different scales are briefly reviewed, before we focus on the molecular dynamics (MD) method. Here we survey in a tutorial-like fashion some key issues including several MD optimization techniques. Thereafter, computational examples for the capabilities of numerical simulations in materials research are discussed. We focus on recent results of shock wave simulations of a solid which are based on two different modeling approaches, and we discuss their respective assets and drawbacks with a view to their application across multiple scales. Then, the prospects of computer simulations on the molecular length scale using coarse-grained MD methods are covered by means of examples pertaining to complex topological polymer structures including star-polymers, biomacromolecules such as polyelectrolytes and polymers with intrinsic stiffness. This review ends by highlighting new emerging interdisciplinary applications of computational methods in the field of medical engineering, where the application of concepts of polymer physics and of shock waves to biological systems holds a lot of promise for improving medical applications such as extracorporeal shock wave lithotripsy or tumor treatment. PMID:20054467
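
    Since the review centres on the MD method, a minimal velocity-Verlet integration step may help fix ideas. The sketch below uses a toy two-particle Lennard-Jones system in reduced units; it illustrates only the generic MD core, not the shock-wave or coarse-grained codes discussed in the review.

```python
# Minimal velocity-Verlet sketch for a toy Lennard-Jones system.
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces for a small set of particles."""
    n = len(pos)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            rij = pos[i] - pos[j]
            r = np.linalg.norm(rij)
            sr6 = (sigma / r) ** 6
            fmag = 24.0 * eps * (2.0 * sr6 * sr6 - sr6) / r**2
            f[i] += fmag * rij
            f[j] -= fmag * rij
    return f

def velocity_verlet(pos, vel, dt=1e-3, mass=1.0):
    """Advance positions and velocities by one velocity-Verlet step."""
    f = lj_forces(pos)
    vel_half = vel + 0.5 * dt * f / mass
    pos_new = pos + dt * vel_half
    vel_new = vel_half + 0.5 * dt * lj_forces(pos_new) / mass
    return pos_new, vel_new

pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
vel = np.zeros_like(pos)
for _ in range(100):
    pos, vel = velocity_verlet(pos, vel)
print(pos)
```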

  14. The QuakeSim Project: Numerical Simulations for Active Tectonic Processes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry

    2004-01-01

    In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glotzer, S. C.; Kim, S.; Cummings, P. T.

    This WTEC panel report assesses the international research and development activities in the field of Simulation-Based Engineering and Science (SBE&S). SBE&S involves the use of computer modeling and simulation to solve mathematical formulations of physical models of engineered and natural systems. SBE&S today has reached a level of predictive capability that it now firmly complements the traditional pillars of theory and experimentation/observation. As a result, computer simulation is more pervasive today – and having more impact – than at any other time in human history. Many critical technologies, including those to develop new energy sources and to shift the cost-benefit factors in healthcare, are on the horizon that cannot be understood, developed, or utilized without simulation. A panel of experts reviewed and assessed the state of the art in SBE&S as well as levels of activity overseas in the broad thematic areas of life sciences and medicine, materials, and energy and sustainability; and in the crosscutting issues of next generation hardware and algorithms; software development; engineering simulations; validation, verification, and uncertainty quantification; multiscale modeling and simulation; and SBE&S education. The panel hosted a U.S. baseline workshop, conducted a bibliometric analysis, consulted numerous experts and reports, and visited 59 institutions and companies throughout East Asia and Western Europe to explore the active research projects in those institutions, the computational infrastructure used for the projects, the funding schemes that enable the research, the collaborative interactions among universities, national laboratories, and corporate research centers, and workforce needs and development for SBE&S.

  16. Using evolutionary computations to understand the design and evolution of gene and cell regulatory networks.

    PubMed

    Spirov, Alexander; Holloway, David

    2013-07-15

    This paper surveys modeling approaches for studying the evolution of gene regulatory networks (GRNs). Modeling of the design or 'wiring' of GRNs has become increasingly common in developmental and medical biology, as a means of quantifying gene-gene interactions, the response to perturbations, and the overall dynamic motifs of networks. Drawing from developments in GRN 'design' modeling, a number of groups are now using simulations to study how GRNs evolve, both for comparative genomics and to uncover general principles of evolutionary processes. Such work can generally be termed evolution in silico. Complementary to these biologically-focused approaches, a now well-established field of computer science is Evolutionary Computations (ECs), in which highly efficient optimization techniques are inspired from evolutionary principles. In surveying biological simulation approaches, we discuss the considerations that must be taken with respect to: (a) the precision and completeness of the data (e.g. are the simulations for very close matches to anatomical data, or are they for more general exploration of evolutionary principles); (b) the level of detail to model (we proceed from 'coarse-grained' evolution of simple gene-gene interactions to 'fine-grained' evolution at the DNA sequence level); (c) to what degree is it important to include the genome's cellular context; and (d) the efficiency of computation. With respect to the latter, we argue that developments in computer science EC offer the means to perform more complete simulation searches, and will lead to more comprehensive biological predictions. Copyright © 2013 Elsevier Inc. All rights reserved.
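
    To illustrate what "evolution in silico" of a GRN can look like in its coarsest form, the sketch below evolves a small gene-gene interaction matrix toward an invented target expression pattern with a simple (mu + lambda) strategy. The sigmoidal network model, the target pattern and all parameters are toy assumptions, not taken from the surveyed work.

```python
# Toy sketch of an evolutionary computation applied to a 3-gene network.
import numpy as np

rng = np.random.default_rng(1)
TARGET = np.array([0.9, 0.1, 0.5])          # invented expression levels

def expression(W, steps=20):
    """Iterate a sigmoidal gene-gene interaction model toward a fixed point."""
    x = np.full(3, 0.5)
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-np.clip(W @ x, -30, 30)))
    return x

def fitness(W):
    return -np.sum((expression(W) - TARGET) ** 2)

# Simple (mu + lambda) evolution of the interaction matrix W.
population = [rng.normal(0, 1, (3, 3)) for _ in range(30)]
for generation in range(200):
    parents = sorted(population, key=fitness, reverse=True)[:10]
    # Keep the parents and add two mutated offspring per parent.
    population = parents + [p + rng.normal(0, 0.1, (3, 3))
                            for p in parents for _ in range(2)]

best = max(population, key=fitness)
print("evolved expression:", np.round(expression(best), 2))
```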

  17. Health Science Education

    ERIC Educational Resources Information Center

    Hartsell, Horace C.

    1970-01-01

    Briefly describes several instructional techniques, including computer-aided simulation of the medical encounter, media-based approaches for teaching doctor-patient relationships, and programmed media for teaching decision-making to nursing students. (Author/AA)

  18. PREFACE: New trends in Computer Simulations in Physics and not only in physics

    NASA Astrophysics Data System (ADS)

    Shchur, Lev N.; Krashakov, Serge A.

    2016-02-01

    In this volume we have collected papers based on the presentations given at the International Conference on Computer Simulations in Physics and beyond (CSP2015), held in Moscow, September 6-10, 2015. We hope that this volume will be helpful and scientifically interesting for readers. The Conference was organized for the first time through the joint efforts of the Moscow Institute for Electronics and Mathematics (MIEM) of the National Research University Higher School of Economics, the Landau Institute for Theoretical Physics, and the Science Center in Chernogolovka. The name of the Conference emphasizes the multidisciplinary nature of computational physics. Its methods are applied to a broad range of current research in science and society. The choice of venue was motivated by the multidisciplinary character of the MIEM. It is a former independent university, which has recently become part of the National Research University Higher School of Economics. The Conference Computer Simulations in Physics and beyond (CSP) is planned to be organized biannually. This year's Conference featured 99 presentations, including 21 plenary and invited talks ranging from the analysis of Irish myths using recent methods of statistical physics to computing with the novel quantum computers D-Wave and D-Wave2. This volume covers various areas of computational physics and emerging subjects within the computational physics community. Each section was preceded by invited talks presenting the latest algorithms and methods in computational physics, as well as new scientific results. Both parallel and poster sessions paid special attention to numerical methods, applications and results. For all the abstracts presented at the conference, please follow the link http://csp2015.ac.ru/files/book5x.pdf

  19. Learning technologies and the cyber-science classroom

    NASA Astrophysics Data System (ADS)

    Houlihan, Gerard

    Access to computer and communication technology has long been regarded as `part-and-parcel' of a good education. No educator can afford to ignore the profound impact of learning technologies on the way we teach science, nor fail to acknowledge that information literacy and computing skills will be fundamental to the practice of science in the next millennium. Nevertheless, there is still confusion concerning what technologies educators should employ in teaching science. Furthermore, a lack of knowledge combined with the pressure to be `seen' utilizing technology has led some schools to waste scarce resources in a `grab-bag' attitude towards computers and technology. Such popularized `wish lists' can only drive schools to accumulate expensive equipment for no real learning purpose. In the future, educators will have to reconsider their curriculum and pedagogy with a focus on the learning environment before determining what appropriate computing resources to acquire. This will be fundamental to the capability of science classrooms to engage with cutting-edge issues in science. This session will demonstrate the power of a broad range of learning technologies to enhance science education. The aim is to explore classroom possibilities as well as to provide a basic introduction to technical aspects of various software and hardware applications, including robotics, dataloggers and simulation software.

  20. Computational sciences in the upstream oil and gas industry

    PubMed Central

    Halsey, Thomas C.

    2016-01-01

    The predominant technical challenge of the upstream oil and gas industry has always been the fundamental uncertainty of the subsurface from which it produces hydrocarbon fluids. The subsurface can be detected remotely by, for example, seismic waves, or it can be penetrated and studied in the extremely limited vicinity of wells. Inevitably, a great deal of uncertainty remains. Computational sciences have been a key avenue to reduce and manage this uncertainty. In this review, we discuss at a relatively non-technical level the current state of three applications of computational sciences in the industry. The first of these is seismic imaging, which is currently being revolutionized by the emergence of full wavefield inversion, enabled by algorithmic advances and petascale computing. The second is reservoir simulation, also being advanced through the use of modern highly parallel computing architectures. Finally, we comment on the role of data analytics in the upstream industry. This article is part of the themed issue ‘Energy and the subsurface’. PMID:27597785

  1. Remote control system for high-performance computer simulation of crystal growth by the PFC method

    NASA Astrophysics Data System (ADS)

    Pavlyuk, Evgeny; Starodumov, Ilya; Osipov, Sergei

    2017-04-01

    Modeling of the crystallization process by the phase field crystal (PFC) method is one of the important directions of modern computational materials science. In this paper, the practical side of computer simulation of the crystallization process by the PFC method is investigated. To solve problems using this method, it is necessary to use high-performance computing clusters, data storage systems and other often expensive and complex computer systems. Access to such resources is often limited, unstable and accompanied by various administrative problems. In addition, the variety of software and settings of different computing clusters sometimes does not allow researchers to use a unified program code; the code has to be adapted for each configuration of the computing complex. The practical experience of the authors has shown that the creation of a special control system for computing, with the possibility of remote use, can greatly simplify the implementation of simulations and increase the productivity of scientific research. In the current paper we present the principal idea of such a system and justify its efficiency.
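
    The abstract above argues for a thin remote-control layer that hides cluster-specific submission details from the researcher. As a rough illustration of that idea only (not the authors' system), the sketch below wraps SSH-based job submission and status polling in two functions; the host name, scheduler commands, and script path are hypothetical placeholders.

```python
# Minimal sketch (not the authors' system): wrapping cluster-specific job
# submission behind a small remote-control layer, as the abstract motivates.
# Host name, scheduler commands, and script path are hypothetical placeholders.
import subprocess

def submit_remote_job(host: str, submit_cmd: str, job_script: str) -> str:
    """Submit a job script on a remote cluster over SSH and return the raw
    scheduler output (for example, a line containing the job id)."""
    result = subprocess.run(
        ["ssh", host, f"{submit_cmd} {job_script}"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def job_running(host: str, status_cmd: str, job_id: str) -> bool:
    """Return True while the remote scheduler still lists the job."""
    result = subprocess.run(
        ["ssh", host, f"{status_cmd} {job_id}"],
        capture_output=True, text=True,
    )
    return job_id in result.stdout

# Example call with hypothetical SLURM-style commands and cluster name:
# job_out = submit_remote_job("pfc-cluster", "sbatch", "run_pfc.sh")
```

    Adapting such a sketch to a different cluster would only mean changing the command strings passed in, which is the kind of portability problem the abstract describes.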

  2. Evaluating Implementations of Service Oriented Architecture for Sensor Network via Simulation

    DTIC Science & Technology

    2011-04-01

    Subject: Computer Science. Approved: Boleslaw Szymanski, Thesis Adviser. Rensselaer Polytechnic Institute, Troy, New York, April 2011 (for graduation May 2011). The simulation supports distributed and centralized composition with a type hierarchy and multiple-service, statically located nodes in a 2-dimensional space. The second simulation ...

  3. Computer Simulation of Compression and Energy Release upon Laser Irradiation of Cylindrically Symmetric Target

    NASA Astrophysics Data System (ADS)

    Kuzenov, V. V.

    2017-12-01

    The paper is devoted to the theoretical and computational study of compression and energy release for magneto-inertial plasma confinement. This approach makes it possible to create new high-density plasma sources, apply them in materials science experiments, and use them in promising areas of power engineering.

  4. Theory, Modeling, Software and Hardware Development for Analytical and Computational Materials Science

    NASA Technical Reports Server (NTRS)

    Young, Gerald W.; Clemons, Curtis B.

    2004-01-01

    The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other, as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential to providing the most effective simulations and communication of the simulation results. Toward these ends, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.

  5. CDAC Student Report: Summary of LLNL Internship

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herriman, Jane E.

    Multiple objectives motivated me to apply for an internship at LLNL: I wanted to experience the work environment at a national lab, to learn about research and job opportunities at LLNL in particular, and to gain greater experience with code development, particularly within the realm of high performance computing (HPC). This summer I was selected to participate in LLNL's Computational Chemistry and Material Science Summer Institute (CCMS). CCMS is a 10 week program hosted by the Quantum Simulations group leader, Dr. Eric Schwegler. CCMS connects graduate students to mentors at LLNL involved in similar research and provides weekly seminars on a broad array of topics from within chemistry and materials science. Dr. Xavier Andrade and Dr. Erik Draeger served as my co-mentors over the summer, and Dr. Andrade continues to mentor me now that CCMS has concluded. Dr. Andrade is a member of the Quantum Simulations group within the Physical and Life Sciences at LLNL, and Dr. Draeger leads the HPC group within the Center for Applied Scientific Computing (CASC). The two have worked together to develop Qb@ll, an open-source first principles molecular dynamics code that was the platform for my summer research project.

  6. ASC FY17 Implementation Plan, Rev. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamilton, P. G.

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions.

  7. Fusion Simulation Project Workshop Report

    NASA Astrophysics Data System (ADS)

    Kritz, Arnold; Keyes, David

    2009-03-01

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved 46 physicists, applied mathematicians and computer scientists, from 21 institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a 3-day workshop in May 2007.

  8. Intricacies of modern supercomputing illustrated with recent advances in simulations of strongly correlated electron systems

    NASA Astrophysics Data System (ADS)

    Schulthess, Thomas C.

    2013-03-01

    The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.

  9. Report on Project Action Sheet PP05 task 3 between the U.S. Department of Energy and the Republic of Korea Ministry of Education, Science, and Technology (MEST).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snell, Mark Kamerer

    2013-01-01

    This report documents the results of Task 3 of Project Action Sheet PP05 between the United States Department of Energy (DOE) and the Republic of Korea (ROK) Ministry of Education, Science, and Technology (MEST) for Support with Review of an ROK Risk Evaluation Process. This task was to have Sandia National Laboratories collaborate with the Korea Institute of Nuclear Nonproliferation and Control (KINAC) on several activities concerning how to determine the Probability of Neutralization, PN, and the Probability of System Effectiveness, PE, to include: providing descriptions on how combat simulations are used to determine PN and PE; comparisons of the strengths and weaknesses of two neutralization models (the Neutralization.xls spreadsheet model versus the Brief Adversary Threat-Loss Estimator (BATLE) software); and demonstrating how computer simulations can be used to determine PN. Note that the computer simulation used for the demonstration was the Scenario Toolkit And Generation Environment (STAGE) simulation, which is a stand-alone synthetic tactical simulation sold by Presagis Canada Incorporated. The demonstration is provided in a separate Audio Video Interleave (.AVI) file.

  10. Computer Simulation for Pain Management Education: A Pilot Study.

    PubMed

    Allred, Kelly; Gerardi, Nicole

    2017-10-01

    Effective pain management is an elusive concept in acute care. Inadequate knowledge has been identified as a barrier to providing optimal pain management. This study aimed to determine student perceptions of an interactive computer simulation as a potential method for learning pain management, as a motivator to read and learn more about pain management, preference over traditional lecture, and its potential to change nursing practice. A post-computer simulation survey with a mixed-methods descriptive design was used in this study. The setting was a college of nursing in a large metropolitan university in the Southeast United States, and the participants were a convenience sample of 30 nursing students in a Bachelor of Science nursing program. An interactive computer simulation was developed as a potential alternative method of teaching pain management to nursing students. Increases in educational gain as well as the simulation's potential to change practice were explored. Each participant was asked to complete a survey consisting of 10 standard 5-point Likert scale items and 5 open-ended questions. The survey was used to evaluate the students' perception of the simulation, specifically related to educational benefit, preference compared with traditional teaching methods, and perceived potential to change nursing practice. Data provided descriptive statistics for initial evaluation of the computer simulation. The responses on the survey suggest nursing students perceive the computer simulation to be entertaining, fun, educational, occasionally preferred over regular lecture, and with potential to change practice. Preliminary data support the use of computer simulation in educating nursing students about pain management. Copyright © 2017 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  11. Mathematical modeling based on ordinary differential equations: A promising approach to vaccinology

    PubMed Central

    Bonin, Carla Rezende Barbosa; Fernandes, Guilherme Cortes; dos Santos, Rodrigo Weber; Lobosco, Marcelo

    2017-01-01

    New contributions that aim to accelerate the development or to improve the efficacy and safety of vaccines arise from many different areas of research and technology. One of these areas is computational science, which traditionally participates in the initial steps, such as the pre-screening of active substances that have the potential to become a vaccine antigen. In this work, we present another promising way to use computational science in vaccinology: mathematical and computational models of important cell and protein dynamics of the immune system. A system of Ordinary Differential Equations represents different immune system populations, such as B cells and T cells, antigen presenting cells and antibodies. In this way, it is possible to simulate, in silico, the immune response to vaccines under development or under study. Distinct scenarios can be simulated by varying parameters of the mathematical model. As a proof of concept, we developed a model of the immune response to vaccination against yellow fever. Our simulations have shown consistent results when compared with experimental data available in the literature. The model is generic enough to represent the action of other diseases or vaccines in the human immune system, such as dengue and Zika virus. PMID:28027002
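
    As a rough, hedged illustration of the ODE-based approach described above (not the authors' yellow fever model), the sketch below integrates a toy three-population system of antigen, B cells, and antibodies with SciPy; all rate constants and initial conditions are illustrative assumptions.

```python
# Minimal sketch (not the authors' model): a toy ODE system coupling vaccine
# antigen V, B cells B, and antibodies A, integrated with SciPy. All rate
# constants below are illustrative assumptions, not fitted values.
import numpy as np
from scipy.integrate import solve_ivp

def immune_rhs(t, y, c_clear, k_stim, d_B, p_A, d_A):
    V, B, A = y
    dV = -c_clear * V - 0.01 * A * V        # antigen clearance, faster with antibodies
    dB = k_stim * V * B - d_B * (B - 1.0)   # antigen-stimulated B-cell expansion
    dA = p_A * B - d_A * A                  # antibody production and decay
    return [dV, dB, dA]

y0 = [1.0, 1.0, 0.0]                        # initial antigen, B cells, antibodies
params = (0.1, 0.5, 0.05, 0.2, 0.02)
sol = solve_ivp(immune_rhs, (0.0, 60.0), y0, args=params,
                t_eval=np.linspace(0.0, 60.0, 200))
print(sol.y[:, -1])                         # population levels at day 60
```

    Varying the entries of `params` corresponds to the "distinct scenarios" mentioned in the abstract; a fitted model would of course use more populations and calibrated rates.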

  12. Mathematical modeling based on ordinary differential equations: A promising approach to vaccinology.

    PubMed

    Bonin, Carla Rezende Barbosa; Fernandes, Guilherme Cortes; Dos Santos, Rodrigo Weber; Lobosco, Marcelo

    2017-02-01

    New contributions that aim to accelerate the development or to improve the efficacy and safety of vaccines arise from many different areas of research and technology. One of these areas is computational science, which traditionally participates in the initial steps, such as the pre-screening of active substances that have the potential to become a vaccine antigen. In this work, we present another promising way to use computational science in vaccinology: mathematical and computational models of important cell and protein dynamics of the immune system. A system of Ordinary Differential Equations represents different immune system populations, such as B cells and T cells, antigen presenting cells and antibodies. In this way, it is possible to simulate, in silico, the immune response to vaccines under development or under study. Distinct scenarios can be simulated by varying parameters of the mathematical model. As a proof of concept, we developed a model of the immune response to vaccination against yellow fever. Our simulations have shown consistent results when compared with experimental data available in the literature. The model is generic enough to represent the action of other diseases or vaccines in the human immune system, such as dengue and Zika virus.

  13. Technology 2001: The Second National Technology Transfer Conference and Exposition, volume 1

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Papers from the technical sessions of the Technology 2001 Conference and Exposition are presented. The technical sessions featured discussions of advanced manufacturing, artificial intelligence, biotechnology, computer graphics and simulation, communications, data and information management, electronics, electro-optics, environmental technology, life sciences, materials science, medical advances, robotics, software engineering, and test and measurement.

  14. A symbiotic approach to fluid equations and non-linear flux-driven simulations of plasma dynamics

    NASA Astrophysics Data System (ADS)

    Halpern, Federico

    2017-10-01

    The fluid framework is ubiquitous in studies of plasma transport and stability. Typical forms of the fluid equations are motivated by analytical work dating back several decades, before computer simulations were indispensable, and they can therefore be suboptimal for numerical computation. We demonstrate a new first-principles approach to obtaining manifestly consistent, skew-symmetric fluid models, ensuring internal consistency and conservation properties even in discrete form. Mass, kinetic, and internal energy become quadratic (and always positive) invariants of the system. The model lends itself to a robust, straightforward discretization scheme with inherent non-linear stability. A simpler, drift-ordered form of the equations is obtained, and first results of their numerical implementation as a binary framework for bulk-fluid global plasma simulations are demonstrated. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences, Theory Program, under Award No. DE-FG02-95ER54309.
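
    To make the conservation argument concrete, here is a toy illustration of the general principle (not the drift-ordered plasma model of the abstract): a centered, periodic finite-difference operator for linear advection is antisymmetric, so the discrete quadratic invariant sum(u^2) is conserved exactly by the spatial discretization, and only the time integrator contributes drift. Grid size, wave speed, and time step are assumptions chosen for the example.

```python
# Illustrative sketch of the general idea (not the authors' plasma model):
# centered periodic differencing gives an antisymmetric operator D, so for
# du/dt = -c D u the quadratic invariant sum(u**2) is conserved in space,
# and only time-integration error remains. Grid and parameters are assumptions.
import numpy as np

n, c, dt, steps = 128, 1.0, 1e-3, 2000
x = np.linspace(0.0, 1.0, n, endpoint=False)
dx = x[1] - x[0]
u = np.exp(-100.0 * (x - 0.5) ** 2)

def rhs(u):
    # centered difference with periodic wrap: a skew-symmetric operator
    return -c * (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)

e0 = np.sum(u ** 2)
for _ in range(steps):                      # classic RK4 time stepping
    k1 = rhs(u); k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2); k4 = rhs(u + dt * k3)
    u = u + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

print(abs(np.sum(u ** 2) - e0) / e0)        # drift comes from time stepping only
```

    The printed relative change in sum(u**2) is set by the RK4 truncation error rather than by the spatial scheme, which is the discrete analogue of the quadratic invariants discussed above.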

  15. Projective simulation for artificial intelligence

    NASA Astrophysics Data System (ADS)

    Briegel, Hans J.; de Las Cuevas, Gemma

    2012-05-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation.
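
    A minimal sketch of the mechanism described above may help: clips reduced to bare percepts and actions, edge weights (h-values) defining the hopping probabilities of the random walk, and a reward rule that strengthens traversed edges while a damping term forgets old behaviour. The two-percept task, damping constant, and reward schedule below are illustrative assumptions, not the authors' full formulation.

```python
# Minimal sketch of the projective-simulation idea described above (not the
# authors' full model): clips are just percepts and actions, h-values weight
# the edges, a random walk picks an action, and reward strengthens the used
# edge while damping forgets. Parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_percepts, n_actions, gamma = 2, 2, 0.01
h = np.ones((n_percepts, n_actions))        # h-values of percept -> action edges

def act(percept: int) -> int:
    p = h[percept] / h[percept].sum()       # hopping probabilities of the walk
    return int(rng.choice(n_actions, p=p))

def learn(percept: int, action: int, reward: float) -> None:
    h[percept] += gamma * (1.0 - h[percept])   # damping back toward the initial value 1
    h[percept, action] += reward               # reinforce the traversed edge

for _ in range(1000):                       # toy task: the action should match the percept
    s = int(rng.integers(n_percepts))
    a = act(s)
    learn(s, a, reward=1.0 if a == s else 0.0)

print(h / h.sum(axis=1, keepdims=True))     # learned hopping probabilities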

  16. Projective simulation for artificial intelligence

    PubMed Central

    Briegel, Hans J.; De las Cuevas, Gemma

    2012-01-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation. PMID:22590690

  17. High-Performance Algorithms and Complex Fluids | Computational Science |

    Science.gov Websites

    Web page excerpt: capabilities, made possible by combining experimental data with simulation, include non-Newtonian flows as well as traditional internal and external flows. Contact: Ray Grout Group.

  18. Computational ecology as an emerging science

    PubMed Central

    Petrovskii, Sergei; Petrovskaya, Natalia

    2012-01-01

    It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from that which normally exists in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires a clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have a mathematically accurate and unambiguous description, and available field data are often very noisy, and hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336

  19. An Ada Object Oriented Missile Flight Simulation

    DTIC Science & Technology

    1991-09-01

    This thesis uses the Ada programming language in the design and development of an air-to-air missile flight simulation with object-oriented techniques and sound software engineering principles. The simulation is designed to be more understandable, modifiable, efficient and ...

  20. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    PubMed

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  1. Computational Study on Atomic Structures, Electronic Properties, and Chemical Reactions at Surfaces and Interfaces and in Biomaterials

    NASA Astrophysics Data System (ADS)

    Takano, Yu; Kobayashi, Nobuhiko; Morikawa, Yoshitada

    2018-06-01

    Through computer simulations using atomistic models, it is becoming possible to calculate the atomic structures of localized defects or dopants in semiconductors, chemically active sites in heterogeneous catalysts, nanoscale structures, and active sites in biological systems precisely. Furthermore, it is also possible to clarify physical and chemical properties possessed by these nanoscale structures such as electronic states, electronic and atomic transport properties, optical properties, and chemical reactivity. It is sometimes quite difficult to clarify these nanoscale structure-function relations experimentally and, therefore, accurate computational studies are indispensable in materials science. In this paper, we review recent studies on the relation between local structures and functions for inorganic, organic, and biological systems by using atomistic computer simulations.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dahlburg, Jill; Corones, James; Batchelor, Donald

    Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality, which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas, and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world's energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general. This science-based predictive capability, which was cited in the FESAC integrated planning document (IPPA, 2000), represents a significant opportunity for the DOE Office of Science to further the understanding of fusion plasmas to a level unparalleled worldwide.

  3. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  4. Towards the simulation of molecular collisions with a superconducting quantum computer

    NASA Astrophysics Data System (ADS)

    Geller, Michael

    2013-05-01

    I will discuss the prospects for the use of large-scale, error-corrected quantum computers to simulate complex quantum dynamics such as molecular collisions. This will likely require millions of qubits. I will also discuss an alternative approach [M. R. Geller et al., arXiv:1210.5260] that is ideally suited for today's superconducting circuits, which uses the single-excitation subspace (SES) of a system of n tunably coupled qubits. The SES method allows many operations in the unitary group SU(n) to be implemented in a single step, bypassing the need for elementary gates, thereby making large computations possible without error correction. The method enables universal quantum simulation, including simulation of the time-dependent Schrodinger equation, and we argue that a 1000-qubit SES processor should be capable of achieving quantum speedup relative to a petaflop supercomputer. We speculate on the utility and practicality of such a simulator for atomic and molecular collision physics. Work supported by the US National Science Foundation CDI program.
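
    The key point of the SES idea, that restricting n coupled qubits to a single excitation reduces the dynamics to an n x n matrix problem, can be illustrated classically. The toy sketch below (not the superconducting-hardware protocol) builds a nearest-neighbour-coupled SES Hamiltonian and propagates the time-dependent Schrodinger equation by matrix exponentiation; couplings, energies, and the evolution time are assumptions, with hbar set to 1.

```python
# Toy classical illustration of the single-excitation-subspace (SES) idea in
# the abstract (not the hardware protocol): with n coupled qubits, keeping one
# excitation gives an n x n Hamiltonian, and time-dependent Schrodinger
# evolution is just exponentiation of that small matrix. Parameters assumed.
import numpy as np
from scipy.linalg import expm

n = 8
eps = np.zeros(n)                            # on-site qubit energies (toy choice)
g = 0.05                                     # nearest-neighbour coupling strength
H = np.diag(eps) + g * (np.eye(n, k=1) + np.eye(n, k=-1))

psi = np.zeros(n, dtype=complex)
psi[0] = 1.0                                 # excitation starts on qubit 0

t = 40.0
U = expm(-1j * H * t)                        # exact propagator within the SES
psi_t = U @ psi
print(np.abs(psi_t) ** 2)                    # excitation probability on each qubit
```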

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Ann E; Barker, Ashley D; Bland, Arthur S Buddy

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources. Of these we report the 300 in this review that are consistent with guidance provided. Scientific achievements by OLCF users cut across all range scales from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used to do billion-cell CFD calculations to develop shock wave compression turbo machinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbo machinery to learn what efficiencies the traditional steady flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation billions of gallons of fuel.

  6. The (human) science of medical virtual learning environments.

    PubMed

    Stone, Robert J

    2011-01-27

    The uptake of virtual simulation technologies in both military and civilian surgical contexts has been both slow and patchy. The failure of the virtual reality community in the 1990s and early 2000s to deliver affordable and accessible training systems stems not only from an obsessive quest to develop the 'ultimate' in so-called 'immersive' hardware solutions, from head-mounted displays to large-scale projection theatres, but also from a comprehensive lack of attention to the needs of the end users. While many still perceive the science of simulation to be defined by technological advances, such as computing power, specialized graphics hardware, advanced interactive controllers, displays and so on, the true science underpinning simulation--the science that helps to guarantee the transfer of skills from the simulated to the real--is that of human factors, a well-established discipline that focuses on the abilities and limitations of the end user when designing interactive systems, as opposed to the more commercially explicit components of technology. Based on three surgical simulation case studies, the importance of a human factors approach to the design of appropriate simulation content and interactive hardware for medical simulation is illustrated. The studies demonstrate that it is unnecessary to pursue real-world fidelity in all instances in order to achieve psychological fidelity--the degree to which the simulated tasks reproduce and foster knowledge, skills and behaviours that can be reliably transferred to real-world training applications.

  7. Applications of Computer Graphics in Engineering

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Various applications of interactive computer graphics to the following areas of science and engineering were described: design and analysis of structures, configuration geometry, animation, flutter analysis, design and manufacturing, aircraft design and integration, wind tunnel data analysis, architecture and construction, flight simulation, hydrodynamics, curve and surface fitting, gas turbine engine design, analysis, and manufacturing, packaging of printed circuit boards, spacecraft design.

  8. Integrated computational materials engineering: Tools, simulations and new applications

    DOE PAGES

    Madison, Jonathan D.

    2016-03-30

    Here, Integrated Computational Materials Engineering (ICME) is a relatively new methodology full of tremendous potential to revolutionize how science, engineering and manufacturing work together. ICME was motivated by the desire to derive greater understanding throughout each portion of the development life cycle of materials, while simultaneously reducing the time between discovery to implementation [1,2].

  9. ONR Europe Reports. Computer Science/Computer Engineering in Central Europe: A Report on Czechoslovakia, Hungary, and Poland

    DTIC Science & Technology

    1992-08-01

    Rychlik J.: Simulation of distributed control systems. Research report of the Institute of Technology in Pilsen, no. 209-07-85, Jun. 1985. Kocur P.: Sensitivity analysis of reliability parameters. Proceedings of conf. FTSD, Brno, Jun. 1986, pp. 97-101. Smrha P., Kocur P., Racek S.: ...

  10. Remarks on neurocybernetics and its links to computing science. To the memory of Prof. Luigi M. Ricciardi.

    PubMed

    Moreno-Díaz, Roberto; Moreno-Díaz, Arminda

    2013-06-01

    This paper explores the origins and content of neurocybernetics and its links to artificial intelligence, computer science and knowledge engineering. Starting with three remarkable pieces of work, we center attention on a number of events that initiated and developed basic topics that are still a matter of research and inquiry today, from goal-directed activity theories to circular causality and to reverberations and learning. Within this context, we pay tribute to the memory of Prof. Ricciardi, documenting the importance of his contributions to the mathematics of the brain, neural nets and neurophysiological models, and computational simulations and techniques. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  11. Biomaterial science meets computational biology.

    PubMed

    Hutmacher, Dietmar W; Little, J Paige; Pettet, Graeme J; Loessner, Daniela

    2015-05-01

    There is a pressing need for a predictive tool capable of revealing a holistic understanding of fundamental elements in the normal and pathological cell physiology of organoids in order to decipher the mechanoresponse of cells. Therefore, the integration of a systems bioengineering approach into a validated mathematical model is necessary to develop a new simulation tool. This tool can only be innovative by combining biomaterials science with computational biology. Systems-level and multi-scale experimental data are incorporated into a single framework, thus representing both single cells and collective cell behaviour. Such a computational platform needs to be validated in order to discover key mechano-biological factors associated with cell-cell and cell-niche interactions.

  12. Novel 3D/VR interactive environment for MD simulations, visualization and analysis.

    PubMed

    Doblack, Benjamin N; Allis, Tim; Dávila, Lilian P

    2014-12-18

    The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.
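
    For readers unfamiliar with the kind of computation such a system accelerates, the toy sketch below shows the core molecular-dynamics loop, Lennard-Jones pair forces plus velocity-Verlet integration, in plain NumPy. It is not LAMMPS, CUDA code, or the authors' system; particle count, box size, time step, and potential parameters are illustrative assumptions.

```python
# Toy NumPy sketch of a molecular-dynamics loop (NOT LAMMPS or the authors'
# code): Lennard-Jones pair forces with minimum-image periodic boundaries and
# velocity-Verlet time stepping. All parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n, box, dt, steps = 32, 6.0, 1e-3, 500
eps, sigma = 1.0, 1.0

side = int(np.ceil(n ** (1 / 3)))            # start atoms on a cubic lattice (no overlaps)
grid = np.array([(i, j, k) for i in range(side) for j in range(side) for k in range(side)])
pos = (grid[:n] + 0.5) * (box / side)
vel = rng.normal(0.0, 0.5, (n, 3))

def lj_forces(pos):
    d = pos[:, None, :] - pos[None, :, :]
    d -= box * np.round(d / box)             # minimum-image convention
    r2 = np.sum(d * d, axis=-1) + np.eye(n)  # avoid divide-by-zero on the diagonal
    inv6 = (sigma ** 2 / r2) ** 3
    fmag = 24.0 * eps * (2.0 * inv6 ** 2 - inv6) / r2
    np.fill_diagonal(fmag, 0.0)
    return np.sum(fmag[:, :, None] * d, axis=1)

f = lj_forces(pos)
for _ in range(steps):                       # velocity-Verlet integration
    vel += 0.5 * dt * f
    pos = (pos + dt * vel) % box
    f = lj_forces(pos)
    vel += 0.5 * dt * f

print("kinetic energy:", 0.5 * np.sum(vel ** 2))
```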

  13. Novel 3D/VR Interactive Environment for MD Simulations, Visualization and Analysis

    PubMed Central

    Doblack, Benjamin N.; Allis, Tim; Dávila, Lilian P.

    2014-01-01

    The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced. PMID:25549300

  14. Real science at the petascale.

    PubMed

    Saksena, Radhika S; Boghosian, Bruce; Fazendeiro, Luis; Kenway, Owain A; Manos, Steven; Mazzeo, Marco D; Sadiq, S Kashif; Suter, James L; Wright, David; Coveney, Peter V

    2009-06-28

    We describe computational science research that uses petascale resources to achieve scientific results at unprecedented scales and resolution. The applications span a wide range of domains, from investigation of fundamental problems in turbulence through computational materials science research to biomedical applications at the forefront of HIV/AIDS research and cerebrovascular haemodynamics. This work was mainly performed on the US TeraGrid 'petascale' resource, Ranger, at Texas Advanced Computing Center, in the first half of 2008 when it was the largest computing system in the world available for open scientific research. We have sought to use this petascale supercomputer optimally across application domains and scales, exploiting the excellent parallel scaling performance found on up to at least 32 768 cores for certain of our codes in the so-called 'capability computing' category as well as high-throughput intermediate-scale jobs for ensemble simulations in the 32-512 core range. Furthermore, this activity provides evidence that conventional parallel programming with MPI should be successful at the petascale in the short to medium term. We also report on the parallel performance of some of our codes on up to 65 636 cores on the IBM Blue Gene/P system at the Argonne Leadership Computing Facility, which has recently been named the fastest supercomputer in the world for open science.
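
    As a hedged illustration of the ensemble-computing pattern mentioned above (many intermediate-scale jobs run concurrently alongside capability runs), the mpi4py sketch below gives each MPI rank one independent "simulation" and combines the results with a reduction; `run_one_member` is a hypothetical stand-in, not one of the authors' production codes.

```python
# Minimal mpi4py sketch of the ensemble-computing pattern mentioned above
# (not the authors' production codes): each MPI rank runs one independent
# simulation instance, then a reduction combines the results. The function
# run_one_member is a hypothetical stand-in for a real simulation.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def run_one_member(seed: int) -> float:
    """Placeholder 'simulation': return one scalar observable per ensemble member."""
    rng = np.random.default_rng(seed)
    return float(rng.normal(loc=1.0, scale=0.1, size=10_000).mean())

local = np.array([run_one_member(seed=rank)])
total = np.zeros_like(local)
comm.Allreduce(local, total, op=MPI.SUM)     # every rank receives the ensemble sum

if rank == 0:
    print("ensemble mean observable:", total[0] / size)

# Launch with, e.g.:  mpiexec -n 32 python ensemble.py
```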

  15. Effect of Current Electricity Simulation Supported Learning on the Conceptual Understanding of Elementary and Secondary Teachers

    ERIC Educational Resources Information Center

    Kumar, David Devraj; Thomas, P. V.; Morris, John D.; Tobias, Karen M.; Baker, Mary; Jermanovich, Trudy

    2011-01-01

    This study examined the impact of computer simulation and supported science learning on a teacher's understanding and conceptual knowledge of current electricity. Pre/Post tests were used to measure the teachers' concept attainment. Overall, there was a significant and large knowledge difference effect from Pre to Post test. Two interesting…

  16. Computational Social Creativity.

    PubMed

    Saunders, Rob; Bown, Oliver

    2015-01-01

    This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.

  17. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, Michel; Archer, Bill; Matzen, M. Keith

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    The Second SIAM Conference on Computational Science and Engineering was held in San Diego from February 10-12, 2003. Total conference attendance was 553. This is a 23% increase in attendance over the first conference. The focus of this conference was to draw attention to the tremendous range of major computational efforts on large problems in science and engineering, to promote the interdisciplinary culture required to meet these large-scale challenges, and to encourage the training of the next generation of computational scientists. Computational Science & Engineering (CS&E) is now widely accepted, along with theory and experiment, as a crucial third mode of scientific investigation and engineering design. Aerospace, automotive, biological, chemical, semiconductor, and other industrial sectors now rely on simulation for technical decision support. For federal agencies also, CS&E has become an essential support for decisions on resources, transportation, and defense. CS&E is, by nature, interdisciplinary. It grows out of physical applications and it depends on computer architecture, but at its heart are powerful numerical algorithms and sophisticated computer science techniques. From an applied mathematics perspective, much of CS&E has involved analysis, but the future surely includes optimization and design, especially in the presence of uncertainty. Another mathematical frontier is the assimilation of very large data sets through such techniques as adaptive multi-resolution, automated feature search, and low-dimensional parameterization. The themes of the 2003 conference included, but were not limited to: Advanced Discretization Methods; Computational Biology and Bioinformatics; Computational Chemistry and Chemical Engineering; Computational Earth and Atmospheric Sciences; Computational Electromagnetics; Computational Fluid Dynamics; Computational Medicine and Bioengineering; Computational Physics and Astrophysics; Computational Solid Mechanics and Materials; CS&E Education; Meshing and Adaptivity; Multiscale and Multiphysics Problems; Numerical Algorithms for CS&E; Discrete and Combinatorial Algorithms for CS&E; Inverse Problems; Optimal Design, Optimal Control, and Inverse Problems; Parallel and Distributed Computing; Problem-Solving Environments; Software and Middleware Systems; Uncertainty Estimation and Sensitivity Analysis; and Visualization and Computer Graphics.

  19. CSM research: Methods and application studies

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    1989-01-01

    Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.

  20. Variance in binary stellar population synthesis

    NASA Astrophysics Data System (ADS)

    Breivik, Katelyn; Larson, Shane L.

    2016-03-01

    In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations in less than a week, thus allowing a full exploration of the variance associated with a binary stellar evolution model.
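
    A hedged sketch of the rapid Monte Carlo idea follows (not the authors' population-synthesis code): each realization draws a compact-binary population from simple assumed priors, and the spread of a summary statistic across realizations estimates the variance introduced by incomplete sampling. All distributions and cut values are toy choices.

```python
# Toy sketch of rapid Monte Carlo population variance estimation (not the
# authors' code): many independent population realizations are drawn from
# assumed priors, and the scatter of a summary statistic across realizations
# quantifies sampling variance. All distributions and cuts are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n_realizations, n_binaries = 1000, 5000

def simulate_population(rng) -> float:
    m1 = rng.uniform(0.2, 1.2, n_binaries)                    # primary mass [Msun]
    q = rng.uniform(0.1, 1.0, n_binaries)                     # mass ratio m2/m1
    log_p = rng.normal(loc=3.0, scale=1.0, size=n_binaries)   # log10 orbital period [s]
    # summary statistic: fraction of short-period, not-too-light binaries
    return float(np.mean((log_p < np.log10(3600.0)) & (m1 * q > 0.3)))

fractions = np.array([simulate_population(rng) for _ in range(n_realizations)])
print(f"selected fraction: {fractions.mean():.4f} +/- {fractions.std():.4f}")
```

    The standard deviation across realizations is the kind of model-level variance the abstract aims to characterize, and because each realization is cheap, thousands can be generated quickly.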

  1. Studying Variance in the Galactic Ultra-compact Binary Population

    NASA Astrophysics Data System (ADS)

    Larson, Shane L.; Breivik, Katelyn

    2017-01-01

    In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations on week-long timescales, thus allowing a full exploration of the variance associated with a binary stellar evolution model.

  2. Comparison of meaningful learning characteristics in simulated nursing practice after traditional versus computer-based simulation method: a qualitative videography study.

    PubMed

    Poikela, Paula; Ruokamo, Heli; Teräs, Marianne

    2015-02-01

    Nursing educators must ensure that nursing students acquire the necessary competencies; finding the most purposeful teaching methods and encouraging learning through meaningful learning opportunities is necessary to meet this goal. We investigated student learning in a simulated nursing practice using videography. The purpose of this paper is to examine how two different teaching methods presented students' meaningful learning in a simulated nursing experience. The 6-hour study was divided into three parts: part I, general information; part II, training; and part III, simulated nursing practice. Part II was delivered by two different methods: a computer-based simulation and a lecture. The study was carried out in the simulated nursing practice in two universities of applied sciences in Northern Finland. The participants in parts I and II were 40 first-year nursing students; 12 student volunteers continued to part III. A qualitative analysis method was used. The data were collected using video recordings and analyzed by videography. The students who used a computer-based simulation program were more likely to report meaningful learning themes than those who were first exposed to the lecture method. Educators should be encouraged to use computer-based simulation teaching in conjunction with other teaching methods to ensure that nursing students are able to receive the greatest educational benefits. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Extreme Scale Computing to Secure the Nation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L; McGraw, J R; Johnson, J R

    2009-11-10

    Since the dawn of modern electronic computing in the mid-1940s, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high-performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S., and in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program under the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today.
In particular, continued observance and potential Senate ratification of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise of a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile, with increased confidence in its safety and reliability and without reliance on calibration with past or future test data, is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article discusses not only the need for a future computing capability at the exascale for the SBSS program, but also considers high-performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies as well as a rapid forensic capability for determining a nuclear weapon's design from post-detonation evidence (nuclear counterterrorism).

  4. The melting temperature of liquid water with the effective fragment potential

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brorsen, Kurt R.; Willow, Soohaeng Y.; Xantheas, Sotiris S.

    2015-09-17

    Direct simulation of the solid-liquid water interface with the effective fragment potential (EFP) via the constant enthalpy and pressure (NPH) ensemble was used to estimate the melting temperature (Tm) of ice-Ih. Initial configurations and velocities, taken from equilibrated constant pressure and temperature (NPT) simulations at T = 300 K, 350 K and 400 K, respectively, yielded corresponding Tm values of 378±16 K, 382±14 K and 384±15 K. These estimates are consistently higher than experiment, albeit to the same degree as previously reported estimates using density functional theory (DFT)-based Born-Oppenheimer simulations with the Becke-Lee-Yang-Parr functional plus dispersion corrections (BLYP-D). KRB was supported by a Computational Science Graduate Fellowship from the Department of Energy. MSG was supported by a U.S. National Science Foundation Software Infrastructure (SI2) grant (ACI – 1047772). SSX acknowledges support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle.

  5. Animals in Science Education--Ethics and Alternatives.

    ERIC Educational Resources Information Center

    Langley, G. R.

    1991-01-01

    Summarizes the animal rights argument that objects to the use or killing of animals for educational purposes. Reviews and evaluates alternative approaches that include the nonanimal options of videotaped experiments, self-experimentation, and computer simulations. (MDH)

  6. United States Air Force Research Initiation Program for 1988. Volume 2

    DTIC Science & Technology

    1990-04-01

    Specialty: Modeling and Simulation ENGINEERING AND SERVICES CENTER (Tyndall Air Force Base) Dr. Wayne A. Charlie Dr. Peter Jeffers (1987) Colorado State...Michael Sydor University of New Hampshire University of Minnesota Specialty: Systems Modeling & Controls Specialty: Optics, Material Science Dr. John...9MG-025 4 Modeling and Simulation on Micro- Dr. Joseph J. Feeley (1987) computers, 1989 760-7MG-070 5 Two Dimensional MHD Simulation of Dr. Manuel A

  7. Assessment of the Mars Science Laboratory Entry, Descent, and Landing Simulation

    NASA Technical Reports Server (NTRS)

    Way, David W.; Davis, J. L.; Shidner, Jeremy D.

    2013-01-01

    On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was only the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and a novel and untested Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multi-body computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing is compared to pre-flight statistical distributions, predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the vehicle.

  8. Preliminary assessment of the Mars Science Laboratory entry, descent, and landing simulation

    NASA Astrophysics Data System (ADS)

    Way, David W.

    On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and the novel Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multi-body computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing is compared to pre-flight statistical distributions, predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the Entry, Descent, and Landing system.

  9. Preliminary Assessment of the Mars Science Laboratory Entry, Descent, and Landing Simulation

    NASA Technical Reports Server (NTRS)

    Way, David W.

    2013-01-01

    On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was only the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and a novel and untested Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multibody computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing is compared to pre-flight statistical distributions, predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the vehicle.

  10. The Influence of Using Momentum and Impulse Computer Simulation to Senior High School Students’ Concept Mastery

    NASA Astrophysics Data System (ADS)

    Kaniawati, I.; Samsudin, A.; Hasopa, Y.; Sutrisno, A. D.; Suhendi, E.

    2016-08-01

    This research is motivated by students' weak mastery of abstract physics concepts. The study therefore aims to improve senior high school students' mastery of momentum and impulse concepts through the use of computer simulation. The research method was a pre-experimental, one-group pretest-posttest design. A total of 36 grade-11 science students at a public senior high school in Bandung formed the sample. The instruments used to measure the increase in students' concept mastery were a multiple-choice pretest and posttest. After computer simulations were used in physics learning, students' mastery of momentum and impulse concepts increased, as indicated by a normalized gain of 0.64, which falls in the medium category.
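
    For reference, the normalized gain quoted here is conventionally the average normalized (Hake) gain computed from class-average pre- and post-test percentage scores; this is the standard physics-education-research definition, not a formula quoted from the paper:

    ```latex
    \langle g \rangle = \frac{\langle S_{\text{post}} \rangle - \langle S_{\text{pre}} \rangle}{100\% - \langle S_{\text{pre}} \rangle}
    ```

    Gains of roughly 0.3 to 0.7 are customarily labeled "medium", which is consistent with the reported value of 0.64.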

  11. Materials integrity in microsystems: a framework for a petascale predictive-science-based multiscale modeling and simulation system

    NASA Astrophysics Data System (ADS)

    To, Albert C.; Liu, Wing Kam; Olson, Gregory B.; Belytschko, Ted; Chen, Wei; Shephard, Mark S.; Chung, Yip-Wah; Ghanem, Roger; Voorhees, Peter W.; Seidman, David N.; Wolverton, Chris; Chen, J. S.; Moran, Brian; Freeman, Arthur J.; Tian, Rong; Luo, Xiaojuan; Lautenschlager, Eric; Challoner, A. Dorian

    2008-09-01

    Microsystems have become an integral part of our lives and can be found in homeland security, medical science, aerospace applications and beyond. Many critical microsystem applications are in harsh environments, in which long-term reliability needs to be guaranteed and repair is not feasible. For example, gyroscope microsystems on satellites need to function for over 20 years under severe radiation, thermal cycling, and shock loading. Hence, predictive-science-based, verified and validated computational models and algorithms to predict the performance and materials integrity of microsystems in these situations are needed. Confidence in these predictions is improved by quantifying uncertainties and approximation errors. With no full-system testing and only limited sub-system testing, petascale computing is certainly necessary to span both time and space scales and to reduce the uncertainty in the prediction of long-term reliability. This paper presents the necessary steps to develop a predictive-science-based multiscale modeling and simulation system. The development of this system will be focused on the prediction of the long-term performance of a gyroscope microsystem. The environmental effects to be considered include radiation, thermo-mechanical cycling and shock. Since there will be many material performance issues, attention is restricted to creep resulting from thermal aging and radiation-enhanced mass diffusion; material instability due to radiation and thermo-mechanical cycling; and damage and fracture due to shock. To meet these challenges, we aim to develop an integrated multiscale software analysis system that spans the length scales from the atomistic scale to the scale of the device. The proposed software system will include molecular mechanics, phase field evolution, micromechanics and continuum mechanics software, and state-of-the-art model identification strategies in which atomistic properties are calibrated by quantum calculations. We aim to predict the long-term (in excess of 20 years) integrity of the resonator, electrode base, multilayer metallic bonding pads, and vacuum seals in a prescribed mission. Although multiscale simulations are efficient in the sense that they focus the most computationally intensive models and methods on only the portions of the space-time domain needed, the execution of the multiscale simulations associated with evaluating materials and device integrity for aerospace microsystems will require the application of petascale computing. A component-based software strategy will be used in the development of our massively parallel multiscale simulation system. This approach will allow us to take full advantage of existing single-scale modeling components. An extensive, pervasive thrust in the software system development is verification, validation, and uncertainty quantification (UQ). Each component and the integrated software system need to be carefully verified. A UQ methodology that determines the quality of predictive information available from experimental measurements and packages the information in a form suitable for UQ at various scales needs to be developed. Experiments to validate the model at the nanoscale, microscale, and macroscale are proposed.
The development of a petascale predictive-science-based multiscale modeling and simulation system will advance the field of predictive multiscale science so that it can be used to reliably analyze problems of unprecedented complexity, where limited testing resources can be adequately replaced by petascale computational power, advanced verification, validation, and UQ methodologies.

  12. Fundamentals and Recent Developments in Approximate Bayesian Computation

    PubMed Central

    Lintusaari, Jarno; Gutmann, Michael U.; Dutta, Ritabrata; Kaski, Samuel; Corander, Jukka

    2017-01-01

    Bayesian inference plays an important role in phylogenetics, evolutionary biology, and in many other branches of science. It provides a principled framework for dealing with uncertainty and quantifying how it changes in the light of new evidence. For many complex models and inference problems, however, only approximate quantitative answers are obtainable. Approximate Bayesian computation (ABC) refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible. We explain here the fundamentals of ABC, review the classical algorithms, and highlight recent developments. [ABC; approximate Bayesian computation; Bayesian inference; likelihood-free inference; phylogenetics; simulator-based models; stochastic simulation models; tree-based models.] PMID:28175922
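
    As a concrete illustration of the algorithm family described in this abstract, a minimal rejection-ABC sketch is shown below; the toy Gaussian simulator, prior, summary statistic, and tolerance are invented for illustration and are not taken from the review.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Observed data (toy example): 50 draws from a Gaussian with unknown mean.
    observed = rng.normal(loc=1.5, scale=1.0, size=50)
    obs_summary = observed.mean()

    def simulate(theta, rng):
        """Simulator we can sample from but whose likelihood we pretend is unavailable."""
        return rng.normal(loc=theta, scale=1.0, size=50)

    accepted = []
    epsilon = 0.1  # tolerance on the summary-statistic distance
    while len(accepted) < 1000:
        theta = rng.uniform(-5.0, 5.0)             # draw a candidate from the prior
        sim_summary = simulate(theta, rng).mean()  # summarize the simulated data
        if abs(sim_summary - obs_summary) < epsilon:
            accepted.append(theta)                 # keep parameters that reproduce the data

    print(f"approximate posterior mean: {np.mean(accepted):.3f}")
    ```

    Accepted parameter values approximate draws from the posterior because only simulations whose summaries land close to the observed summary are kept.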

  13. Extraordinary Tools for Extraordinary Science: The Impact of SciDAC on Accelerator Science & Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryne, Robert D.

    2006-08-10

    Particle accelerators are among the most complex and versatile instruments of scientific exploration. They have enabled remarkable scientific discoveries and important technological advances that span all programs within the DOE Office of Science (DOE/SC). The importance of accelerators to the DOE/SC mission is evident from an examination of the DOE document, ''Facilities for the Future of Science: A Twenty-Year Outlook''. Of the 28 facilities listed, 13 involve accelerators. Thanks to SciDAC, a powerful suite of parallel simulation tools has been developed that represent a paradigm shift in computational accelerator science. Simulations that used to take weeks or more now take hours, and simulations that were once thought impossible are now performed routinely. These codes have been applied to many important projects of DOE/SC including existing facilities (the Tevatron complex, the Relativistic Heavy Ion Collider), facilities under construction (the Large Hadron Collider, the Spallation Neutron Source, the Linac Coherent Light Source), and to future facilities (the International Linear Collider, the Rare Isotope Accelerator). The new codes have also been used to explore innovative approaches to charged particle acceleration. These approaches, based on the extremely intense fields that can be present in lasers and plasmas, may one day provide a path to the outermost reaches of the energy frontier. Furthermore, they could lead to compact, high-gradient accelerators that would have huge consequences for US science and technology, industry, and medicine. In this talk I will describe the new accelerator modeling capabilities developed under SciDAC, the essential role of multi-disciplinary collaboration with applied mathematicians, computer scientists, and other IT experts in developing these capabilities, and provide examples of how the codes have been used to support DOE/SC accelerator projects.

  14. Extraordinary tools for extraordinary science: the impact of SciDAC on accelerator science and technology

    NASA Astrophysics Data System (ADS)

    Ryne, Robert D.

    2006-09-01

    Particle accelerators are among the most complex and versatile instruments of scientific exploration. They have enabled remarkable scientific discoveries and important technological advances that span all programs within the DOE Office of Science (DOE/SC). The importance of accelerators to the DOE/SC mission is evident from an examination of the DOE document, ''Facilities for the Future of Science: A Twenty-Year Outlook.'' Of the 28 facilities listed, 13 involve accelerators. Thanks to SciDAC, a powerful suite of parallel simulation tools has been developed that represent a paradigm shift in computational accelerator science. Simulations that used to take weeks or more now take hours, and simulations that were once thought impossible are now performed routinely. These codes have been applied to many important projects of DOE/SC including existing facilities (the Tevatron complex, the Relativistic Heavy Ion Collider), facilities under construction (the Large Hadron Collider, the Spallation Neutron Source, the Linac Coherent Light Source), and to future facilities (the International Linear Collider, the Rare Isotope Accelerator). The new codes have also been used to explore innovative approaches to charged particle acceleration. These approaches, based on the extremely intense fields that can be present in lasers and plasmas, may one day provide a path to the outermost reaches of the energy frontier. Furthermore, they could lead to compact, high-gradient accelerators that would have huge consequences for US science and technology, industry, and medicine. In this talk I will describe the new accelerator modeling capabilities developed under SciDAC, the essential role of multi-disciplinary collaboration with applied mathematicians, computer scientists, and other IT experts in developing these capabilities, and provide examples of how the codes have been used to support DOE/SC accelerator projects.

  15. Development of space simulation / net-laboratory system

    NASA Astrophysics Data System (ADS)

    Usui, H.; Matsumoto, H.; Ogino, T.; Fujimoto, M.; Omura, Y.; Okada, M.; Ueda, H. O.; Murata, T.; Kamide, Y.; Shinagawa, H.; Watanabe, S.; Machida, S.; Hada, T.

    A research project for the development of a space simulation / net-laboratory system was approved by the Japan Science and Technology Corporation (JST) in the category of Research and Development for Applying Advanced Computational Science and Technology (ACT-JST) in 2000. This research project, which runs for three years, is a collaboration with an astrophysical simulation group as well as other space simulation groups that use MHD and hybrid models. In this project, we are developing a prototype of a unique simulation system that enables users to perform simulation runs by providing or selecting plasma parameters through a Web-based interface on the internet. We are also developing an on-line database system for space simulation from which users will be able to search and extract various information, such as simulation methods and programs, manuals, and typical simulation results in graphic or ASCII format. This unique system will help simulation beginners start simulation studies without much difficulty or effort, and will contribute to the promotion of simulation studies in the STP field. In this presentation, we report the overview and current status of the project.

  16. Citizen science: A new perspective to evaluate spatial patterns in hydrology.

    NASA Astrophysics Data System (ADS)

    Koch, J.; Stisen, S.

    2016-12-01

    Citizen science opens new pathways that can complement traditional scientific practice. Intuition and reasoning often make humans more effective than computer algorithms in various realms of problem solving. In particular, a simple visual comparison of spatial patterns is a task where humans are often considered more reliable than computer algorithms. However, in practice, science still largely depends on computer-based solutions, which is inevitable given benefits such as speed and the possibility to automate processes. This study highlights the integration of this generally underused human resource into hydrology. We established a citizen science project on the Zooniverse platform entitled Pattern Perception. The aim is to employ human perception to rate similarity and dissimilarity between simulated spatial patterns of a hydrological catchment model. In total, more than 2,800 users provided over 46,000 classifications of 1,095 individual subjects within 64 days of launch. Each subject displays simulated spatial patterns of land-surface variables from a baseline model and six modelling scenarios. The citizen science data yield a numeric pattern-similarity score for each of the scenarios with respect to the reference. We investigate the capability of a set of innovative statistical performance metrics to mimic the human ability to distinguish between similarity and dissimilarity. Results suggest that more complex metrics are not necessarily better at emulating human perception, but they clearly provide flexibility and auxiliary information that is valuable for model diagnostics. The metrics clearly differ in their ability to unambiguously distinguish between similar and dissimilar patterns, which is regarded as a key feature of a reliable metric.
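
    The abstract does not specify the performance metrics that were tested. As a baseline for what a purely statistical pattern-similarity score can look like, a cell-wise Pearson correlation between a baseline field and a scenario field is sketched below; the synthetic fields and the choice of metric are illustrative assumptions, not the study's metrics.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Two illustrative 2-D fields: a baseline pattern and a perturbed scenario.
    baseline = rng.random((100, 100))
    scenario = baseline + 0.1 * rng.standard_normal((100, 100))

    # Cell-wise Pearson correlation as a simple (not perception-aware) similarity score.
    score = np.corrcoef(baseline.ravel(), scenario.ravel())[0, 1]
    print(f"pattern similarity (Pearson r): {score:.3f}")
    ```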

  17. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis

    PubMed Central

    Duarte, Afonso M. S.; Psomopoulos, Fotis E.; Blanchet, Christophe; Bonvin, Alexandre M. J. J.; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C.; de Lucas, Jesus M.; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B.

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community. PMID:26157454

  18. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis.

    PubMed

    Duarte, Afonso M S; Psomopoulos, Fotis E; Blanchet, Christophe; Bonvin, Alexandre M J J; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C; de Lucas, Jesus M; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community.

  19. Population Simulation, AKA: Grahz, Rahbitz and Fawkzes

    NASA Technical Reports Server (NTRS)

    Bangert, Tyler R.

    2008-01-01

    In an effort to give students a more visceral experience of science and instill a deeper working knowledge of concepts, activities that utilize hands-on, laboratory and simulated experiences are recommended because these activities have a greater impact on student learning, especially for Native American students. Because it is not usually feasible to take large and/or multiple classes of high school science students into the field to count numbers of organisms of a particular species, especially over a long period of time and covering a large area of an environment, the population simulation presented in this paper was created to aid students in understanding population dynamics by working with a simulated environment, which can be done in the classroom. Students create an environment and populate the environment with imaginary species. Then, using a sequence of "rules" that allow organisms to eat, reproduce, move and age, students see how the population of a species changes over time. In particular, students practice collecting data, summarizing information, plotting graphs, and interpreting graphs for such information as carrying capacity, predator-prey relationships, and how specific species factors impact population and the environment. Students draw conclusions from their results and suggest further research, which may involve changes in simulation parameters, prediction of outcomes, and testing predictions. The population simulation has demonstrated success in the above student activities using a "board game" version of the population simulation. A computer version of the population simulation needs more testing, but preliminary runs are promising. A second, more complicated computer simulation will model the same dynamics and will add simulated population genetics.
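
    The eat/reproduce/age rule sequence described above maps naturally onto a tiny agent-style loop. The sketch below is a hypothetical illustration of that kind of rule set (movement is omitted to keep the sketch non-spatial); the food supply, birth probability, and lifespan values are invented and are not the classroom game's actual rules.

    ```python
    import random

    random.seed(7)

    def step(population, food_supply, birth_prob=0.3, max_age=8):
        """Advance one season: each organism eats, may reproduce, then ages."""
        survivors = []
        for age in population:
            if food_supply <= 0:           # not enough food: the organism dies
                continue
            food_supply -= 1               # eat one unit of food
            if random.random() < birth_prob:
                survivors.append(0)        # reproduce: add a newborn
            if age + 1 < max_age:          # age; organisms die of old age at max_age
                survivors.append(age + 1)
        return survivors

    population = [0] * 10                  # start with ten newborns
    for season in range(20):
        population = step(population, food_supply=60)
        print(f"season {season + 1:2d}: {len(population)} organisms")
    ```

    Running the loop shows the population growing and then leveling off near the food-limited carrying capacity, which is the kind of behavior students are asked to read off their graphs.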

  20. Here and now: the intersection of computational science, quantum-mechanical simulations, and materials science

    NASA Astrophysics Data System (ADS)

    Marzari, Nicola

    The last 30 years have seen the steady and exhilarating development of powerful quantum-simulation engines for extended systems, dedicated to the solution of the Kohn-Sham equations of density-functional theory, often augmented by density-functional perturbation theory, many-body perturbation theory, time-dependent density-functional theory, dynamical mean-field theory, and quantum Monte Carlo. Their implementation on massively parallel architectures, now leveraging also GPUs and accelerators, has started a massive effort in the prediction from first principles of many or of complex materials properties, leading the way to the exascale through the combination of HPC (high-performance computing) and HTC (high-throughput computing). Challenges and opportunities abound: complementing hardware and software investments and design; developing the materials' informatics infrastructure needed to encode knowledge into complex protocols and workflows of calculations; managing and curating data; resisting the complacency that we have already reached the predictive accuracy needed for materials design, or a robust level of verification of the different quantum engines. In this talk I will provide an overview of these challenges, with the ultimate prize being the computational understanding, prediction, and design of properties and performance for novel or complex materials and devices.

  1. Development of an Innovative Interactive Virtual Classroom System for K-12 Education Using Google App Engine

    ERIC Educational Resources Information Center

    Mumba, Frackson; Zhu, Mengxia

    2013-01-01

    This paper presents a Simulation-based interactive Virtual ClassRoom web system (SVCR: www.vclasie.com) powered by the state-of-the-art cloud computing technology from Google. SVCR integrates popular free open-source math, science and engineering simulations and provides functions such as secure user access control and management of courses,…

  2. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    Risk/Uncertainty Analysis - Project Scoring - Utility Scales - Relevance Tree Techniques (Reverse Factor Analysis) 2. Computer Simulation Simulation...effectiveness of mathematical models for R&D project selection. Management Science, April 1973, 18. 6-43. Souder, W.E. A scoring methodology for...per some interval PROFICIENCY test scores (written) RADIATION radiation effects aircrew performance on radiation environments REACTION TIME 1) (time

  3. VISUAL-SEVEIF, a tool for integrating fire behavior simulation and economic evaluation of the impact of Wildfires

    Treesearch

    Francisco Rodríguez y Silva; Juan Ramón Molina Martínez; Miguel Ángel Herrera Machuca; Jesús Mª Rodríguez Leal

    2013-01-01

    Progress made in recent years in fire science, particularly as applied to forest fire protection, coupled with the increased power offered by mathematical processors integrated into computers, has led to important developments in the field of dynamic and static simulation of forest fires. Furthermore, and similarly, econometric models applied to economic...

  4. Preparing Students for Careers in Science and Industry with Computational Physics

    NASA Astrophysics Data System (ADS)

    Florinski, V. A.

    2011-12-01

    Funded by an NSF CAREER grant, the University of Alabama in Huntsville (UAH) has launched a new graduate program in Computational Physics. It is universally accepted that today's physics is done on a computer. The program blurs the boundary between physics and computer science by teaching students modern, practical techniques for solving difficult physics problems on diverse computational platforms. Currently consisting of two courses first offered in the Fall of 2011, the program will eventually include five courses covering methods for fluid dynamics, particle transport via stochastic methods, and hybrid and PIC plasma simulations. UAH's unique location allows courses to be shaped through discussions with faculty, NASA/MSFC researchers and local R&D business representatives, i.e., potential employers of the program's graduates. Students currently participating in the program have all begun their research careers in space and plasma physics; many are presenting their research at this meeting.

  5. Computation material science of structural-phase transformation in casting aluminium alloys

    NASA Astrophysics Data System (ADS)

    Golod, V. M.; Dobosh, L. Yu

    2017-04-01

    Successive stages of computer simulation of the formation of casting microstructure under non-equilibrium crystallization conditions of multicomponent aluminum alloys are presented. On the basis of computational thermodynamics and heat transfer during solidification of macroscale shaped castings, the boundary conditions of local heat exchange are specified for mesoscale modeling of the non-equilibrium formation of the solid phase and of component redistribution between phases during coalescence of secondary dendrite branches. Computer analysis of structural-phase transitions is based on the principle of the additive physico-chemical effect of the alloy components in the process of diffusional-capillary morphological evolution of the dendrite structure and of local dendrite heterogeneity, whose stochastic nature and extent are revealed by metallographic study and by modeling with the Monte Carlo method. The integrated computational materials science tools are focused on, and implemented for, the analysis of the multi-factor system of casting processes and the prediction of casting microstructure.

  6. A High Performance Cloud-Based Protein-Ligand Docking Prediction Algorithm

    PubMed Central

    Chen, Jui-Le; Yang, Chu-Sing

    2013-01-01

    The potential of predicting druggability for a particular disease by integrating biological and computer science technologies has witnessed success in recent years. Although computer science technologies can be used to reduce the costs of pharmaceutical research, the computation time of structure-based protein-ligand docking prediction is still unsatisfactory. Hence, in this paper, a novel docking prediction algorithm, named fast cloud-based protein-ligand docking prediction algorithm (FCPLDPA), is presented to accelerate the docking prediction algorithm. The proposed algorithm works by leveraging two high-performance operators: (1) the novel migration (information exchange) operator is designed specially for cloud-based environments to reduce the computation time; (2) the efficient operator is aimed at filtering out the worst search directions. Our simulation results illustrate that the proposed method outperforms the other docking algorithms compared in this paper in terms of both the computation time and the quality of the end result. PMID:23762864

  7. Technology 2004, Vol. 2

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Proceedings from symposia of the Technology 2004 Conference, November 8-10, 1994, Washington, DC. Volume 2 features papers on computers and software, virtual reality simulation, environmental technology, video and imaging, medical technology and life sciences, robotics and artificial intelligence, and electronics.

  8. | NREL

    Science.gov Websites

    Website excerpt (staff profile) from NREL's Computational Science Center: the researcher uses electronic structure calculations; the page mentions introductory chemistry and physical chemistry, and lists research interests in electronic structure and dynamics, quantum/classical molecular dynamics simulation, and the coupling of molecular electronic structure to …

  9. X-ray system simulation software tools for radiology and radiography education.

    PubMed

    Kengyelics, Stephen M; Treadgold, Laura A; Davies, Andrew G

    2018-02-01

    To develop x-ray simulation software tools to support delivery of radiological science education for a range of learning environments and audiences including individual study, lectures, and tutorials. Two software tools were developed; one simulated x-ray production for a simple two dimensional radiographic system geometry comprising an x-ray source, beam filter, test object and detector. The other simulated the acquisition and display of two dimensional radiographic images of complex three dimensional objects using a ray casting algorithm through three dimensional mesh objects. Both tools were intended to be simple to use, produce results accurate enough to be useful for educational purposes, and have an acceptable simulation time on modest computer hardware. The radiographic factors and acquisition geometry could be altered in both tools via their graphical user interfaces. A comparison of radiographic contrast measurements of the simulators to a real system was performed. The contrast output of the simulators had excellent agreement with measured results. The software simulators were deployed to 120 computers on campus. The software tools developed are easy-to-use, clearly demonstrate important x-ray physics and imaging principles, are accessible within a standard University setting and could be used to enhance the teaching of x-ray physics to undergraduate students. Current approaches to teaching x-ray physics in radiological science lack immediacy when linking theory with practice. This method of delivery allows students to engage with the subject in an experiential learning environment. Copyright © 2017. Published by Elsevier Ltd.
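
    The abstract reports contrast agreement without giving the underlying model. A minimal sketch of the physics such teaching simulators typically demonstrate, Beer-Lambert attenuation through a test object and the resulting radiographic contrast, is shown below; the attenuation coefficients and thicknesses are illustrative values, not those of the published tools.

    ```python
    import numpy as np

    def transmitted_intensity(i0, mu, thickness_cm):
        """Beer-Lambert attenuation: I = I0 * exp(-mu * x)."""
        return i0 * np.exp(-mu * thickness_cm)

    i0 = 1.0              # incident intensity (arbitrary units)
    mu_soft_tissue = 0.2  # illustrative linear attenuation coefficient (1/cm)
    mu_bone = 0.5         # illustrative value for a denser test object (1/cm)

    # Ray beside the object: 10 cm of soft tissue only.
    i_background = transmitted_intensity(i0, mu_soft_tissue, 10.0)
    # Ray through the object: 9 cm of soft tissue plus 1 cm of the denser material.
    i_object = transmitted_intensity(i0, mu_soft_tissue, 9.0) * np.exp(-mu_bone * 1.0)

    # One common contrast definition: relative intensity difference behind vs. beside the object.
    contrast = (i_background - i_object) / i_background
    print(f"radiographic contrast: {contrast:.3f}")
    ```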

  10. Impact of the Columbia Supercomputer on NASA Space and Exploration Mission

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Kwak, Dochan; Kiris, Cetin; Lawrence, Scott

    2006-01-01

    NASA's 10,240-processor Columbia supercomputer gained worldwide recognition in 2004 for increasing the space agency's computing capability ten-fold, and enabling U.S. scientists and engineers to perform significant, breakthrough simulations. Columbia has amply demonstrated its capability to accelerate NASA's key missions, including space operations, exploration systems, science, and aeronautics. Columbia is part of an integrated high-end computing (HEC) environment comprised of massive storage and archive systems, high-speed networking, high-fidelity modeling and simulation tools, application performance optimization, and advanced data analysis and visualization. In this paper, we illustrate the impact Columbia is having on NASA's numerous space and exploration applications, such as the development of the Crew Exploration and Launch Vehicles (CEV/CLV), effects of long-duration human presence in space, and damage assessment and repair recommendations for remaining shuttle flights. We conclude by discussing HEC challenges that must be overcome to solve space-related science problems in the future.

  11. Life sciences Spacelab Mission Development test 3 (SMD 3) data management report

    NASA Technical Reports Server (NTRS)

    Moseley, E. C.

    1977-01-01

    The development of a permanent data system for SMD tests was studied; the system would simulate all elements of the shuttle onboard, telemetry, and ground data systems that are involved with Spacelab operations. The onboard data system (ODS) and the ground data system (GDS) were utilized. The air-to-ground link was simulated by a hardwired computer-to-computer interface. A patch-board system was used on board to select experiment inputs, and the downlink configuration from the ODS was changed by a crew keyboard entry to support each experiment. The ODS provided a CRT display of experiment parameters to enable the crew to monitor experiment performance. An onboard analog system, with recording capability, was installed to handle high-rate data and to provide a backup to the digital system. The GDS accomplished engineering-unit conversion and limit sensing, and provided real-time parameter display on CRTs in the science monitoring area and the test control area.

  12. Engineering uses of physics-based ground motion simulations

    USGS Publications Warehouse

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

    This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations, as well as to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  13. seismo-live: Training in Computational Seismology using Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Igel, H.; Krischer, L.; van Driel, M.; Tape, C.

    2016-12-01

    Practical training in computational methodologies is still underrepresented in Earth science curricula despite the increasing use of sometimes highly sophisticated simulation technologies in research projects. At the same time, well-engineered community codes make it easy to produce simulation-based results, yet with the danger that the inherent traps of numerical solutions are not well understood. It is our belief that training with highly simplified numerical solutions (here, to the equations describing elastic wave propagation) with carefully chosen elementary ingredients of simulation technologies (e.g., finite differencing, function interpolation, spectral derivatives, numerical integration) could substantially improve this situation. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installations. The increasingly popular Jupyter notebooks allow combining markup text, graphics, and equations with interactive, executable Python code. We demonstrate the potential with training notebooks for the finite-difference method, pseudospectral methods, finite/spectral element methods, the finite-volume method, and the discontinuous Galerkin method. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing and noise analysis. Submissions of Jupyter notebooks for general seismology are encouraged. The platform can be used for complementary teaching in Earth science courses on compute-intensive research areas.
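
    In the spirit of the "highly simplified numerical solutions" the platform teaches, a bare-bones 1-D acoustic wave equation solved with second-order finite differences might look like the sketch below; it is a generic classroom example, not code copied from the seismo-live notebooks.

    ```python
    import numpy as np

    # 1-D acoustic wave equation u_tt = c^2 u_xx, second-order finite differences (leapfrog).
    nx, nt = 500, 1000
    dx, c = 1.0, 1.0
    dt = 0.5 * dx / c                      # time step chosen to satisfy the CFL stability condition

    u_prev = np.zeros(nx)
    u = np.zeros(nx)
    u[nx // 2] = 1.0                       # initial displacement spike at the center of the domain

    for _ in range(nt):
        u_next = np.zeros(nx)
        # update interior points; end points stay fixed at zero (rigid boundaries)
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + (c * dt / dx) ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
        u_prev, u = u, u_next

    print(f"max amplitude after {nt} steps: {np.abs(u).max():.3f}")
    ```

    Even this minimal example exposes the "traps" the abstract alludes to: violate the CFL condition or choose a too-coarse grid and the numerical solution becomes unstable or dispersive.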

  14. Challenges in Computational Social Modeling and Simulation for National Security Decision Making

    DTIC Science & Technology

    2011-06-01

    This study is grounded within a system-activity theory, a logico-philosophical model of interdisciplinary research [13, 14], the concepts of social...often a difficult challenge. Ironically, social science research methods, such as ethnography, may be tremendously helpful in designing these...social sciences. Moreover, CSS projects draw on knowledge and methods from other fields of study, including graph theory, information visualization

  15. Peptoid Backbone Flexibilility Dictates Its Interaction with Water and Surfaces: A Molecular Dynamics Investigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prakash, Arushi; Baer, Marcel D.; Mundy, Christopher J.

    Peptoids are peptide-mimetic biopolymers that are easy to synthesize and adaptable for use in drugs, chemical scaffolds, and coatings. However, there is insufficient information about their structural preferences and interactions with the environment in various applications. We conducted a study to understand the fundamental differences between peptides and peptoids using molecular dynamics simulations with semi-empirical (PM6) and empirical (AMBER) potentials, in conjunction with metadynamics enhanced sampling. From studies of single molecules in water and on surfaces, we found that sarcosine (a model peptoid) is much more flexible than alanine (a model peptide) in different environments. However, sarcosine and alanine interact similarly with a hydrophobic or a hydrophilic surface. Finally, this study highlights the conformational landscape of peptoids and the dominant interactions that drive peptoids towards these conformations. ACKNOWLEDGMENT: MD simulations and manuscript preparation were supported by the MS3 (Materials Synthesis and Simulation Across Scales) Initiative at Pacific Northwest National Laboratory (PNNL), a multi-program national laboratory operated by Battelle for the U.S. Department of Energy. CJM was supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. MDB was supported by the US Department of Energy, Office of Basic Energy Sciences, Biomolecular Materials Program at PNNL. Computing resources were generously allocated by the University of Washington's IT department and PNNL's Institutional Computing program. The authors gratefully acknowledge conversations with Dr. Kayla Sprenger, Josh Smith, and Dr. Yeneneh Yimer.

  16. ALCF Data Science Program: Productive Data-centric Supercomputing

    NASA Astrophysics Data System (ADS)

    Romero, Nichols; Vishwanath, Venkatram

    The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5-petaflops Intel/Cray system. The program will transition to the 200-petaflop/s Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017. http://www.alcf.anl.gov/alcf-data-science-program This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.

  17. Emotion-affected decision making in human simulation.

    PubMed

    Zhao, Y; Kang, J; Wright, D K

    2006-01-01

    Human modelling is an interdisciplinary research field. The topic, emotion-affected decision making, was originally a cognitive psychology issue, but is now recognized as an important research direction for both computer science and biomedical modelling. The main aim of this paper is to attempt to bridge the gap between psychology and bioengineering in emotion-affected decision making. The work is based on Ortony's theory of emotions and bounded rationality theory, and attempts to connect the emotion process with decision making. A computational emotion model is proposed, and the initial framework of this model in virtual human simulation within the platform of Virtools is presented.

  18. Agent-based modeling and systems dynamics model reproduction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, M. J.; Macal, C. M.

    2009-01-01

    Reproducibility is a pillar of the scientific endeavour. We view computer simulations as laboratories for electronic experimentation and therefore as tools for science. Recent studies have addressed model reproduction and found it to be surprisingly difficult to replicate published findings. There have been enough failed simulation replications to raise the question, 'can computer models be fully replicated?' This paper answers in the affirmative by reporting on a successful reproduction study using Mathematica, Repast and Swarm for the Beer Game supply chain model. The reproduction process was valuable because it demonstrated the original result's robustness across modelling methodologies and implementation environments.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikolic, R J

    This month's issue has the following articles: (1) Dawn of a New Era of Scientific Discovery - Commentary by Edward I. Moses; (2) At the Frontiers of Fundamental Science Research - Collaborators from national laboratories, universities, and international organizations are using the National Ignition Facility to probe key fundamental science questions; (3) Livermore Responds to Crisis in Post-Earthquake Japan - More than 70 Laboratory scientists provided round-the-clock expertise in radionuclide analysis and atmospheric dispersion modeling as part of the nation's support to Japan following the March 2011 earthquake and nuclear accident; (4) A Comprehensive Resource for Modeling, Simulation, and Experiments - A new Web-based resource called MIDAS is a central repository for material properties, experimental data, and computer models; and (5) Finding Data Needles in Gigabit Haystacks - Livermore computer scientists have developed a novel computer architecture based on 'persistent' memory to ease data-intensive computations.

  20. Interplay of multiple synaptic plasticity features in filamentary memristive devices for neuromorphic computing

    NASA Astrophysics Data System (ADS)

    La Barbera, Selina; Vincent, Adrien F.; Vuillaume, Dominique; Querlioz, Damien; Alibart, Fabien

    2016-12-01

    Bio-inspired computing represents today a major challenge at different levels, ranging from materials science for the design of innovative devices and circuits to computer science for the understanding of the key features required for processing natural data. In this paper, we propose a detailed analysis of resistive switching dynamics in electrochemical metallization cells for synaptic plasticity implementation. We show how filament stability associated with the Joule effect during switching can be used to emulate key synaptic features such as the short-term to long-term plasticity transition and spike-timing-dependent plasticity. Furthermore, an interplay between these different synaptic features is demonstrated for object motion detection in a spike-based neuromorphic circuit. System-level simulation demonstrates robust learning and promising synaptic operation, paving the way to complex bio-inspired computing systems composed of innovative memory devices.
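
    Spike-timing-dependent plasticity, one of the synaptic features emulated here, has a standard pair-based textbook form; the sketch below illustrates that generic rule with invented amplitudes and time constant, and is not the device-level model analyzed in the paper.

    ```python
    import math

    def stdp_weight_change(delta_t_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
        """Pair-based STDP: potentiate when the pre-spike precedes the post-spike
        (delta_t = t_post - t_pre > 0), depress otherwise; the magnitude decays
        exponentially with the absolute timing difference."""
        if delta_t_ms > 0:
            return a_plus * math.exp(-delta_t_ms / tau_ms)
        return -a_minus * math.exp(delta_t_ms / tau_ms)

    for dt in (-40, -10, -2, 2, 10, 40):
        print(f"delta_t = {dt:+3d} ms -> delta_w = {stdp_weight_change(dt):+.4f}")
    ```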

  1. 'Towers in the Tempest' Computer Animation Submission

    NASA Technical Reports Server (NTRS)

    Shirah, Greg

    2008-01-01

    The following describes a computer animation that has been submitted to the ACM/SIGGRAPH 2008 computer graphics conference: 'Towers in the Tempest' clearly communicates recent scientific research into how hurricanes intensify. This intensification can be caused by a phenomenon called a 'hot tower.' For the first time, research meteorologists have run complex atmospheric simulations at a very fine temporal resolution of 3 minutes. Combining this simulation data with satellite observations enables detailed study of 'hot towers.' The science of 'hot towers' is described using satellite observation data, conceptual illustrations, and volumetric atmospheric simulation data. The movie starts by showing a 'hot tower' observed in three-dimensional precipitation radar data of Hurricane Bonnie from NASA's Tropical Rainfall Measuring Mission (TRMM) spacecraft. Next, the dynamics of a hurricane and the formation of 'hot towers' are briefly explained using conceptual illustrations. Finally, volumetric cloud, wind, and vorticity data from a supercomputer simulation of Hurricane Bonnie are shown using volume techniques such as ray marching.
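
    Ray marching, named above as the volume-rendering technique, accumulates opacity along each viewing ray in small steps. The sketch below shows a generic front-to-back compositing loop over a synthetic density profile; it is an illustration of the technique, not the production renderer used for the animation.

    ```python
    import numpy as np

    def march_ray(density, step=1.0, absorption=0.05):
        """Front-to-back compositing along one ray through sampled densities."""
        color, transmittance = 0.0, 1.0
        for rho in density:
            alpha = 1.0 - np.exp(-absorption * rho * step)   # opacity of this segment
            color += transmittance * alpha * rho             # treat density as emitted light
            transmittance *= 1.0 - alpha                     # light remaining behind this segment
            if transmittance < 1e-3:                         # early exit: the ray is effectively opaque
                break
        return color

    # Synthetic density samples along one ray through a Gaussian "cloud".
    samples = np.exp(-((np.arange(100) - 50) / 15.0) ** 2)
    print(f"accumulated brightness: {march_ray(samples):.3f}")
    ```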

  2. Plasma Science and Innovation Center (PSI-Center) at Washington, Wisconsin, and Utah State, ARRA Supplement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sovinec, Carl

    The objective of the Plasma Science and Innovation Center (PSI-Center) is to develop and deploy computational models that simulate conditions in smaller, concept-exploration plasma experiments. The PSIC group at the University of Wisconsin-Madison, led by Prof. Carl Sovinec, uses and enhances the Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion (NIMROD) code to simulate macroscopic plasma dynamics in a number of magnetic confinement configurations. These numerical simulations provide information on how magnetic fields and plasma flows evolve over all three spatial dimensions, which supplements the limited access of diagnostics in plasma experiments. The information gained from simulation helps explain how plasma evolves. It is also used to engineer more effective plasma confinement systems, reducing the need to build many experiments to cover the physical parameter space. The ultimate benefit is a more cost-effective approach to the development of fusion energy for peaceful power production. The supplemental funds provided by the American Recovery and Reinvestment Act of 2009 were used to purchase computer components that were assembled into a 48-core system with 256 GB of shared memory. The system was engineered and constructed by the group's system administrator at the time, Anthony Hammond. It was successfully used by then-graduate student Dr. John O'Bryan for computing the magnetic relaxation dynamics that occur during experimental tests of non-inductive startup in the Pegasus Toroidal Experiment (pegasus.ep.wisc.edu). Dr. O'Bryan's simulations provided the first detailed explanation of how the driven helical filament of electrical current evolves into a toroidal, tokamak-like plasma configuration.

  3. LLNL Mercury Project Trinity Open Science Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brantley, Patrick; Dawson, Shawn; McKinley, Scott

    2016-04-20

    The Mercury Monte Carlo particle transport code developed at Lawrence Livermore National Laboratory (LLNL) is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. As a result, a question arises as to the level of convergence of the calculations with Monte Carlo simulation particle count. In the Trinity Open Science calculations, one main focus was to investigate convergence of the relevant simulation quantities with Monte Carlo particle count to assess the current simulation methodology. Both for this application space and for more general applicability, we also investigated the impact of code algorithms on parallel scaling on the Trinity machine, as well as the utilization of the Trinity DataWarp burst buffer technology in Mercury via the LLNL Scalable Checkpoint/Restart (SCR) library.

  4. Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meisner, Robert; McCoy, Michel; Archer, Bill

    2013-09-11

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced set of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC's business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools.

  5. High-order hydrodynamic algorithms for exascale computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, Nathaniel Ray

    Hydrodynamic algorithms are at the core of many laboratory missions, ranging from simulating ICF implosions to climate modeling. The hydrodynamic algorithms commonly employed at the laboratory and in industry (1) typically lack the requisite accuracy for complex multi-material vortical flows and (2) are not well suited for exascale computing due to poor data locality and poor FLOP/memory ratios. Exascale computing requires advances in both computer science and numerical algorithms. We propose to research the second requirement and create a new high-order hydrodynamic algorithm that has superior accuracy, excellent data locality, and excellent FLOP/memory ratios. This proposal will impact a broad range of research areas including numerical theory, discrete mathematics, vorticity evolution, gas dynamics, interface instability evolution, turbulent flows, fluid dynamics and shock-driven flows. If successful, the proposed research has the potential to radically transform simulation capabilities and help position the laboratory for computing at the exascale.

  6. Introducing computational thinking through hands-on projects using R with applications to calculus, probability and data analysis

    NASA Astrophysics Data System (ADS)

    Benakli, Nadia; Kostadinov, Boyan; Satyanarayana, Ashwin; Singh, Satyanand

    2017-04-01

    The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students through hands-on computer experiments. These activities have the potential to empower students to learn, create and invent with technology. We present nine computer experiments, and suggest a few more, with applications to calculus, probability and data analysis, which engage computational thinking through simulations, visualizations and data analysis. We are using the free (open-source) statistical programming language R. Our goal is to give a taste of what R offers rather than to present a comprehensive tutorial on the R language. In our experience, these kinds of interactive computer activities can be easily integrated into a smart classroom. Furthermore, these activities do tend to keep students motivated and actively engaged in the process of learning, problem solving and developing a better intuition for understanding complex mathematical concepts.

  7. Influence of savanna fire on Australian monsoon season precipitation and circulation as simulated using a distributed computing environment

    NASA Astrophysics Data System (ADS)

    Lynch, Amanda H.; Abramson, David; Görgen, Klaus; Beringer, Jason; Uotila, Petteri

    2007-10-01

    Fires in the Australian savanna have been hypothesized to affect monsoon evolution, but the hypothesis is controversial and the effects have not been quantified. A distributed computing approach allows the development of a challenging experimental design that permits simultaneous variation of all fire attributes. The climate model simulations are distributed around multiple independent computer clusters in six countries, an approach that has potential for a range of other large simulation applications in the earth sciences. The experiment clarifies that savanna burning can shape the monsoon through two mechanisms. Boundary-layer circulation and large-scale convergence is intensified monotonically through increasing fire intensity and area burned. However, thresholds of fire timing and area are evident in the consequent influence on monsoon rainfall. In the optimal band of late, high intensity fires with a somewhat limited extent, it is possible for the wet season to be significantly enhanced.

  8. High-Performance Computation of Distributed-Memory Parallel 3D Voronoi and Delaunay Tessellation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterka, Tom; Morozov, Dmitriy; Phillips, Carolyn

    2014-11-14

    Computing a Voronoi or Delaunay tessellation from a set of points is a core part of the analysis of many simulated and measured datasets: N-body simulations, molecular dynamics codes, and LIDAR point clouds are just a few examples. Such computational geometry methods are common in data analysis and visualization; but as the scale of simulations and observations surpasses billions of particles, the existing serial and shared-memory algorithms no longer suffice. A distributed-memory scalable parallel algorithm is the only feasible approach. The primary contribution of this paper is a new parallel Delaunay and Voronoi tessellation algorithm that automatically determines which neighbor points need to be exchanged among the subdomains of a spatial decomposition. Other contributions include periodic and wall boundary conditions, comparison of our method against two popular serial libraries, and application to numerous science datasets.
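
    For readers unfamiliar with these tessellations, the following is a small serial illustration using SciPy; it shows what is being computed, not the distributed-memory algorithm contributed by the paper. The random point set stands in for simulation particles.

    ```python
    import numpy as np
    from scipy.spatial import Delaunay, Voronoi

    # Serial illustration of the tessellations the paper parallelizes: build the
    # Delaunay triangulation and Voronoi diagram of a small 3D point set.
    rng = np.random.default_rng(42)
    points = rng.random((200, 3))     # e.g. particle positions from a simulation

    tri = Delaunay(points)
    vor = Voronoi(points)

    print("Delaunay simplices (tetrahedra):", tri.simplices.shape[0])
    print("Voronoi vertices:", vor.vertices.shape[0])

    # A distributed-memory version (as in the paper) would decompose the domain
    # into blocks, tessellate each block locally, and exchange the neighbor points
    # needed to make cells near block boundaries correct.
    ```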

  9. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    NASA Astrophysics Data System (ADS)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks at micron to tens-of-nanometer resolution and is quickly turning into a new technology for studying the petrophysical properties of materials. An important step is the upscaling of these properties, because micron or sub-micron resolution imaging can only be performed on samples of millimeter scale or smaller. We present here a recently developed computational workflow for the analysis of microstructures, including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at the micro- to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples as well as biological and food science materials. We have also applied the technique to high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big data problem of analyzing petrophysical properties and their subsequent upscaling. We discuss the following challenges: 1) characterization of microtomography for extremely large data sets - our current capability; 2) computational fluid dynamics simulations at pore scale for permeability estimation - methods, computing cost and accuracy; 3) solid mechanical computations at pore scale for estimating elasto-plastic properties - computational stability, cost, and efficiency; 4) extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big data tool for multi-scale scientific and engineering problems.
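
    As a schematic of what real-space renormalization of a segmented image looks like, the sketch below repeatedly coarse-grains a binary pore/solid volume with a 2x2x2 majority rule. It is a generic stand-in under assumed parameters (grid size, porosity, blocking rule), not the authors' percolation-based upscaling procedure.

    ```python
    import numpy as np

    # Generic real-space renormalization of a binary pore/solid image: repeatedly
    # coarse-grain 2x2x2 blocks, keeping a block "pore" if most of its voxels are
    # pore. This is only a schematic stand-in for the percolation-based upscaling
    # referred to in the abstract.
    def coarse_grain(img: np.ndarray) -> np.ndarray:
        nx, ny, nz = (s // 2 for s in img.shape)
        blocks = img[:2 * nx, :2 * ny, :2 * nz].reshape(nx, 2, ny, 2, nz, 2)
        return (blocks.mean(axis=(1, 3, 5)) >= 0.5).astype(np.uint8)

    rng = np.random.default_rng(1)
    pores = (rng.random((128, 128, 128)) < 0.35).astype(np.uint8)  # 35% porosity
    level = pores
    while min(level.shape) >= 2:
        print("grid", level.shape, "porosity", level.mean().round(3))
        level = coarse_grain(level)
    ```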

  10. Accelerating Science with Generative Adversarial Networks: An Application to 3D Particle Showers in Multilayer Calorimeters

    NASA Astrophysics Data System (ADS)

    Paganini, Michela; de Oliveira, Luke; Nachman, Benjamin

    2018-01-01

    Physicists at the Large Hadron Collider (LHC) rely on detailed simulations of particle collisions to build expectations of what experimental data may look like under different theoretical modeling assumptions. Petabytes of simulated data are needed to develop analysis techniques, though they are expensive to generate using existing algorithms and computing resources. The modeling of detectors and the precise description of particle cascades as they interact with the material in the calorimeter are the most computationally demanding steps in the simulation pipeline. We therefore introduce a deep neural network-based generative model to enable high-fidelity, fast, electromagnetic calorimeter simulation. There are still challenges for achieving precision across the entire phase space, but our current solution can reproduce a variety of particle shower properties while achieving speedup factors of up to 100,000×. This opens the door to a new era of fast simulation that could save significant computing time and disk space, while extending the reach of physics searches and precision measurements at the LHC and beyond.
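
    To make the generative-model idea concrete, here is a minimal generator/discriminator pair in PyTorch that treats a shower as a flattened image. The layer sizes, latent dimension, and the idea of flattening three calorimeter layers are assumptions for illustration; the paper's architecture (locally connected layers, energy conditioning, etc.) is considerably richer.

    ```python
    import torch
    import torch.nn as nn

    # Minimal GAN building blocks for generating flattened "shower images".
    # Sizes are illustrative; the paper's model is considerably richer.
    LATENT_DIM, IMAGE_DIM = 64, 3 * 96  # e.g. 3 calorimeter layers flattened

    class Generator(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(LATENT_DIM, 256), nn.ReLU(),
                nn.Linear(256, IMAGE_DIM), nn.ReLU(),  # energies are non-negative
            )

        def forward(self, z):
            return self.net(z)

    class Discriminator(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(IMAGE_DIM, 256), nn.LeakyReLU(0.2),
                nn.Linear(256, 1),  # real/fake logit
            )

        def forward(self, x):
            return self.net(x)

    G, D = Generator(), Discriminator()
    z = torch.randn(8, LATENT_DIM)           # a batch of latent noise vectors
    fake_showers = G(z)                       # 8 synthetic showers
    print(fake_showers.shape, D(fake_showers).shape)
    ```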

  11. Exascale computing and what it means for shock physics

    NASA Astrophysics Data System (ADS)

    Germann, Timothy

    2015-06-01

    The U.S. Department of Energy is preparing to launch an Exascale Computing Initiative, to address the myriad challenges required to deploy and effectively utilize an exascale-class supercomputer (i.e., one capable of performing 1018 operations per second) in the 2023 timeframe. Since physical (power dissipation) requirements limit clock rates to at most a few GHz, this will necessitate the coordination of on the order of a billion concurrent operations, requiring sophisticated system and application software, and underlying mathematical algorithms, that may differ radically from traditional approaches. Even at the smaller workstation or cluster level of computation, the massive concurrency and heterogeneity within each processor will impact computational scientists. Through the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx), we have initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. In my talk, I will discuss these challenges, and what it will mean for exascale-era electronic structure, molecular dynamics, and engineering-scale simulations of shock-compressed condensed matter. In particular, we anticipate that the emerging hierarchical, heterogeneous architectures can be exploited to achieve higher physical fidelity simulations using adaptive physics refinement. This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research.

  12. Teaching Animal Physiology: A 12-Year Experience Transitioning from a Classical to Interactive Approach with Continual Assessment and Computer Alternatives

    ERIC Educational Resources Information Center

    Kaisarevic, Sonja N.; Andric, Silvana A.; Kostic, Tatjana S.

    2017-01-01

    In response to the Bologna Declaration and contemporary trends in Animal Physiology education, the Animal Physiology course at the Faculty of Sciences, University of Novi Sad, Serbia, has evolved over a 12-year period (2001-2012): from a classical two-semester course toward a one-semester course utilizing computer simulations of animal…

  13. Computational cost of two alternative formulations of Cahn-Hilliard equations

    NASA Astrophysics Data System (ADS)

    Paszyński, Maciej; Gurgul, Grzegorz; Łoś, Marcin; Szeliga, Danuta

    2018-05-01

    In this paper we propose two formulations of the Cahn-Hilliard equations, which have several applications in cancer growth modeling and material science phase-field simulations. The first formulation uses one C4 partial differential equation (PDE); the second uses two C2 PDEs. Finally, we compare the computational costs of direct solvers for both formulations, using the refined isogeometric analysis (rIGA) approach.
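
    For orientation, one common way to write the Cahn-Hilliard equation is as a single fourth-order PDE or, equivalently, split into two coupled second-order PDEs; the symbols below are standard phase-field notation and are not copied from the paper, so the correspondence to its exact formulations is assumed.

    ```latex
    % Primal form: a single fourth-order PDE for the concentration c(x,t)
    \frac{\partial c}{\partial t}
      = \nabla \cdot \Big( M \, \nabla \big( f'(c) - \kappa\, \Delta c \big) \Big),
    \qquad f(c) \text{ a double-well free energy, } M \text{ a mobility.}

    % Split (mixed) form: two coupled second-order PDEs in (c, \mu)
    \mu = f'(c) - \kappa\, \Delta c, \qquad
    \frac{\partial c}{\partial t} = \nabla \cdot \big( M \, \nabla \mu \big).
    ```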

  14. Natural Tasking of Robots Based on Human Interaction Cues

    DTIC Science & Technology

    2005-06-01

    MIT. • Matthew Marjanovic, researcher, ITA Software. • Brian Scasselatti, Assistant Professor of Computer Science, Yale. • Matthew Williamson...2004. 25 [74] Charlie C. Kemp. Shoes as a platform for vision. 7th IEEE International Symposium on Wearable Computers, 2004. [75] Matthew Marjanovic...meso: Simulated muscles for a humanoid robot. Presentation for Humanoid Robotics Group, MIT AI Lab, August 2001. [76] Matthew J. Marjanovic. Teaching

  15. Molecular dynamics simulations through GPU video games technologies

    PubMed Central

    Loukatou, Styliani; Papageorgiou, Louis; Fakourelis, Paraskevas; Filntisi, Arianna; Polychronidou, Eleftheria; Bassis, Ioannis; Megalooikonomou, Vasileios; Makałowski, Wojciech; Vlachakis, Dimitrios; Kossida, Sophia

    2016-01-01

    Bioinformatics is the scientific field that focuses on the application of computer technology to the management of biological information. Over the years, bioinformatics applications have been used to store, process and integrate biological and genetic information, using a wide range of methodologies. One of the techniques most widely used to understand the physical movements of atoms and molecules is molecular dynamics (MD). MD is an in silico method to simulate the physical motions of atoms and molecules under certain conditions. It has become a strategic technique and now plays a key role in many areas of the exact sciences, such as chemistry, biology, physics and medicine. Due to their complexity, MD calculations can require enormous amounts of computer memory and time, and therefore their execution has been a big problem. Despite the huge computational cost, molecular dynamics simulations have traditionally been run on computers built around the central processing unit (CPU). Graphics processing unit (GPU) computing technology was originally designed to improve video games by rapidly creating and displaying images in a frame buffer such as a screen. The hybrid GPU-CPU implementation, combined with parallel computing, is a novel technology to perform a wide range of calculations. GPUs have been proposed and used to accelerate many scientific computations including MD simulations. Herein, we describe the methodologies developed initially for video games and how they are now applied in MD simulations. PMID:27525251
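
    The per-step arithmetic that GPU MD codes parallelize across atoms is essentially an integrator update such as velocity Verlet. The NumPy sketch below shows one such step under an assumed toy force law; it is not drawn from the article and real MD packages replace the placeholder force with pairwise interaction evaluations.

    ```python
    import numpy as np

    # One velocity-Verlet time step for N particles. GPU MD codes parallelize
    # exactly this kind of per-atom arithmetic (plus the force evaluation) across
    # thousands of threads. The harmonic force used here is only a placeholder.
    def forces(x):
        return -x  # toy force: every particle in a harmonic well (k = 1)

    def velocity_verlet(x, v, dt, mass=1.0):
        f = forces(x)
        v_half = v + 0.5 * dt * f / mass
        x_new = x + dt * v_half
        v_new = v_half + 0.5 * dt * forces(x_new) / mass
        return x_new, v_new

    rng = np.random.default_rng(0)
    x = rng.normal(size=(1000, 3))   # positions of 1000 particles
    v = np.zeros_like(x)
    for _ in range(100):
        x, v = velocity_verlet(x, v, dt=0.01)
    print("mean kinetic energy:", 0.5 * (v ** 2).sum(axis=1).mean())
    ```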

  16. Stretching the Traditional Notion of Experiment in Computing: Explorative Experiments.

    PubMed

    Schiaffonati, Viola

    2016-06-01

    Experimentation represents today a 'hot' topic in computing. While experiments made with the support of computers, such as computer simulations, have received increasing attention from philosophers of science and technology, questions such as "what does it mean to do experiments in computer science and engineering and what are their benefits?" emerged only recently as central in the debate over the disciplinary status of computing. In this work we aim at showing, also by means of paradigmatic examples, how the traditional notion of the controlled experiment should be revised to take into account a part of the experimental practice in computing, along the lines of experimentation as exploration. Taking inspiration from the discussion on exploratory experimentation in the philosophy of science (experimentation that is not theory-driven), we advance the idea of explorative experiments that, although not new, can contribute to enlarging the debate about the nature and role of experimental methods in computing. In order to further refine this concept we recast explorative experiments as socio-technical experiments that test new technologies in their socio-technical contexts. We suggest that, when experiments are explorative, control should be understood in an a posteriori form, in opposition to the a priori form that usually takes place in traditional experimental contexts.

  17. Teaching Strategies for Using Projected Images to Develop Conceptual Understanding: Exploring Discussion Practices in Computer Simulation and Static Image-Based Lessons

    ERIC Educational Resources Information Center

    Price, Norman T.

    2013-01-01

    The availability and sophistication of visual display images, such as simulations, for use in science classrooms has increased exponentially; however, it can be difficult for teachers to use these images to encourage and engage active student thinking. There is a need to describe flexible discussion strategies that use visual media to engage active…

  18. Standardized input for Hanford environmental impact statements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napier, B.A.

    1981-05-01

    Models and computer programs for simulating the environmental behavior of radionuclides in the environment and the resulting radiation dose to humans have been developed over the years by the Environmental Analysis Section staff, Ecological Sciences Department at the Pacific Northwest Laboratory (PNL). Methodologies have evolved for calculating radiation doses from many exposure pathways for any type of release mechanism. Depending on the situation or process being simulated, different sets of computer programs, assumptions, and modeling techniques must be used. This report is a compilation of recommended computer programs and necessary input information for use in calculating doses to members of the general public for environmental impact statements prepared for DOE activities to be conducted on or near the Hanford Reservation.

  19. Applications of hybrid and digital computation methods in aerospace-related sciences and engineering. [problem solving methods at the University of Houston

    NASA Technical Reports Server (NTRS)

    Huang, C. J.; Motard, R. L.

    1978-01-01

    The computing equipment in the engineering systems simulation laboratory of the Houston University Cullen College of Engineering is described and its advantages are summarized. The application of computer techniques in aerospace-related research in psychology and in chemical, civil, electrical, industrial, and mechanical engineering is described in abstracts of 84 individual projects and in reprints of published reports. Research supports programs in acoustics, energy technology, systems engineering, and environmental management as well as aerospace engineering.

  20. Dawn: A Simulation Model for Evaluating Costs and Tradeoffs of Big Data Science Architectures

    NASA Astrophysics Data System (ADS)

    Cinquini, L.; Crichton, D. J.; Braverman, A. J.; Kyo, L.; Fuchs, T.; Turmon, M.

    2014-12-01

    In many scientific disciplines, scientists and data managers are bracing for an upcoming deluge of big data volumes, which will increase the size of current data archives by a factor of 10-100 times. For example, the next Climate Model Inter-comparison Project (CMIP6) will generate a global archive of model output of approximately 10-20 petabytes, while the next generation of NASA decadal Earth Observing instruments is expected to collect tens of gigabytes per day. In radio astronomy, the Square Kilometre Array (SKA) will collect data in the exabytes-per-day range, of which (after reduction and processing) around 1.5 exabytes per year will be stored. The effective and timely processing of these enormous data streams will require the design of new data reduction and processing algorithms, new system architectures, and new techniques for evaluating computation uncertainty. Yet at present no general software tool or framework exists that allows system architects to model their expected data processing workflow and determine the network, computational and storage resources needed to prepare their data for scientific analysis. In order to fill this gap, at NASA/JPL we have been developing a preliminary model named DAWN (Distributed Analytics, Workflows and Numerics) for simulating arbitrarily complex workflows composed of any number of data processing and movement tasks. The model can be configured with a representation of the problem at hand (the data volumes, the processing algorithms, the available computing and network resources), and is able to evaluate tradeoffs between different possible workflows based on several estimators: overall elapsed time, separate computation and transfer times, resulting uncertainty, and others. So far, we have been applying DAWN to analyze architectural solutions for four different use cases from distinct science disciplines: climate science, astronomy, hydrology and a generic cloud computing use case. This talk will present preliminary results and discuss how DAWN can be evolved into a powerful tool for designing system architectures for data-intensive science.
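
    To show the kind of estimate such a workflow model produces, here is a toy cost estimator for a linear pipeline of processing and transfer tasks. The task names, data volumes, and rates are hypothetical, and this sketch is only in the spirit of a DAWN-like model, not its actual implementation.

    ```python
    from dataclasses import dataclass

    # Toy estimator in the spirit of a workflow model like DAWN: given a linear
    # chain of tasks, estimate compute time, transfer time, and total elapsed
    # time. Task names and rates below are hypothetical.
    @dataclass
    class Task:
        name: str
        input_tb: float        # data volume entering the task, in TB
        flops: float           # operations required
        compute_tflops: float  # sustained compute rate available, in TFLOP/s
        link_gbps: float       # network bandwidth for moving the input, in Gb/s

        def transfer_s(self) -> float:
            return self.input_tb * 8e3 / self.link_gbps   # TB -> Tb -> seconds

        def compute_s(self) -> float:
            return self.flops / (self.compute_tflops * 1e12)

    pipeline = [
        Task("regrid",  input_tb=2.0, flops=5e15, compute_tflops=10, link_gbps=40),
        Task("analyze", input_tb=0.5, flops=2e16, compute_tflops=50, link_gbps=40),
        Task("reduce",  input_tb=0.1, flops=1e14, compute_tflops=5,  link_gbps=10),
    ]

    transfer = sum(t.transfer_s() for t in pipeline)
    compute = sum(t.compute_s() for t in pipeline)
    print(f"transfer {transfer/3600:.2f} h, compute {compute/3600:.2f} h, "
          f"total {(transfer + compute)/3600:.2f} h")
    ```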

  1. e-Science and data management resources on the Web.

    PubMed

    Gore, Sally A

    2011-01-01

    The way research is conducted has changed over time, from simple experiments to computer modeling and simulation, from individuals working in isolated laboratories to global networks of researchers collaborating on a single topic. Often, this new paradigm results in the generation of staggering amounts of data. The intensive use of data and the existence of networks of researchers characterize e-Science. The role of libraries and librarians in e-Science has been a topic of interest for some time now. This column looks at tools, resources, and projects that demonstrate successful collaborations between libraries and researchers in e-Science.

  2. Web-based multimedia courseware for emergency cardiac patient management simulations.

    PubMed

    Ambrosiadou, V; Compton, T; Panchal, T; Polovina, S

    2000-01-01

    This is a multidisciplinary, inter-departmental/faculty project between the departments of computer science; electronic, communications and electrical engineering; and nursing and paramedic sciences. The objective is to develop a web-based multimedia front end to existing simulations of cardiac emergency scenarios. It will be used firstly in the teaching of nurses. The University of Hertfordshire is the only university in Britain using simulations of cardiac emergency scenarios for nurse and paramedic science education; this project will therefore add the multimedia dimension to distributed courses over the web and will assess the improvement in the educational process. The use of network and multimedia technologies provides interactive learning, immediate feedback to students' responses, individually tailored instruction, objective testing and entertaining delivery. The end product of this project will serve as interactive material to enhance experiential learning for nursing students using the simulations of cardiac emergency scenarios. The emergency treatment simulations have been developed using VisSim and may be compiled as C code. The objective of the project is to provide a web-based, user-friendly multimedia interface in order to demonstrate the way in which patients may be managed in critical situations by applying advanced technological equipment and drug administration. The user will then be able to better appreciate the concepts involved by running the VisSim simulations. The evaluation group for the proposed software will be the Department of Nursing and Paramedic Sciences. About 200 nurses use simulations every year for training purposes as part of their course requirements.

  3. The (human) science of medical virtual learning environments

    PubMed Central

    Stone, Robert J.

    2011-01-01

    The uptake of virtual simulation technologies in both military and civilian surgical contexts has been both slow and patchy. The failure of the virtual reality community in the 1990s and early 2000s to deliver affordable and accessible training systems stems not only from an obsessive quest to develop the ‘ultimate’ in so-called ‘immersive’ hardware solutions, from head-mounted displays to large-scale projection theatres, but also from a comprehensive lack of attention to the needs of the end users. While many still perceive the science of simulation to be defined by technological advances, such as computing power, specialized graphics hardware, advanced interactive controllers, displays and so on, the true science underpinning simulation—the science that helps to guarantee the transfer of skills from the simulated to the real—is that of human factors, a well-established discipline that focuses on the abilities and limitations of the end user when designing interactive systems, as opposed to the more commercially explicit components of technology. Based on three surgical simulation case studies, the importance of a human factors approach to the design of appropriate simulation content and interactive hardware for medical simulation is illustrated. The studies demonstrate that it is unnecessary to pursue real-world fidelity in all instances in order to achieve psychological fidelity—the degree to which the simulated tasks reproduce and foster knowledge, skills and behaviours that can be reliably transferred to real-world training applications. PMID:21149363

  4. A SLAM II simulation model for analyzing space station mission processing requirements

    NASA Technical Reports Server (NTRS)

    Linton, D. G.

    1985-01-01

    Space station mission processing is modeled via the SLAM 2 simulation language on an IBM 4381 mainframe and an IBM PC microcomputer with 620K RAM, two double-sided disk drives and an 8087 coprocessor chip. Using a time phased mission (payload) schedule and parameters associated with the mission, orbiter (space shuttle) and ground facility databases, estimates for ground facility utilization are computed. Simulation output associated with the science and applications database is used to assess alternative mission schedules.

  5. Decision Support Tool for Deep Energy Efficiency Retrofits in DoD Installations

    DTIC Science & Technology

    2014-01-01

    representations (HDMR). Chemical Engineering Science, 57, 4445–4460. 2. Sobol', I., 2001. Global sensitivity indices for nonlinear mathematical...models and their Monte Carlo estimates. Mathematics and Computers in Simulation, 55, 271–280. 3. Sobol, I. and Kucherenko, S., 2009. Derivative based...representations (HDMR). Chemical Engineering Science, 57, 4445–4460. 16. Sobol', I., 2001. Global sensitivity indices for nonlinear mathematical models and

  6. VERCE: a productive e-Infrastructure and e-Science environment for data-intensive seismology research

    NASA Astrophysics Data System (ADS)

    Vilotte, J. P.; Atkinson, M.; Spinuso, A.; Rietbrock, A.; Michelini, A.; Igel, H.; Frank, A.; Carpené, M.; Schwichtenberg, H.; Casarotti, E.; Filgueira, R.; Garth, T.; Germünd, A.; Klampanos, I.; Krause, A.; Krischer, L.; Leong, S. H.; Magnoni, F.; Matser, J.; Moguilny, G.

    2015-12-01

    Seismology addresses both fundamental problems in understanding the Earth's internal wave sources and structures and augmented societal applications, like earthquake and tsunami hazard assessment and risk mitigation, and puts a premium on open data accessible through the federated digital seismological networks. The VERCE project, "Virtual Earthquake and seismology Research Community e-science environment in Europe", has initiated a virtual research environment to support complex orchestrated workflows combining state-of-the-art wave simulation codes and data analysis tools on distributed computing and data infrastructures (DCIs), along with multiple sources of observational data and new capabilities to combine simulation results with observational data. The VERCE Science Gateway provides a view of all the available resources, supporting collaboration with shared data and methods, with data access controls. The mapping to DCIs handles identity management, authority controls, transformations between representations and controls, and access to resources. The framework for computational science that provides simulation codes, like SPECFEM3D, democratizes their use by getting data from multiple sources, managing Earth models and meshes, distilling them as input data, and capturing results with metadata. The dispel4py data-intensive framework allows for developing data-analysis applications using Python and the ObsPy library, which can be executed on different DCIs. A set of tools allows coupling with seismology and external data services. Provenance-driven tools validate results and show relationships between data to facilitate method improvement. Lessons learned from VERCE training lead us to conclude that solid-Earth scientists could make significant progress by using the VERCE e-science environment. VERCE has already contributed to the European Plate Observation System (EPOS) and is part of the EPOS implementation phase; its cross-disciplinary capabilities are being extended for that implementation phase.
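
    As a flavor of the kind of ObsPy-based analysis step such workflows orchestrate, here is a tiny Python snippet that loads ObsPy's bundled example waveforms and applies a bandpass filter. The filter band is illustrative, and this is not the VERCE/dispel4py pipeline itself.

    ```python
    from obspy import read

    # Tiny example of an ObsPy-based analysis step of the sort data-intensive
    # seismology workflows orchestrate (not the VERCE/dispel4py pipeline itself).
    # read() without arguments loads ObsPy's bundled example waveforms.
    stream = read()
    stream.detrend("demean")
    stream.filter("bandpass", freqmin=1.0, freqmax=10.0)  # Hz; values illustrative
    for trace in stream:
        print(trace.id, trace.stats.sampling_rate, trace.data.max())
    ```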

  7. Sandia National Laboratories: Advanced Simulation and Computing

    Science.gov Websites


  8. Science Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1987

    1987-01-01

    Contains 21 articles ranging from instructional experiments to topical information. Deals with investigation of plant rust diseases, using computers to teach biology, plant roots, a biotechnology curriculum, the corrosion of oxides, electrochemical simulations, the Reimer-Tiemann reaction, the oxidation of aldehydes, and the extraction of iodine…

  9. The effectiveness of using computer simulated experiments on junior high students' understanding of the volume displacement concept

    NASA Astrophysics Data System (ADS)

    Choi, Byung-Soon; Gennaro, Eugene

    Several researchers have suggested that the computer holds much promise as a tool for science teachers for use in their classrooms (Bork, 1979; Lunetta & Hofstein, 1981). It also has been said that there needs to be more research in determining the effectiveness of computer software (Tinker, 1983). This study compared the effectiveness of microcomputer-simulated experiences with that of parallel instruction involving hands-on laboratory experiences for teaching the concept of volume displacement to junior high school students. This study also assessed the differential effect on students' understanding of the volume displacement concept using the sex of the students as another independent variable. In addition, it compared the degree of retention, after 45 days, of both treatment groups. It was found that computer-simulated experiences were as effective as hands-on laboratory experiences, and that males, having had hands-on laboratory experiences, performed better on the posttest than females having had the hands-on laboratory experiences. There were no significant differences in performance when comparing males with females using the computer simulation in the learning of the displacement concept. This study also showed that there were no significant differences in the retention levels when the retention scores of the computer simulation groups were compared to those that had the hands-on laboratory experiences. However, an ANOVA of the retention test scores revealed that males in both treatment conditions retained knowledge of volume displacement better than females.

  10. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  11. Leveraging e-Science infrastructure for electrochemical research.

    PubMed

    Peachey, Tom; Mashkina, Elena; Lee, Chong-Yong; Enticott, Colin; Abramson, David; Bond, Alan M; Elton, Darrell; Gavaghan, David J; Stevenson, Gareth P; Kennedy, Gareth F

    2011-08-28

    As in many scientific disciplines, modern chemistry involves a mix of experimentation and computer-supported theory. Historically, these skills have been provided by different groups, and range from traditional 'wet' laboratory science to advanced numerical simulation. Increasingly, progress is made by global collaborations, in which new theory may be developed in one part of the world and applied and tested in the laboratory elsewhere. e-Science, or cyber-infrastructure, underpins such collaborations by providing a unified platform for accessing scientific instruments, computers and data archives, and collaboration tools. In this paper we discuss the application of advanced e-Science software tools to electrochemistry research performed in three different laboratories--two at Monash University in Australia and one at the University of Oxford in the UK. We show that software tools that were originally developed for a range of application domains can be applied to electrochemical problems, in particular Fourier voltammetry. Moreover, we show that, by replacing ad-hoc manual processes with e-Science tools, we obtain more accurate solutions automatically.

  12. The computational challenges of Earth-system science.

    PubMed

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mattsson, Ann E.

    Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia's capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data are hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations, but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudo-potential computational codes. Using the tools discussed in SAND2012-7389, we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for Molybdenum.

  14. Patient flow within UK emergency departments: a systematic review of the use of computer simulation modelling methods

    PubMed Central

    Mohiuddin, Syed; Busby, John; Savović, Jelena; Richards, Alison; Northstone, Kate; Hollingworth, William; Donovan, Jenny L; Vasilakis, Christos

    2017-01-01

    Objectives Overcrowding in the emergency department (ED) is common in the UK as in other countries worldwide. Computer simulation is one approach used for understanding the causes of ED overcrowding and assessing the likely impact of changes to the delivery of emergency care. However, little is known about the usefulness of computer simulation for analysis of ED patient flow. We undertook a systematic review to investigate the different computer simulation methods and their contribution for analysis of patient flow within EDs in the UK. Methods We searched eight bibliographic databases (MEDLINE, EMBASE, COCHRANE, WEB OF SCIENCE, CINAHL, INSPEC, MATHSCINET and ACM DIGITAL LIBRARY) from date of inception until 31 March 2016. Studies were included if they used a computer simulation method to capture patient progression within the ED of an established UK National Health Service hospital. Studies were summarised in terms of simulation method, key assumptions, input and output data, conclusions drawn and implementation of results. Results Twenty-one studies met the inclusion criteria. Of these, 19 used discrete event simulation and 2 used system dynamics models. The purpose of many of these studies (n=16; 76%) centred on service redesign. Seven studies (33%) provided no details about the ED being investigated. Most studies (n=18; 86%) used specific hospital models of ED patient flow. Overall, the reporting of underlying modelling assumptions was poor. Nineteen studies (90%) considered patient waiting or throughput times as the key outcome measure. Twelve studies (57%) reported some involvement of stakeholders in the simulation study. However, only three studies (14%) reported on the implementation of changes supported by the simulation. Conclusions We found that computer simulation can provide a means to pretest changes to ED care delivery before implementation in a safe and efficient manner. However, the evidence base is small and poorly developed. There are some methodological, data, stakeholder, implementation and reporting issues, which must be addressed by future studies. PMID:28487459
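
    Since most of the reviewed studies used discrete event simulation, the sketch below shows a minimal patient-flow model in Python with the SimPy library: patients arrive at random, queue for a shared triage resource, and their time in the system is reported. The arrival rate, service time, and capacity are assumptions for illustration, not parameters from any reviewed study.

    ```python
    import random
    import simpy

    # Minimal discrete event simulation of ED-style patient flow: Poisson-like
    # arrivals queue for a shared triage resource. All rates are illustrative.
    def patient(env, name, triage, rng):
        arrive = env.now
        with triage.request() as req:
            yield req                                   # wait for a free clinician
            yield env.timeout(rng.expovariate(1 / 20.0))  # ~20 min assessment
        print(f"{name} finished after {env.now - arrive:.1f} min")

    def arrivals(env, triage, rng):
        i = 0
        while True:
            yield env.timeout(rng.expovariate(1 / 10.0))  # ~10 min between arrivals
            i += 1
            env.process(patient(env, f"patient-{i}", triage, rng))

    rng = random.Random(1)
    env = simpy.Environment()
    triage = simpy.Resource(env, capacity=2)
    env.process(arrivals(env, triage, rng))
    env.run(until=8 * 60)   # simulate an 8-hour shift
    ```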

  15. SIM_EXPLORE: Software for Directed Exploration of Complex Systems

    NASA Technical Reports Server (NTRS)

    Burl, Michael; Wang, Esther; Enke, Brian; Merline, William J.

    2013-01-01

    Physics-based numerical simulation codes are widely used in science and engineering to model complex systems that would be infeasible to study otherwise. While such codes may provide the highest-fidelity representation of system behavior, they are often so slow to run that insight into the system is limited. Trying to understand the effects of inputs on outputs by conducting an exhaustive grid-based sweep over the input parameter space is simply too time-consuming. An alternative approach called "directed exploration" has been developed to harvest information from numerical simulators more efficiently. The basic idea is to employ active learning and supervised machine learning to choose cleverly at each step which simulation trials to run next, based on the results of previous trials. SIM_EXPLORE is a new computer program that uses directed exploration to efficiently explore complex systems represented by numerical simulations. The software sequentially identifies and runs simulation trials that it believes will be most informative given the results of previous trials. The results of new trials are incorporated into the software's model of the system behavior. The updated model is then used to pick the next round of new trials. This process, implemented as a closed-loop system wrapped around existing simulation code, provides a means to improve the speed and efficiency with which a set of simulations can yield scientifically useful results. The software focuses on the case in which the feedback from the simulation trials is binary-valued, i.e., the learner is only informed of the success or failure of the simulation trial to produce a desired output. The software offers a number of choices for the supervised learning algorithm (the method used to model the system behavior given the results so far) and a number of choices for the active learning strategy (the method used to choose which new simulation trials to run given the current behavior model). The software also makes use of the LEGION distributed computing framework to leverage the power of a set of compute nodes. The approach has been demonstrated on a planetary science application in which numerical simulations are used to study the formation of asteroid families.
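
    The closed loop described above can be sketched in a few lines: fit a surrogate classifier on the trials run so far and pick the next candidate input where the predicted success probability is most uncertain. The stub "simulator", the logistic-regression surrogate, and the uncertainty criterion are illustrative choices, not the specific algorithms offered by SIM_EXPLORE.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Minimal directed-exploration loop: a stub "simulator" returns success/failure,
    # a surrogate classifier models the outcome, and the next trial is the candidate
    # input whose predicted success probability is closest to 0.5 (most uncertain).
    def run_simulation(x):
        return int(x[0] + 0.5 * x[1] > 1.0)   # stand-in for an expensive simulation

    rng = np.random.default_rng(0)
    candidates = rng.random((500, 2))                       # untried input settings
    tried_x = [np.array([0.1, 0.1]), np.array([0.9, 0.9])]  # seed trials (one of each class)
    tried_y = [run_simulation(x) for x in tried_x]

    for _ in range(20):
        model = LogisticRegression().fit(np.array(tried_x), np.array(tried_y))
        probs = model.predict_proba(candidates)[:, 1]
        next_idx = int(np.argmin(np.abs(probs - 0.5)))      # most uncertain candidate
        x_next = candidates[next_idx]
        candidates = np.delete(candidates, next_idx, axis=0)
        tried_x.append(x_next)
        tried_y.append(run_simulation(x_next))

    print("trials run:", len(tried_x), "successes:", sum(tried_y))
    ```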

  16. Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  17. Infrastructure for Training and Partnershipes: California Water and Coastal Ocean Resources

    NASA Technical Reports Server (NTRS)

    Siegel, David A.; Dozier, Jeffrey; Gautier, Catherine; Davis, Frank; Dickey, Tommy; Dunne, Thomas; Frew, James; Keller, Arturo; MacIntyre, Sally; Melack, John

    2000-01-01

    The purpose of this project was to advance the existing ICESS/Bren School computing infrastructure to allow scientists, students, and research trainees the opportunity to interact with environmental data and simulations in near-real time. Improvements made with the funding from this project have helped to strengthen the research efforts within both units, fostered graduate research training, and helped fortify partnerships with government and industry. With this funding, we were able to expand our computational environment in which computer resources, software, and data sets are shared by ICESS/Bren School faculty researchers in all areas of Earth system science. All of the graduate and undergraduate students associated with the Donald Bren School of Environmental Science and Management and the Institute for Computational Earth System Science have benefited from the infrastructure upgrades accomplished by this project. Additionally, the upgrades fostered a significant number of research projects (attached is a list of the projects that benefited from the upgrades). As originally proposed, funding for this project provided the following infrastructure upgrades: 1) a modern file management system capable of interoperating UNIX and NT file systems that can scale to 6.7 TB; 2) a Qualstar 40-slot tape library with two AIT tape drives and Legato Networker backup/archive software; 3) previously unavailable import/export capability for data sets on Zip, Jaz, DAT, 8mm, CD, and DLT media, in addition to a 622 Mb/s Internet 2 connection; 4) network switches capable of 100 Mbps to 128 desktop workstations; 5) a Portable Batch System (PBS) computational task scheduler; and 6) two Compaq/Digital Alpha XP1000 compute servers, each with 1.5 GB of RAM, along with an SGI Origin 2000 (purchased partially using funds from this project along with funding from various other sources) to be used for very large computations, as required for simulation of mesoscale meteorology or climate.

  18. OpenWorm: an open-science approach to modeling Caenorhabditis elegans.

    PubMed

    Szigeti, Balázs; Gleeson, Padraig; Vella, Michael; Khayrulin, Sergey; Palyanov, Andrey; Hokanson, Jim; Currie, Michael; Cantarelli, Matteo; Idili, Giovanni; Larson, Stephen

    2014-01-01

    OpenWorm is an international collaboration with the aim of understanding how the behavior of Caenorhabditis elegans (C. elegans) emerges from its underlying physiological processes. The project has developed a modular simulation engine to create computational models of the worm. The modularity of the engine makes it possible to easily modify the model, incorporate new experimental data and test hypotheses. The modeling framework incorporates both biophysical neuronal simulations and a novel fluid-dynamics-based soft-tissue simulation for physical environment-body interactions. The project's open-science approach is aimed at overcoming the difficulties of integrative modeling within a traditional academic environment. In this article the rationale is presented for creating the OpenWorm collaboration, the tools and resources developed thus far are outlined and the unique challenges associated with the project are discussed.

  19. Modeling Advance Life Support Systems

    NASA Technical Reports Server (NTRS)

    Pitts, Marvin; Sager, John; Loader, Coleen; Drysdale, Alan

    1996-01-01

    Activities this summer consisted of two projects that involved computer simulation of bioregenerative life support systems for space habitats. Students in the Space Life Science Training Program (SLSTP) used the simulation, Space Station, to learn about relationships between humans, fish, plants, and microorganisms in a closed environment. One student completed a six-week project to modify the simulation by converting the microbes from anaerobic to aerobic, and then balancing the simulation's life support system. A detailed computer simulation of a closed lunar station using bioregenerative life support was attempted, but not enough was known about system constraints and constants in plant growth, bioreactor design for space habitats, and food preparation to develop an integrated model with any confidence. Instead of a completed detailed model with broad assumptions concerning the unknown system parameters, a framework for an integrated model was outlined and work begun on plant and bioreactor simulations. The NASA sponsors and the summer Fellow were satisfied with the progress made during the 10 weeks, and we have planned future cooperative work.

  20. General purpose simulation system of the data management system for Space Shuttle mission 18

    NASA Technical Reports Server (NTRS)

    Bengtson, N. M.; Mellichamp, J. M.; Smith, O. C.

    1976-01-01

    A simulation program for the flow of data through the Data Management System of Spacelab and Space Shuttle was presented. The science, engineering, command and guidance, navigation and control data were included. The programming language used was General Purpose Simulation System V (OS). The science and engineering data flow was modeled from its origin at the experiments and subsystems to transmission from Space Shuttle. Command data flow was modeled from the point of reception onboard and from the CDMS Control Panel to the experiments and subsystems. The GN&C data flow model handled data between the General Purpose Computer and the experiments and subsystems. Mission 18 was the particular flight chosen for simulation. The general structure of the program is presented, followed by a user's manual. Input data required to make runs are discussed followed by identification of the output statistics. The appendices contain a detailed model configuration, program listing and results.

  1. Simulation and Visualization of Chaos in a Driven Nonlinear Pendulum -- An Aid to Introducing Chaotic Systems in Physics

    NASA Astrophysics Data System (ADS)

    Akpojotor, Godfrey; Ehwerhemuepha, Louis; Amromanoh, Ogheneriobororue

    2013-03-01

    The presence of physical systems whose characteristics change in a seemingly erratic manner gives rise to the study of chaotic systems. The characteristics of these systems are due to their hypersensitivity to changes in initial conditions. In order to understand chaotic systems, some sort of simulation and visualization is pertinent. Consequently, in this work, we have simulated and graphically visualized chaos in a driven nonlinear pendulum as a means of introducing chaotic systems. The results obtained, which highlight the hypersensitivity of the pendulum, are used to discuss the effectiveness of teaching and learning the physics of chaotic systems using Python. This study is one of the many studies under the African Computational Science and Engineering Tour Project (PASET), which is using Python to model, simulate and visualize concepts, laws and phenomena in science and engineering to complement the teaching/learning of theory and experiment.
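
    A minimal Python integration of the driven, damped pendulum of the kind described here is shown below; two trajectories starting a tiny angle apart illustrate the sensitivity to initial conditions. The damping, drive amplitude, and drive frequency are common textbook values, not necessarily those used in the study.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Driven, damped pendulum: theta'' = -q*theta' - sin(theta) + F*cos(omega_d*t).
    # Two trajectories starting 1e-4 rad apart illustrate sensitivity to initial
    # conditions. Parameter values are common textbook choices, not the study's.
    q, F, omega_d = 0.5, 1.2, 2.0 / 3.0

    def rhs(t, y):
        theta, omega = y
        return [omega, -q * omega - np.sin(theta) + F * np.cos(omega_d * t)]

    t_eval = np.linspace(0, 100, 2001)
    sol_a = solve_ivp(rhs, (0, 100), [0.2, 0.0], t_eval=t_eval, rtol=1e-8)
    sol_b = solve_ivp(rhs, (0, 100), [0.2 + 1e-4, 0.0], t_eval=t_eval, rtol=1e-8)

    separation = np.abs(sol_a.y[0] - sol_b.y[0])
    print("initial separation 1e-4 rad, separation at t=100:", separation[-1])
    ```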

  2. Realistic natural atmospheric phenomena and weather effects for interactive virtual environments

    NASA Astrophysics Data System (ADS)

    McLoughlin, Leigh

    Clouds and the weather are important aspects of any natural outdoor scene, but existing dynamic techniques within computer graphics offer only the simplest of cloud representations. The problem that this work addresses is how to provide a means of simulating clouds and weather features, such as precipitation, that is suitable for virtual environments. Techniques for cloud simulation are available within the area of meteorology, but numerical weather prediction systems are computationally expensive, give more numerical accuracy than we require for graphics, and are restricted to the laws of physics. Within computer graphics, we often need to direct and adjust physical features or to bend reality to meet artistic goals, which is a key difference between the subjects of computer graphics and physical science. Pure physically-based simulations, however, evolve their solutions according to pre-set rules and are notoriously difficult to control. The challenge then is for the solution to be computationally lightweight and able to be directed in some measure, while at the same time producing believable results. This work presents a lightweight physically-based cloud simulation scheme that simulates the dynamic properties of cloud formation and weather effects. The system simulates water vapour, cloud water, cloud ice, rain, snow and hail. The water model incorporates control parameters, and the cloud model uses an arbitrary vertical temperature profile, with a tool described to allow the user to define this. The result of this work is that clouds can now be simulated in near real time, complete with precipitation. The temperature profile and tool then provide a means of directing the resulting formation.

  3. A Combined Experimental and Computational Study on the Stability of Nanofluids Containing Metal Organic Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Annapureddy, Harsha Vardhan Reddy; Nune, Satish K.; Motkuri, Radha K.

    2015-01-08

    Computational studies on nanofluids composed of metal organic frameworks (MOFs) were performed using molecular modeling techniques. Grand Canonical Monte Carlo (GCMC) simulations were used to study the adsorption behavior of 1,1,1,3,3-pentafluoropropane (R-245fa) in the MIL-101 MOF at various temperatures. To understand the stability of the nanofluid composed of MIL-101 particles, we performed molecular dynamics simulations to compute potentials of mean force between hypothetical MIL-101 fragments terminated with two different kinds of modulators in R-245fa and water. Our computed potentials of mean force indicate that the MOF particles tend to disperse better in water than in R-245fa. The reasons for this observation were analyzed and discussed. Our results agree with experimental results, indicating that the employed potential models and modeling approaches provide a good and reliable description of the molecular interactions. Work performed by LXD was supported by the U.S. Department of Energy (DOE), Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. Work performed by HVRA, SKN, RKM, and PBM was supported by the Office of Energy Efficiency and Renewable Energy, Geothermal Technologies Program. Pacific Northwest National Laboratory is a multiprogram national laboratory operated for DOE by Battelle.

  4. Artificial Intelligence and Expert Systems.

    ERIC Educational Resources Information Center

    Lawlor, Joseph

    Artificial intelligence (AI) is the field of scientific inquiry concerned with designing machine systems that can simulate human mental processes. The field draws upon theoretical constructs from a wide variety of disciplines, including mathematics, psychology, linguistics, neurophysiology, computer science, and electronic engineering. Some of the…

  5. Trends in computer applications in science assessment

    NASA Astrophysics Data System (ADS)

    Kumar, David D.; Helgeson, Stanley L.

    1995-03-01

    Seven computer applications to science assessment are reviewed. Conventional test administration includes record keeping, grading, and managing test banks. Multiple-choice testing involves forced selection of an answer from a menu, whereas constructed-response testing involves options for students to present their answers within a set standard deviation. Adaptive testing attempts to individualize the test to minimize the number of items and time needed to assess a student's knowledge. Figural response testing assesses science proficiency in pictorial or graphic mode and requires the student to construct a mental image rather than selecting a response from a multiple-choice menu. Simulations have been found useful for performance assessment on a large-scale basis in part because they make it possible to independently specify different aspects of a real experiment. An emerging approach to performance assessment is solution pathway analysis, which permits the analysis of the steps a student takes in solving a problem. Virtually all computer-based testing systems improve the quality and efficiency of record keeping and data analysis.

  6. An accurate, compact and computationally efficient representation of orbitals for quantum Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Luo, Ye; Esler, Kenneth; Kent, Paul; Shulenburger, Luke

    Quantum Monte Carlo (QMC) calculations of giant molecules, surface and defect properties of solids have recently become feasible due to drastically expanding computational resources. However, with the most computationally efficient basis set, B-splines, these calculations are severely restricted by the memory capacity of compute nodes. The B-spline coefficients are shared on a node but not distributed among nodes, to ensure fast evaluation. A hybrid representation which incorporates atomic orbitals near the ions and B-spline ones in the interstitial regions offers a more accurate and less memory-demanding description of the orbitals, because they are naturally more atomic-like near ions and much smoother in between, thus allowing coarser B-spline grids. We will demonstrate the advantage of the hybrid representation over pure B-spline and Gaussian basis sets and also show significant speed-ups, for example in computing the non-local pseudopotentials with our new scheme. Moreover, we discuss a new algorithm for atomic orbital initialization which previously required an extra workflow step taking a few days. With this work, the highly efficient hybrid representation paves the way to simulating large, even inhomogeneous, systems using QMC. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Computational Materials Sciences Program.

  7. Applying "intelligent" materials for materials education: The Labless Lab™

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrade, J.D.; Scheer, R.

    1994-12-31

    A very large number of science and engineering courses taught in colleges and universities today do not involve laboratories. Although good instructors incorporate class demonstrations, hands-on homework, and various teaching aids, including computer simulations, the fact is that students in such courses often accept key concepts and experimental results without discovering them for themselves. The only partial solution to this problem has been increasing use of class demonstrations and computer simulations. The authors feel strongly that many complex concepts can be observed and assimilated through experimentation with properly designed materials. They propose the development of materials and specimens designed specifically for educational purposes. Intelligent and communicative materials are ideal for this purpose. Specimens which respond in an observable fashion to new environments and situations provided by the student/experimenter provide a far more effective materials science and engineering experience than readouts and data generated by complex and expensive machines, particularly in an introductory course. Modern materials can be designed to literally communicate with the observer. The authors embarked on a project to develop a series of Labless Labs™ utilizing various degrees and levels of intelligence in materials. It is expected that such Labless Labs™ would be complementary to textbooks and computer simulations and would be used to provide a reality for students in courses and other learning situations where access to a laboratory is non-existent or limited.

  8. Promoting Interests in Atmospheric Science at a Liberal Arts Institution

    NASA Astrophysics Data System (ADS)

    Roussev, S.; Sherengos, P. M.; Limpasuvan, V.; Xue, M.

    2007-12-01

    Coastal Carolina University (CCU) students in Computer Science participated in a project to set up an operational weather forecast for the local community. The project involved the construction of two computing clusters and the automation of daily forecasting. Funded by NSF-MRI, two high-performance clusters were successfully established to run the University of Oklahoma's Advanced Regional Prediction System (ARPS). Daily weather predictions are made over South Carolina and North Carolina at 3-km horizontal resolution (roughly 1.9 miles) using initial and boundary condition data provided by UNIDATA. At this high resolution, the model is cloud-resolving, thus providing a detailed picture of heavy thunderstorms and precipitation. Forecast results are displayed on CCU's website (https://marc.coastal.edu/HPC) to complement observations at the National Weather Service in Wilmington, NC. Present efforts include providing forecasts at 1-km resolution (or finer), comparisons with other models such as the Weather Research and Forecasting (WRF) model, and the examination of local phenomena (such as waterspouts and tornadoes). Through these activities the students learn about shell scripting, cluster operating systems, and web design. More importantly, students are introduced to Atmospheric Science, the processes involved in making weather forecasts, and the interpretation of their forecasts. Simulations generated by the forecasts will be integrated into the content of CCU courses such as Fluid Dynamics, Atmospheric Sciences, Atmospheric Physics, and Remote Sensing. Operated jointly between the departments of Applied Physics and Computer Science, the clusters are expected to be used by CCU faculty and students for future research and inquiry-based projects in Computer Science, Applied Physics, and Marine Science.

  9. The Importance of Simulation Workflow and Data Management in the Accelerated Climate Modeling for Energy Project

    NASA Astrophysics Data System (ADS)

    Bader, D. C.

    2015-12-01

    The Accelerated Climate Modeling for Energy (ACME) Project is concluding its first year. Supported by the Office of Science in the U.S. Department of Energy (DOE), its vision is to be "an ongoing, state-of-the-science Earth system modeling, simulation, and prediction project that optimizes the use of DOE laboratory resources to meet the science needs of the nation and the mission needs of DOE." Included in the "laboratory resources" is a large investment in computational, network and information technologies that will be utilized to both build better and more accurate climate models and broadly disseminate the data they generate. Current model diagnostic analysis and data dissemination technologies will not scale to the size of the simulations and the complexity of the models envisioned by ACME and other top-tier international modeling centers. In this talk, the ACME Workflow component's plans to meet these future needs will be described, and early implementation examples will be highlighted.

  10. The Integrated Plasma Simulator: A Flexible Python Framework for Coupled Multiphysics Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, Samantha S; Elwasif, Wael R; Bernholdt, David E

    2011-11-01

    High-fidelity coupled multiphysics simulations are an increasingly important aspect of computational science. In many domains, however, there has been very limited experience with simulations of this sort; therefore, research in coupled multiphysics often requires computational frameworks with significant flexibility to respond to the changing directions of the physics and mathematics. This paper presents the Integrated Plasma Simulator (IPS), a framework designed for loosely coupled simulations of fusion plasmas. The IPS provides users with a simple component architecture into which a wide range of existing plasma physics codes can be inserted as components. Simulations can take advantage of multiple levels of parallelism supported in the IPS, and can be controlled by a high-level ``driver'' component, or by other coordination mechanisms, such as an asynchronous event service. We describe the requirements and design of the framework, and how they were implemented in the Python language. We also illustrate the flexibility of the framework by providing examples of different types of simulations that utilize various features of the IPS.
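
    The component architecture and high-level driver described above can be pictured with a short sketch. The class and method names below are illustrative stand-ins, not the actual IPS API; they only mimic the pattern of a driver sequencing loosely coupled physics components through a shared time loop.

```python
class Component:
    """Minimal interface a physics component exposes to the driver (illustrative)."""
    def init(self, t):  pass
    def step(self, t):  raise NotImplementedError
    def finalize(self): pass

class HeatingSource(Component):
    def step(self, t):
        print(f"[heating]   advancing sources to t = {t:.2f}")

class TransportSolver(Component):
    def step(self, t):
        print(f"[transport] advancing profiles to t = {t:.2f}")

class Driver:
    """High-level driver that sequences the components over a shared time loop,
    mimicking loose coupling; a real framework would add data exchange, events,
    and parallel resource management."""
    def __init__(self, components):
        self.components = components

    def run(self, t_end, dt):
        t = 0.0
        for c in self.components:
            c.init(t)
        while t < t_end:
            t += dt
            for c in self.components:   # each component advances in turn
                c.step(t)
        for c in self.components:
            c.finalize()

Driver([HeatingSource(), TransportSolver()]).run(t_end=0.3, dt=0.1)
```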

  11. A synthetic design environment for ship design

    NASA Technical Reports Server (NTRS)

    Chipman, Richard R.

    1995-01-01

    Rapid advances in computer science and information system technology have made possible the creation of synthetic design environments (SDE) which use virtual prototypes to increase the efficiency and agility of the design process. This next generation of computer-based design tools will rely heavily on simulation and advanced visualization techniques to enable integrated product and process teams to concurrently conceptualize, design, and test a product and its fabrication processes. This paper summarizes a successful demonstration of the feasibility of using a simulation-based design environment in the shipbuilding industry. As computer science and information science technologies have evolved, there have been many attempts to apply and integrate the new capabilities into systems for the improvement of the process of design. We see the benefits of those efforts in the abundance of highly reliable, technologically complex products and services in the modern marketplace. Furthermore, the computer-based technologies have been so cost effective that the improvements embodied in modern products have been accompanied by lowered costs. Today the state-of-the-art in computerized design has advanced so dramatically that the focus is no longer on merely improving design methodology; rather the goal is to revolutionize the entire process by which complex products are conceived, designed, fabricated, tested, deployed, operated, maintained, refurbished and eventually decommissioned. By concurrently addressing all life-cycle issues, the basic decision making process within an enterprise will be improved dramatically, leading to new levels of quality, innovation, efficiency, and customer responsiveness. By integrating functions and people within an enterprise, such systems will change the fundamental way American industries are organized, creating companies that are more competitive, creative, and productive.

  12. Electrostatic solvation free energies of charged hard spheres using molecular dynamics with density functional theory interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duignan, Timothy T.; Baer, Marcel D.; Schenter, Gregory K.

    Determining the solvation free energies of single ions in water is one of the most fundamental problems in physical chemistry and yet many unresolved questions remain. In particular, the ability to decompose the solvation free energy into simple and intuitive contributions will have important implications for coarse grained models of electrolyte solution. Here, we provide rigorous definitions of the various types of single ion solvation free energies based on different simulation protocols. We calculate solvation free energies of charged hard spheres using density functional theory interaction potentials with molecular dynamics simulation (DFT-MD) and isolate the effects of charge and cavitation, comparing to the Born (linear response) model. We show that using uncorrected Ewald summation leads to highly unphysical values for the solvation free energy and that charging free energies for cations are approximately linear as a function of charge but that there is a small non-linearity for small anions. The charge hydration asymmetry (CHA) for hard spheres, determined with quantum mechanics, is much larger than for the analogous real ions. This suggests that real ions, particularly anions, are significantly more complex than simple charged hard spheres, a commonly employed representation. We would like to thank Thomas Beck, Shawn Kathmann, Richard Remsing and John Weeks for helpful discussions. Computing resources were generously allocated by PNNL's Institutional Computing program. This research also used resources of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. TTD, GKS, and CJM were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. MDB was supported by MS3 (Materials Synthesis and Simulation Across Scales) Initiative, a Laboratory Directed Research and Development Program at Pacific Northwest National Laboratory (PNNL). PNNL is a multi-program national laboratory operated by Battelle for the U.S. Department of Energy.

  13. The effects of computer-based dynamic visualization simulations on student learning in high school science

    NASA Astrophysics Data System (ADS)

    Moodley, Sadha

    The purpose of this study was to determine whether the use of dynamic computer-based visualizations of the classical model of particle behavior helps to improve student understanding, performance, and interest in science when used by teachers as visual presentations to complement their traditional methods of teaching. The software, Virtual Molecular Dynamics Laboratory (VMDL), was developed at the Center for Polymer Studies at Boston University through funding from the National Science Foundation. The design of the study included five pairs of classes in four different schools in New England from the inner city and from advantaged suburbs. The study employed a treatment-control group design for testing the impact of several VMDL simulations on student learning in several content areas from traditional chemistry and physical science courses. The study employed a mixed qualitative and quantitative design. The quantitative part involved administering the Group Assessment of Logical Thinking (GALT) as well as post-tests that were topic specific. An Analysis of Covariance (ANCOVA) was conducted on the test scores with the GALT scores serving as a covariate. Results of the ANCOVA showed that students' understanding and performance were better in classes where teachers used the computer-based dynamic visualizations to complement their traditional teaching. GALT scores were significantly different among schools but very similar within schools. They were significant in adjusting post-test scores for pre-treatment differences for only two of the schools. The treatment groups outscored the control groups in all five comparisons. The mean differences reached statistical significance at the p < .01 level in only four of the comparisons. The qualitative part of the study involved classroom observations and student interviews. Analysis of classroom observations revealed a shift in classroom dynamics to more learner-centeredness with greater engagement by students, especially in classes that tended to have little student participation without the simulations. Analysis of the student interviews indicated that the dynamic visualizations made learning more enjoyable, helped with remembering, and enhanced students' abilities to make connections between nanoscopic and macroscopic science.

  14. Analysis of a Proposed Material Handling System Using a Computer Simulation Model.

    DTIC Science & Technology

    1981-06-01

    …the proposed MMHS were identified to assist the managers of the system in implementation and future planning. Thesis by Darwin D. Harp, BSIE, GS-11, submitted in partial fulfillment of the requirements for the degree of Master of Science in Logistics Management, 17 June 1981. Approved for public release; distribution unlimited.

  15. Computational Infrastructure for Geodynamics (CIG)

    NASA Astrophysics Data System (ADS)

    Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.

    2004-12-01

    Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts and although this approach has proven successful, its strength for solving problems of interest is now starting to show its limitations as we try to share codes and algorithms or when we want to recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to leverage and develop long-term strategic partnerships with open source development efforts within the larger thrusts of scientific computing and geoinformatics. These strategic partnerships are essential as the frontier has moved into multi-scale and multi-physics problems in which many investigators now want to use simulation software for data interpretation, data assimilation, and hypothesis testing.

  16. Modeling Subsurface Reactive Flows Using Leadership-Class Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Richard T; Hammond, Glenn; Lichtner, Peter

    2009-01-01

    We describe our experiences running PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.

  17. A Novel Approach to Develop the Lower Order Model of Multi-Input Multi-Output System

    NASA Astrophysics Data System (ADS)

    Rajalakshmy, P.; Dharmalingam, S.; Jayakumar, J.

    2017-10-01

    A mathematical model is a virtual entity that uses mathematical language to describe the behavior of a system. Mathematical models are used particularly in the natural sciences and engineering disciplines such as physics, biology, and electrical engineering, as well as in the social sciences such as economics, sociology, and political science. Physicists, engineers, computer scientists, and economists use mathematical models most extensively. With the advent of high-performance processors and advanced mathematical computations, it is possible to develop high-performing simulators for complicated Multi-Input Multi-Output (MIMO) systems such as quadruple-tank systems, aircraft, and boilers. This paper presents the development of the mathematical model of a 500 MW utility boiler, which is a highly complex system. A synergistic combination of operational experience, system identification, and lower-order modeling philosophy has been effectively used to develop a simplified but accurate model of the circulation system of a utility boiler, which is a MIMO system. The results obtained are found to be in good agreement with the physics of the process and with the results obtained through the design procedure. The model obtained can be directly used for control system studies and to realize hardware simulators for boiler testing and operator training.
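
    System identification of the kind combined here with lower-order modeling can be illustrated by fitting a simple discrete-time model to input/output data. The sketch below fits a first-order ARX model by least squares; the plant, noise level, and coefficients are invented for illustration and are unrelated to the paper's 500 MW boiler model.

```python
import numpy as np

# Least-squares fit of a first-order ARX model y[k] = a*y[k-1] + b*u[k-1],
# a generic lower-order identification step on synthetic data.

rng = np.random.default_rng(0)
u = rng.standard_normal(200)                      # hypothetical input record
y = np.zeros(200)
for k in range(1, 200):                           # "true" plant used to make the data
    y[k] = 0.9 * y[k - 1] + 0.2 * u[k - 1] + 0.01 * rng.standard_normal()

Phi = np.column_stack([y[:-1], u[:-1]])           # regressor matrix
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
a_hat, b_hat = theta
print(f"identified model: y[k] = {a_hat:.3f} y[k-1] + {b_hat:.3f} u[k-1]")
```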

  18. GRAPE project

    NASA Astrophysics Data System (ADS)

    Makino, Junichiro

    2002-12-01

    We overview our GRAvity PipE (GRAPE) project to develop special-purpose computers for astrophysical N-body simulations. The basic idea of GRAPE is to attach a custom-built computer, dedicated to the calculation of gravitational interactions between particles, to a general-purpose programmable computer. By this hybrid architecture, we can achieve both a wide range of applications and very high peak performance. Our newest machine, GRAPE-6, achieved a peak speed of 32 Tflops and a sustained performance of 11.55 Tflops, for a total budget of about 4 million USD. We also discuss the relative advantages of special-purpose and general-purpose computers and the future of high-performance computing for science and technology.
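
    The interaction that GRAPE hardware pipelines is the direct O(N^2) sum of softened pairwise gravitational forces. The NumPy sketch below evaluates that kernel on a host CPU purely to illustrate the arithmetic; the softening length, units, and the two-particle example are assumed values, not GRAPE specifics.

```python
import numpy as np

def accelerations(pos, mass, softening=1e-3, G=1.0):
    """Direct-summation gravitational accelerations (the O(N^2) kernel that
    special-purpose N-body hardware accelerates), in arbitrary units."""
    dx = pos[None, :, :] - pos[:, None, :]          # pairwise separation vectors
    r2 = np.sum(dx**2, axis=-1) + softening**2      # softened squared distances
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                   # no self-interaction
    return G * np.einsum("ij,j,ijk->ik", inv_r3, mass, dx)

# Example: two unit masses one length unit apart attract each other along x.
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
print(accelerations(pos, np.array([1.0, 1.0])))
```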

  19. Electronic Circuit Analysis Language (ECAL)

    NASA Astrophysics Data System (ADS)

    Chenghang, C.

    1983-03-01

    Computer-aided design is an important development in computer applications and an important component of computer science. A special language for electronic circuit analysis is the foundation of computer-aided design and computer-aided circuit analysis (abbreviated as CACD and CACA) of simulated circuits. The electronic circuit analysis language (ECAL) is a comparatively simple and easy-to-use special-purpose circuit analysis language that uses FORTRAN to carry out interpretive execution. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.
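
    The dc analysis such a language performs ultimately reduces to solving the nodal equations G v = i for the linearized circuit. The sketch below solves a two-node resistive divider with NumPy; it illustrates the computation only, and the ECAL input syntax and the component values are not taken from the paper.

```python
import numpy as np

def dc_solve(conductance, current_injections):
    """Solve G v = i for the node voltages of a linear resistive network."""
    return np.linalg.solve(conductance, current_injections)

# Example: a 1 mA source injected into node 1, with R1 = 1 kOhm from node 1
# to node 2 and R2 = 2 kOhm from node 2 to ground (values are illustrative).
g1, g2 = 1 / 1000.0, 1 / 2000.0
G = np.array([[ g1, -g1     ],
              [-g1,  g1 + g2]])
i = np.array([1.0e-3, 0.0])
print(dc_solve(G, i))      # node voltages in volts: [3.0, 2.0]
```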

  20. Quantum Simulation of Helium Hydride Cation in a Solid-State Spin Register.

    PubMed

    Wang, Ya; Dolde, Florian; Biamonte, Jacob; Babbush, Ryan; Bergholm, Ville; Yang, Sen; Jakobi, Ingmar; Neumann, Philipp; Aspuru-Guzik, Alán; Whitfield, James D; Wrachtrup, Jörg

    2015-08-25

    Ab initio computation of molecular properties is one of the most promising applications of quantum computing. While this problem is widely believed to be intractable for classical computers, efficient quantum algorithms exist which have the potential to vastly accelerate research throughput in fields ranging from materials science to drug discovery. Using a solid-state quantum register realized in a nitrogen-vacancy (NV) defect in diamond, we compute the bond dissociation curve of the minimal basis helium hydride cation, HeH(+). Moreover, we report an energy uncertainty (given our model basis) of the order of 10^-14 hartree, which is 10 orders of magnitude below the desired chemical precision. As NV centers in diamond provide a robust and straightforward platform for quantum information processing, our work provides an important step toward a fully scalable solid-state implementation of a quantum chemistry simulator.

  1. An immersed boundary method for modeling dirty geometry data

    NASA Astrophysics Data System (ADS)

    Onishi, Keiji; Tsubokura, Makoto

    2017-11-01

    We present a robust, fast, and low-preparation-cost immersed boundary method (IBM) for simulating incompressible high-Re flow around highly complex geometries. The method is achieved by dispersing the momentum through an axial linear projection and by an approximate-domain assumption that satisfies mass conservation around the wall-containing cells. This methodology has been verified against analytical theory and wind tunnel experimental data. Next, we simulate the problem of flow around a rotating object and demonstrate the applicability of this methodology to moving-geometry problems. This methodology shows promise as a way to obtain quick solutions on next-generation large-scale supercomputers. This research was supported by MEXT as ``Priority Issue on Post-K computer'' (Development of innovative design and production processes) and used computational resources of the K computer provided by the RIKEN Advanced Institute for Computational Science.

  2. Preface: SciDAC 2005

    NASA Astrophysics Data System (ADS)

    Mezzacappa, Anthony

    2005-01-01

    On 26-30 June 2005 at the Grand Hyatt on Union Square in San Francisco several hundred computational scientists from around the world came together for what can certainly be described as a celebration of computational science. Scientists from the SciDAC Program and scientists from other agencies and nations were joined by applied mathematicians and computer scientists to highlight the many successes in the past year where computation has led to scientific discovery in a variety of fields: lattice quantum chromodynamics, accelerator modeling, chemistry, biology, materials science, Earth and climate science, astrophysics, and combustion and fusion energy science. Also highlighted were the advances in numerical methods and computer science, and the multidisciplinary collaboration cutting across science, mathematics, and computer science that enabled these discoveries. The SciDAC Program was conceived and funded by the US Department of Energy Office of Science. It is the Office of Science's premier computational science program founded on what is arguably the perfect formula: the priority and focus is science and scientific discovery, with the understanding that the full arsenal of `enabling technologies' in applied mathematics and computer science must be brought to bear if we are to have any hope of attacking and ultimately solving today's computational Grand Challenge problems. The SciDAC Program has been in existence for four years, and many of the computational scientists funded by this program will tell you that the program has given them the hope of addressing their scientific problems in full realism for the very first time. Many of these scientists will also tell you that SciDAC has also fundamentally changed the way they do computational science. We begin this volume with one of DOE's great traditions, and core missions: energy research. As we will see, computation has been seminal to the critical advances that have been made in this arena. Of course, to understand our world, whether it is to understand its very nature or to understand it so as to control it for practical application, will require explorations on all of its scales. Computational science has been no less an important tool in this arena than it has been in the arena of energy research. From explorations of quantum chromodynamics, the fundamental theory that describes how quarks make up the protons and neutrons of which we are composed, to explorations of the complex biomolecules that are the building blocks of life, to explorations of some of the most violent phenomena in our universe and of the Universe itself, computation has provided not only significant insight, but often the only means by which we have been able to explore these complex, multicomponent systems and by which we have been able to achieve scientific discovery and understanding. While our ultimate target remains scientific discovery, it certainly can be said that at a fundamental level the world is mathematical. Equations ultimately govern the evolution of the systems of interest to us, be they physical, chemical, or biological systems. The development and choice of discretizations of these underlying equations is often a critical deciding factor in whether or not one is able to model such systems stably, faithfully, and practically, and in turn, the algorithms to solve the resultant discrete equations are the complementary, critical ingredient in the recipe to model the natural world. 
The use of parallel computing platforms, especially at the TeraScale, and the trend toward even larger numbers of processors, continue to present significant challenges in the development and implementation of these algorithms. Computational scientists often speak of their `workflows'. A workflow, as the name suggests, is the sum total of all complex and interlocking tasks, from simulation set up, execution, and I/O, to visualization and scientific discovery, through which the advancement in our understanding of the natural world is realized. For the computational scientist, enabling such workflows presents myriad, significant challenges, and it is computer scientists that are called upon at such times to address these challenges. Simulations are currently generating data at the staggering rate of tens of TeraBytes per simulation, over the course of days. In the next few years, these data generation rates are expected to climb exponentially to hundreds of TeraBytes per simulation, performed over the course of months. The output, management, movement, analysis, and visualization of these data will be our key to unlocking the scientific discoveries buried within the data. And there is no hope of generating such data to begin with, or of scientific discovery, without stable computing platforms and a sufficiently high and sustained performance of scientific applications codes on them. Thus, scientific discovery in the realm of computational science at the TeraScale and beyond will occur at the intersection of science, applied mathematics, and computer science. The SciDAC Program was constructed to mirror this reality, and the pages that follow are a testament to the efficacy of such an approach. We would like to acknowledge the individuals on whose talents and efforts the success of SciDAC 2005 was based. Special thanks go to Betsy Riley for her work on the SciDAC 2005 Web site and meeting agenda, for lining up our corporate sponsors, for coordinating all media communications, and for her efforts in processing the proceedings contributions, to Sherry Hempfling for coordinating the overall SciDAC 2005 meeting planning, for handling a significant share of its associated communications, and for coordinating with the ORNL Conference Center and Grand Hyatt, to Angela Harris for producing many of the documents and records on which our meeting planning was based and for her efforts in coordinating with ORNL Graphics Services, to Angie Beach of the ORNL Conference Center for her efforts in procurement and setting up and executing the contracts with the hotel, and to John Bui and John Smith for their superb wireless networking and A/V set up and support. We are grateful for the relentless efforts of all of these individuals, their remarkable talents, and for the joy of working with them during this past year. They were the cornerstones of SciDAC 2005. Thanks also go to Kymba A'Hearn and Patty Boyd for on-site registration, Brittany Hagen for administrative support, Bruce Johnston for netcast support, Tim Jones for help with the proceedings and Web site, Sherry Lamb for housing and registration, Cindy Lathum for Web site design, Carolyn Peters for on-site registration, and Dami Rich for graphic design. And we would like to express our appreciation to the Oak Ridge National Laboratory, especially Jeff Nichols, the Argonne National Laboratory, the Lawrence Berkeley National Laboratory, and to our corporate sponsors, Cray, IBM, Intel, and SGI, for their support.
We would like to extend special thanks also to our plenary speakers, technical speakers, poster presenters, and panelists for all of their efforts on behalf of SciDAC 2005 and for their remarkable achievements and contributions. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas and Margaret Smith of Institute of Physics Publishing, who worked tirelessly in order to provide us with this finished volume within two months, which is nothing short of miraculous. Finally, we wish to express our heartfelt thanks to Michael Strayer, SciDAC Director, whose vision it was to focus SciDAC 2005 on scientific discovery, around which all of the excitement we experienced revolved, and to our DOE SciDAC program managers, especially Fred Johnson, for their support, input, and help throughout.

  3. Semantic Interoperability for Computational Mineralogy: Experiences of the eMinerals Consortium

    NASA Astrophysics Data System (ADS)

    Walker, A. M.; White, T. O.; Dove, M. T.; Bruin, R. P.; Couch, P. A.; Tyer, R. P.

    2006-12-01

    The use of atomic scale computer simulation of minerals to obtain information for geophysics and environmental science has grown enormously over the past couple of decades. It is now routine to probe mineral behavior in the Earth's deep interior and in the surface environment by borrowing methods and simulation codes from computational chemistry and physics. It is becoming increasingly important to use methods embodied in more than one of these codes to solve any single scientific problem. However, scientific codes are rarely designed for easy interoperability and data exchange; data formats are often code-specific, poorly documented and fragile, liable to frequent change between software versions, and even compiler versions. This means that the scientist's simple desire to use the methodological approaches offered by multiple codes is frustrated, and even the sharing of data between collaborators becomes fraught with difficulties. The eMinerals consortium was formed in the early stages of the UK eScience program with the aim of developing the tools needed to apply atomic scale simulation to environmental problems in a grid-enabled world, and to harness the computational power offered by grid technologies to address some outstanding mineralogical problems. One example of the kind of problem we can tackle is the origin of the compressibility anomaly in silica glass. By passing data directly between simulation and analysis tools we were able to probe this effect in more detail than has previously been possible and have shown how the anomaly is related to the details of the amorphous structure. In order to approach this kind of problem we have constructed a mini-grid, a small-scale and extensible combined compute- and data-grid that allows the execution of many calculations in parallel, and the transparent storage of semantically-rich marked-up result data. Importantly, we automatically capture multiple kinds of metadata and key results from each calculation. We believe that the lessons learned and tools developed will be useful in many areas of science beyond computational mineralogy. Key tools that will be described include: a pure Fortran XML library (FoX) that presents XPath, SAX and DOM interfaces as well as permitting the easy production of valid XML from legacy Fortran programs; a job submission framework that automatically schedules calculations to remote grid resources and handles data staging and metadata capture; and a tool (AgentX) that maps concepts from an ontology onto locations in documents of various formats that we use to enable data exchange.
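
    The semantically rich, marked-up result data described above can be pictured with a few lines of standard-library Python. This is only a toy illustration of the idea of writing key results as XML that another tool can later query by concept name; it does not use FoX or AgentX, and the element names and numerical values are invented.

```python
import xml.etree.ElementTree as ET

# Record a hypothetical key result plus metadata as XML so another tool can
# read it without knowing the producing code's native output format.
result = ET.Element("simulation", attrib={"code": "exampleCode", "version": "1.0"})
param = ET.SubElement(result, "parameter", attrib={"name": "pressure", "units": "GPa"})
param.text = "5.0"
prop = ET.SubElement(result, "property", attrib={"name": "bulkModulus", "units": "GPa"})
prop.text = "36.2"

xml_text = ET.tostring(result, encoding="unicode")
print(xml_text)

# A collaborator's analysis tool can then extract the value by concept name.
root = ET.fromstring(xml_text)
print(root.find("./property[@name='bulkModulus']").text)
```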

  4. Chaos: A Topic for Interdisciplinary Education in Physics

    ERIC Educational Resources Information Center

    Bae, Saebyok

    2009-01-01

    Since society and science need interdisciplinary works, the interesting topic of chaos is chosen for interdisciplinary education in physics. The educational programme contains various university-level activities such as computer simulations, chaos experiment and team projects besides ordinary teaching. According to the participants, the programme…

  5. Performance Analysis, Design Considerations, and Applications of Extreme-Scale In Situ Infrastructures

    DOE PAGES

    Ayachit, Utkarsh; Bauer, Andrew; Duque, Earl P. N.; ...

    2016-11-01

    A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. Our paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: Scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.
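
    The in situ idea examined above is simply that analysis routines run on data while it is still in memory, every timestep, instead of writing full fields to disk for post hoc processing. The toy loop below illustrates that pattern; the "solver" and the statistics are placeholders, not the miniapplication or science codes from the study.

```python
import numpy as np

def in_situ_analysis(step, field):
    """Reduce the full field to a few scalars while it is resident in memory."""
    return {"step": step, "mean": float(field.mean()), "max": float(field.max())}

rng = np.random.default_rng(1)
field = rng.random((128, 128))
summaries = []
for step in range(5):
    # Stand-in "simulation": a simple neighbour-averaging update of the field.
    field = 0.25 * (np.roll(field, 1, 0) + np.roll(field, -1, 0)
                    + np.roll(field, 1, 1) + np.roll(field, -1, 1))
    summaries.append(in_situ_analysis(step, field))   # no I/O of the full field

print(summaries[-1])
```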

  6. Feature Statistics Modulate the Activation of Meaning During Spoken Word Processing.

    PubMed

    Devereux, Barry J; Taylor, Kirsten I; Randall, Billi; Geertzen, Jeroen; Tyler, Lorraine K

    2016-03-01

    Understanding spoken words involves a rapid mapping from speech to conceptual representations. One distributed feature-based conceptual account assumes that the statistical characteristics of concepts' features--the number of concepts they occur in (distinctiveness/sharedness) and likelihood of co-occurrence (correlational strength)--determine conceptual activation. To test these claims, we investigated the role of distinctiveness/sharedness and correlational strength in speech-to-meaning mapping, using a lexical decision task and computational simulations. Responses were faster for concepts with higher sharedness, suggesting that shared features are facilitatory in tasks like lexical decision that require access to them. Correlational strength facilitated responses for slower participants, suggesting a time-sensitive co-occurrence-driven settling mechanism. The computational simulation showed similar effects, with early effects of shared features and later effects of correlational strength. These results support a general-to-specific account of conceptual processing, whereby early activation of shared features is followed by the gradual emergence of a specific target representation. Copyright © 2015 The Authors. Cognitive Science published by Cognitive Science Society, Inc.

  7. Computational modeling of pitching cylinder-type ocean wave energy converters using 3D MPI-parallel simulations

    NASA Astrophysics Data System (ADS)

    Freniere, Cole; Pathak, Ashish; Raessi, Mehdi

    2016-11-01

    Ocean Wave Energy Converters (WECs) are devices that convert energy from ocean waves into electricity. To aid in the design of WECs, an advanced computational framework has been developed which has advantages over conventional methods. The computational framework simulates the performance of WECs in a virtual wave tank by solving the full Navier-Stokes equations in 3D, capturing the fluid-structure interaction, nonlinear and viscous effects. In this work, we present simulations of the performance of pitching cylinder-type WECs and compare against experimental data. WECs are simulated at both model and full scales. The results are used to determine the role of the Keulegan-Carpenter (KC) number. The KC number is representative of viscous drag behavior on a bluff body in an oscillating flow, and is considered an important indicator of the dynamics of a WEC. Studying the effects of the KC number is important for determining the validity of the Froude scaling and the inviscid potential flow theory, which are heavily relied on in the conventional approaches to modeling WECs. Support from the National Science Foundation is gratefully acknowledged.
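
    The Keulegan-Carpenter number discussed above is defined from the velocity amplitude U_m, the oscillation period T, and the body diameter D as KC = U_m T / D. A one-line helper is shown below; the input values are purely illustrative and are not taken from the paper's experiments.

```python
def keulegan_carpenter(u_m, period, diameter):
    """KC = U_m * T / D for a bluff body in an oscillating flow."""
    return u_m * period / diameter

print(keulegan_carpenter(u_m=0.5, period=8.0, diameter=0.3))  # ~13.3 for these assumed values
```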

  8. Polymerization and Structure of Bio-Based Plastics: A Computer Simulation

    NASA Astrophysics Data System (ADS)

    Khot, Shrikant N.; Wool, Richard P.

    2001-03-01

    We recently examined several hundred chemical pathways to convert chemically functionalized plant oil triglycerides, monoglycerides and reactive diluents into high performance plastics with a broad range of properties (US Patent No. 6,121,398). The resulting polymers had linear, branched, light- and highly-crosslinked chain architectures and could be used as pressure sensitive adhesives, elastomers and high performance rigid thermoset composite resins. To optimize the molecular design and minimize the number of chemical trials in this system with excess degrees of freedom, we developed a computer simulation of the free radical polymerization process. The triglyceride structure, degree of chemical substitution, mole fractions, fatty acid distribution function, and reaction kinetic parameters were used as initial inputs on a 3d lattice simulation. The evolution of the network fractal structure was computed and used to measure crosslink density, dangling ends, degree of reaction and defects in the lattice. The molecular connectivity was used to determine strength via a vector percolation model of fracture. The simulation permitted the optimal design of new bio-based materials with respect to monomer selection, cure reaction conditions and desired properties. Supported by the National Science Foundation
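
    The 3D lattice simulation of network formation described above can be caricatured by randomly bonding neighbouring occupied lattice sites and tallying the degree of reaction. The sketch below does only that; the lattice size, fill fraction, and reaction probability are invented, and the triglyceride chemistry, kinetics, and vector-percolation fracture analysis of the actual study are omitted.

```python
import numpy as np

# Toy 3D lattice: occupy sites with "monomers", then let a fraction of
# nearest-neighbour pairs form bonds, and report simple network statistics.
rng = np.random.default_rng(42)
L, fill, p_react = 16, 0.4, 0.6
occupied = rng.random((L, L, L)) < fill            # monomer-bearing sites

bonds = 0
possible = 0
for axis in range(3):                              # neighbours along +x, +y, +z
    neighbour = np.roll(occupied, -1, axis=axis)
    pair = occupied & neighbour                    # candidate bonding pairs
    possible += int(pair.sum())
    bonds += int((pair & (rng.random(pair.shape) < p_react)).sum())

print(f"degree of reaction: {bonds / possible:.2f}")
print(f"bonds per occupied site: {bonds / occupied.sum():.2f}")
```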

  9. High Energy Physics Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and High Energy Physics, June 10-12, 2015, Bethesda, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; Roser, Robert; Gerber, Richard

    The U.S. Department of Energy (DOE) Office of Science (SC) Offices of High Energy Physics (HEP) and Advanced Scientific Computing Research (ASCR) convened a programmatic Exascale Requirements Review on June 10–12, 2015, in Bethesda, Maryland. This report summarizes the findings, results, and recommendations derived from that meeting. The high-level findings and observations are as follows. Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude — and in some cases greater — than that available currently. The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. Data rates and volumes from experimental facilities are also straining the current HEP infrastructure in its ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. A close integration of high-performance computing (HPC) simulation and data analysis will greatly aid in interpreting the results of HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. Long-range planning between HEP and ASCR will be required to meet HEP’s research needs. To best use ASCR HPC resources, the experimental HEP program needs (1) an established, long-term plan for access to ASCR computational and data resources, (2) the ability to map workflows to HPC resources, (3) the ability for ASCR facilities to accommodate workflows run by collaborations potentially comprising thousands of individual members, (4) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, (5) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.

  10. A Test of the Validity of Inviscid Wall-Modeled LES

    NASA Astrophysics Data System (ADS)

    Redman, Andrew; Craft, Kyle; Aikens, Kurt

    2015-11-01

    Computational expense is one of the main deterrents to more widespread use of large eddy simulations (LES). As such, it is important to reduce computational costs whenever possible. In this vein, it may be reasonable to assume that high Reynolds number flows with turbulent boundary layers are inviscid when using a wall model. This assumption relies on the grid being too coarse to resolve either the viscous length scales in the outer flow or those near walls. We are not aware of other studies that have suggested or examined the validity of this approach. The inviscid wall-modeled LES assumption is tested here for supersonic flow over a flat plate on three different grids. Inviscid and viscous results are compared to those of another wall-modeled LES as well as experimental data - the results appear promising. Furthermore, the inviscid assumption reduces simulation costs by about 25% and 39% for supersonic and subsonic flows, respectively, with the current LES application. Recommendations are presented as are future areas of research. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1053575. Computational resources on TACC Stampede were provided under XSEDE allocation ENG150001.

  11. Density functional theory in the solid state

    PubMed Central

    Hasnip, Philip J.; Refson, Keith; Probert, Matt I. J.; Yates, Jonathan R.; Clark, Stewart J.; Pickard, Chris J.

    2014-01-01

    Density functional theory (DFT) has been used in many fields of the physical sciences, but none so successfully as in the solid state. From its origins in condensed matter physics, it has expanded into materials science, high-pressure physics and mineralogy, solid-state chemistry and more, powering entire computational subdisciplines. Modern DFT simulation codes can calculate a vast range of structural, chemical, optical, spectroscopic, elastic, vibrational and thermodynamic phenomena. The ability to predict structure–property relationships has revolutionized experimental fields, such as vibrational and solid-state NMR spectroscopy, where it is the primary method to analyse and interpret experimental spectra. In semiconductor physics, great progress has been made in the electronic structure of bulk and defect states despite the severe challenges presented by the description of excited states. Studies are no longer restricted to known crystallographic structures. DFT is increasingly used as an exploratory tool for materials discovery and computational experiments, culminating in ex nihilo crystal structure prediction, which addresses the long-standing difficult problem of how to predict crystal structure polymorphs from nothing but a specified chemical composition. We present an overview of the capabilities of solid-state DFT simulations in all of these topics, illustrated with recent examples using the CASTEP computer program. PMID:24516184

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bland, Arthur S Buddy; Hack, James J; Baker, Ann E

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools and resources for next-generation systems.

  13. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  14. Elementary and Advanced Computer Projects for the Physics Classroom and Laboratory

    DTIC Science & Technology

    1992-12-01

    …are SPF/PC, MS Word, n3, Symphony, Mathematics, and FORTRAN. The authors' programs assist data analysis in particular laboratory experiments and make use of the Monte Carlo and other numerical techniques in computer simulation. FORTRAN remains the language of science and engineering in industry and government laboratories (although C is becoming a powerful competitor). RM/FORTRAN (cost $400

  15. Parallel simulations of Grover's algorithm for closest match search in neutron monitor data

    NASA Astrophysics Data System (ADS)

    Kussainov, Arman; White, Yelena

    We are studying the parallel implementations of Grover's closest match search algorithm for neutron monitor data analysis. This includes data formatting, and matching quantum parameters to a conventional structure of a chosen programming language and selected experimental data type. We have employed several workload distribution models based on acquired data and search parameters. As a result of these simulations, we have an understanding of potential problems that may arise during configuration of real quantum computational devices and the way they could run tasks in parallel. The work was supported by the Science Committee of the Ministry of Science and Education of the Republic of Kazakhstan Grant #2532/GF3.
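
    The basic routine behind the closest-match search studied above is Grover's amplitude-amplification iteration. The sketch below is a classical state-vector simulation of standard Grover search for one marked item; how the neutron monitor records are encoded as the marked state, and the workload distribution models of the abstract, are not modelled here, and the qubit count is an assumed example value.

```python
import numpy as np

def grover(n_qubits, marked, iterations):
    """Classically simulate Grover iterations on an n-qubit register."""
    N = 2 ** n_qubits
    amp = np.full(N, 1.0 / np.sqrt(N))            # uniform superposition
    for _ in range(iterations):
        amp[marked] *= -1.0                        # oracle: flip the marked amplitude
        amp = 2.0 * amp.mean() - amp               # diffusion: inversion about the mean
    return amp

n_qubits, marked = 4, 5
best = int(round(np.pi / 4 * np.sqrt(2 ** n_qubits)))   # ~optimal iteration count
amp = grover(n_qubits, marked, best)
print(marked, abs(amp[marked]) ** 2)               # probability of the marked item is close to 1
```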

  16. Application of the Fractions Skill Score for Tracking the Effectiveness of Improvements Made to Weather Research and Forecasting Model Simulations

    DTIC Science & Technology

    2017-11-22

    By John W Raby and Huaqing Cai, Computational and Information Sciences Directorate, ARL.

  17. C3Conflict a Simulation Environment for Studying Teamwork in Command and Control

    DTIC Science & Technology

    2011-06-01

    …increases dramatically when the students can see a replay and discuss their collaboration. Kolb has expressed a generally accepted model of experiential learning (Kolb, 1998). The model can be adapted for research and team training performed with computer-based simulations (Granlund, 2008). The main

  18. The 6th International Conference on Computer Science and Computational Mathematics (ICCSCM 2017)

    NASA Astrophysics Data System (ADS)

    2017-09-01

    The ICCSCM 2017 (The 6th International Conference on Computer Science and Computational Mathematics) has aimed to provide a platform to discuss computer science and mathematics-related issues including Algebraic Geometry, Algebraic Topology, Approximation Theory, Calculus of Variations, Category Theory; Homological Algebra, Coding Theory, Combinatorics, Control Theory, Cryptology, Geometry, Difference and Functional Equations, Discrete Mathematics, Dynamical Systems and Ergodic Theory, Field Theory and Polynomials, Fluid Mechanics and Solid Mechanics, Fourier Analysis, Functional Analysis, Functions of a Complex Variable, Fuzzy Mathematics, Game Theory, General Algebraic Systems, Graph Theory, Group Theory and Generalizations, Image Processing, Signal Processing and Tomography, Information Fusion, Integral Equations, Lattices, Algebraic Structures, Linear and Multilinear Algebra; Matrix Theory, Mathematical Biology and Other Natural Sciences, Mathematical Economics and Financial Mathematics, Mathematical Physics, Measure Theory and Integration, Neutrosophic Mathematics, Number Theory, Numerical Analysis, Operations Research, Optimization, Operator Theory, Ordinary and Partial Differential Equations, Potential Theory, Real Functions, Rings and Algebras, Statistical Mechanics, Structure Of Matter, Topological Groups, Wavelets and Wavelet Transforms, 3G/4G Network Evolutions, Ad-Hoc, Mobile, Wireless Networks and Mobile Computing, Agent Computing & Multi-Agents Systems, All topics related to Image/Signal Processing, Any topics related to Computer Networks, Any topics related to ISO SC-27 and SC-17 standards, Any topics related to PKI (Public Key Infrastructures), Artificial Intelligence (A.I.) & Pattern/Image Recognitions, Authentication/Authorization Issues, Biometric authentication and algorithms, CDMA/GSM Communication Protocols, Combinatorics, Graph Theory, and Analysis of Algorithms, Cryptography and Foundation of Computer Security, Data Base (D.B.) Management & Information Retrievals, Data Mining, Web Image Mining, & Applications, Defining Spectrum Rights and Open Spectrum Solutions, E-Commerce, Ubiquitous, RFID, Applications, Fingerprint/Hand/Biometrics Recognitions and Technologies, Foundations of High-performance Computing, IC-card Security, OTP, and Key Management Issues, IDS/Firewall, Anti-Spam mail, Anti-virus issues, Mobile Computing for E-Commerce, Network Security Applications, Neural Networks and Biomedical Simulations, Quality of Services and Communication Protocols, Quantum Computing, Coding, and Error Controls, Satellite and Optical Communication Systems, Theory of Parallel Processing and Distributed Computing, Virtual Visions, 3-D Object Retrievals, & Virtual Simulations, Wireless Access Security, etc. The success of ICCSCM 2017 is reflected in the papers received from authors around the world from several countries, which allows a highly multinational and multicultural idea and experience exchange. The accepted papers of ICCSCM 2017 are published in this book. Please check http://www.iccscm.com for further news. A conference such as ICCSCM 2017 can only become successful using a team effort, so herewith we want to thank the International Technical Committee and the Reviewers for their efforts in the review process as well as their valuable advice. We are thankful to all those who contributed to the success of ICCSCM 2017. The Secretary

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, Michel; Archer, Bill; Hendrickson, Bruce

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  20. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucas, Robert; Ang, James; Bergman, Keren

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  1. The 3rd International Workshop on Computational Electronics

    NASA Astrophysics Data System (ADS)

    Goodnick, Stephen M.

    1994-09-01

    The Third International Workshop on Computational Electronics (IWCE) was held at the Benson Hotel in downtown Portland, Oregon, on May 18, 19, and 20, 1994. The workshop was devoted to a broad range of topics in computational electronics related to the simulation of electronic transport in semiconductors and semiconductor devices, particularly those which use large computational resources. The workshop was supported by the National Science Foundation (NSF), the Office of Naval Research and the Army Research Office, as well as local support from the Oregon Joint Graduate Schools of Engineering and the Oregon Center for Advanced Technology Education. There were over 100 participants in the Portland workshop, of which more than one quarter represented research groups outside of the United States, from Austria, Canada, France, Germany, Italy, Japan, Switzerland, and the United Kingdom. A total of 81 papers were presented at the workshop: 9 invited talks, 26 oral presentations, and 46 poster presentations. The emphasis of the contributions reflected the interdisciplinary nature of computational electronics, with researchers from the Chemistry, Computer Science, Mathematics, Engineering, and Physics communities participating in the workshop.

  2. Software for pest-management science: computer models and databases from the United States Department of Agriculture-Agricultural Research Service.

    PubMed

    Wauchope, R Don; Ahuja, Lajpat R; Arnold, Jeffrey G; Bingner, Ron; Lowrance, Richard; van Genuchten, Martinus T; Adams, Larry D

    2003-01-01

    We present an overview of USDA Agricultural Research Service (ARS) computer models and databases related to pest-management science, emphasizing current developments in environmental risk assessment and management simulation models. The ARS has a unique national interdisciplinary team of researchers in surface and sub-surface hydrology, soil and plant science, systems analysis and pesticide science, who have networked to develop empirical and mechanistic computer models describing the behavior of pests, pest responses to controls and the environmental impact of pest-control methods. Historically, much of this work has been in support of production agriculture and in support of the conservation programs of our 'action agency' sister, the Natural Resources Conservation Service (formerly the Soil Conservation Service). Because we are a public agency, our software/database products are generally offered without cost, unless they are developed in cooperation with a private-sector cooperator. Because ARS is a basic and applied research organization, with development of new science as our highest priority, these products tend to be offered on an 'as-is' basis with limited user support, except within cooperative R&D relationships with other scientists. However, rapid changes in the technology for information analysis and communication continually challenge our way of doing business.

  3. Why cognitive science needs philosophy and vice versa.

    PubMed

    Thagard, Paul

    2009-04-01

    Contrary to common views that philosophy is extraneous to cognitive science, this paper argues that philosophy has a crucial role to play in cognitive science with respect to generality and normativity. General questions include the nature of theories and explanations, the role of computer simulation in cognitive theorizing, and the relations among the different fields of cognitive science. Normative questions include whether human thinking should be Bayesian, whether decision making should maximize expected utility, and how norms should be established. These kinds of general and normative questions make philosophical reflection an important part of progress in cognitive science. Philosophy operates best, however, not with a priori reasoning or conceptual analysis, but rather with empirically informed reflection on a wide range of findings in cognitive science. Copyright © 2009 Cognitive Science Society, Inc.

  4. Study of the Imaging Capabilities of SPIRIT/SPECS Concept Interferometers

    NASA Technical Reports Server (NTRS)

    Allen, Ronald J.

    2002-01-01

    Several new space science mission concepts under development at NASA-GSFC for astronomy are intended to carry out synthetic imaging using Michelson interferometers or direct (Fizeau) imaging with sparse apertures. Examples of these mission concepts include the Stellar Imager (SI), the Space Infrared Interferometric Telescope (SPIRIT), the Submillimeter Probe of the Evolution of Cosmic Structure (SPECS), and the Fourier-Kelvin Stellar Interferometer (FKSI). We have been developing computer-based simulators for these missions. These simulators are aimed at providing a quantitative evaluation of the imaging capabilities of the mission by modeling the performance on different realistic targets in terms of sensitivity, angular resolution, and dynamic range. Both Fizeau and Michelson modes of operation can be considered. Our work is based on adapting a computer simulator called imSIM, which was initially written for the Space Interferometry Mission, in order to simulate the imaging mode of new missions such as those listed. This report covers the activities we have undertaken to provide a preliminary version of a simulator for the SPIRIT mission concept.

  5. An investigation of the use of microcomputer-based laboratory simulations in promoting conceptual understanding in secondary physics instruction

    NASA Astrophysics Data System (ADS)

    Tomshaw, Stephen G.

    Physics education research has shown that students bring alternate conceptions to the classroom which can be quite resistant to traditional instruction methods (Clement, 1982; Halloun & Hestenes, 1985; McDermott, 1991). Microcomputer-based laboratory (MBL) experiments that employ an active-engagement strategy have been shown to improve student conceptual understanding in high school and introductory university physics courses (Thornton & Sokoloff, 1998). These MBL experiments require a specialized computer interface, type-specific sensors (e.g. motion detectors, force probes, accelerometers), and specialized software in addition to the standard physics experimental apparatus. Tao and Gunstone (1997) have shown that computer simulations used in an active engagement environment can also lead to conceptual change. This study investigated 69 secondary physics students' use of computer simulations of MBL activities in place of the hands-on MBL laboratory activities. The average normalized gain in students' conceptual understanding was measured using the Force and Motion Conceptual Evaluation (FMCE). Student attitudes towards physics and computers were probed using the Views About Science Survey (VASS) and the Computer Attitude Scale (CAS). While it may be possible to obtain an equivalent level of conceptual understanding using computer simulations in combination with an active-engagement environment, this study found no significant gains in students' conceptual understanding (⟨g⟩ = -0.02) after they completed a series of nine simulated experiments from the Tools for Scientific Thinking curriculum (Thornton & Sokoloff, 1990). The absence of gains in conceptual understanding may indicate that either the simulations were ineffective in promoting conceptual change or problems with the implementation of the treatment inhibited its effectiveness. There was a positive shift in students' attitudes towards physics in the VASS dimensions of structure and reflective thinking, while there was a negative shift in students' attitudes towards computers in the CAS subscales of anxiety and usefulness. The negative shift in attitudes towards computers may be due to the additional time and work required by the students to perform the simulation experiments with no apparent reward in terms of their physics grade. Suggestions for future research include a qualitative element to observe student interactions and alternate formats for the simulations themselves.

  6. The Structure and Properties of Silica Glass Nanostructures using Novel Computational Systems

    NASA Astrophysics Data System (ADS)

    Doblack, Benjamin N.

    The structure and properties of silica glass nanostructures are examined using computational methods in this work. Standard synthesis methods of silica and its associated material properties are first discussed in brief. A review of prior experiments on this amorphous material is also presented. Background and methodology for the simulation of mechanical tests on amorphous bulk silica and nanostructures are later presented. A new computational system for the accurate and fast simulation of silica glass is also presented, using an appropriate interatomic potential for this material within the open-source molecular dynamics computer program LAMMPS. This alternative computational method uses modern graphics processors, Nvidia CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model select materials, this enhancement allows the addition of accelerated molecular dynamics simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal of this project is to investigate the structure and size dependent mechanical properties of silica glass nanohelical structures under tensile MD conditions using the innovative computational system. Specifically, silica nanoribbons and nanosprings are evaluated which revealed unique size dependent elastic moduli when compared to the bulk material. For the nanoribbons, the tensile behavior differed widely between the models simulated, with distinct characteristic extended elastic regions. In the case of the nanosprings simulated, more clear trends are observed. In particular, larger nanospring wire cross-sectional radii (r) lead to larger Young's moduli, while larger helical diameters (2R) resulted in smaller Young's moduli. Structural transformations and theoretical models are also analyzed to identify possible factors which might affect the mechanical response of silica nanostructures under tension. The work presented outlines an innovative simulation methodology, and discusses how results can be validated against prior experimental and simulation findings. The ultimate goal is to develop new computational methods for the study of nanostructures which will make the field of materials science more accessible, cost effective and efficient.
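
    The size-dependent elastic moduli discussed above are typically extracted from the linear (elastic) portion of a simulated tensile stress-strain curve. The Python sketch below illustrates that post-processing step only; the synthetic data, the 2% elastic cutoff, and the units are assumptions for illustration and are not taken from the LAMMPS workflow used in the study.

      # Sketch: estimate Young's modulus from the elastic region of a tensile stress-strain curve.
      import numpy as np

      # Synthetic stand-in for MD output: a linear elastic response that softens at larger strain.
      rng = np.random.default_rng(0)
      strain = np.linspace(0.0, 0.10, 200)
      stress = 70.0 * strain - 250.0 * strain**2 + rng.normal(0.0, 0.05, strain.size)  # GPa

      elastic_cut = 0.02                      # assume the response is linear below 2% strain
      mask = strain <= elastic_cut
      modulus, _ = np.polyfit(strain[mask], stress[mask], 1)

      print(f"Estimated Young's modulus: {modulus:.1f} GPa")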

  7. Downscaling seasonal to centennial simulations on distributed computing infrastructures using WRF model. The WRF4G project

    NASA Astrophysics Data System (ADS)

    Cofino, A. S.; Fernández Quiruelas, V.; Blanco Real, J. C.; García Díez, M.; Fernández, J.

    2013-12-01

    Grid computing is nowadays a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. The WRF4G project objective is therefore to popularize the use of this technology in the atmospheric sciences. To achieve this objective, one of the most widely used applications was chosen (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hindcasts/forecasts, sensitivity studies, etc.). The WRF model is also used by many groups in the climate research community to carry out downscaling simulations, so this community will benefit as well. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for long periods of time, which makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for the monitoring and management of the simulations and the data. Thus, another objective of the WRF4G project is the development of a generic adaptation of WRF to DCIs. It should simplify access to the DCIs for researchers and also free them from the technical and computational aspects of using these DCIs. Finally, in order to demonstrate the ability of WRF4G to solve actual scientific challenges of interest and relevance to climate science (implying a high computational cost), we will show results from different kinds of downscaling experiments, such as ERA-Interim re-analysis, CMIP5 models, or seasonal forecasts. WRF4G is being used to run WRF simulations that contribute to the CORDEX initiative and to other projects such as SPECS and EUPORIAS. This work has been partially funded by the European Regional Development Fund (ERDF) and the Spanish National R&D Plan 2008-2011 (CGL2011-28864)

  8. Evaluation of the Community Multiscale Air Quality (CMAQ) Model Version 5.2

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Pr...

  9. Evaluation of the Community Multi-scale Air Quality Model Version 5.2

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Pr...

  10. Science Education: An Experiment in Facilitating the Learning of Neurophysiology.

    ERIC Educational Resources Information Center

    Levitan, Herbert

    1981-01-01

    Summarizes the experiences of a zoology professor attempting to construct a student-centered course in neurophysiology. Various aspects of the organization and conduct of the course are described, including the beginning experience, topics of interest, lecture, laboratory, computer simulation, examinations, student lectures. Evaluation of the…

  11. What We've Learned about Assessing Hands-On Science.

    ERIC Educational Resources Information Center

    Shavelson, Richard J.; Baxter, Gail P.

    1992-01-01

    A recent study compared hands-on scientific inquiry assessment to assessments involving lab notebooks, computer simulations, short-answer paper-and-pencil problems, and multiple-choice questions. Creating high quality performance assessments is a costly, time-consuming process requiring considerable scientific and technological know-how. Improved…

  12. Fast ray-tracing of human eye optics on Graphics Processing Units.

    PubMed

    Wei, Qi; Patkar, Saket; Pai, Dinesh K

    2014-05-01

    We present a new technique for simulating retinal image formation by tracing a large number of rays from objects in three dimensions as they pass through the optic apparatus of the eye to the retina. Simulating human optics is useful for understanding basic questions of vision science and for studying vision defects and their corrections. Because of the complexity of computing such simulations accurately, most previous efforts used simplified analytical models of the normal eye. This makes them less effective in modeling vision disorders associated with abnormal shapes of the ocular structures, which are hard to represent precisely with analytical surfaces. We have developed a computer simulator that can simulate ocular structures of arbitrary shapes, for instance represented by polygon meshes. Topographic and geometric measurements of the cornea, lens, and retina from keratometer or medical imaging data can be integrated for individualized examination. We utilize parallel processing using modern Graphics Processing Units (GPUs) to efficiently compute retinal images by tracing millions of rays. A stable retinal image can be generated within minutes. We simulated depth-of-field, accommodation, chromatic aberrations, as well as astigmatism and correction. We also show application of the technique in patient-specific vision correction by incorporating geometric models of the orbit reconstructed from clinical medical images. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
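
    The central operation in such a ray tracer is refraction of each ray at successive ocular surfaces. The sketch below shows the vector form of Snell's law for a single surface; it is a generic illustration, not the GPU implementation described in the paper, and the refractive indices and ray direction are placeholder values.

      # Sketch: refract a ray at a surface using the vector form of Snell's law.
      import numpy as np

      def refract(direction, normal, n1, n2):
          """Return the refracted unit direction, or None on total internal reflection.

          direction: incident unit vector; normal: unit surface normal pointing against
          the incident ray, so cos(theta_i) = -normal . direction is positive.
          """
          d = direction / np.linalg.norm(direction)
          n = normal / np.linalg.norm(normal)
          eta = n1 / n2
          cos_i = -np.dot(n, d)
          sin2_t = eta**2 * (1.0 - cos_i**2)
          if sin2_t > 1.0:
              return None                      # total internal reflection
          cos_t = np.sqrt(1.0 - sin2_t)
          return eta * d + (eta * cos_i - cos_t) * n

      # Example: an oblique ray entering the cornea (placeholder indices: air 1.0 -> cornea 1.376).
      ray = np.array([0.0, -0.2588, -0.9659])
      surface_normal = np.array([0.0, 0.0, 1.0])
      print(refract(ray, surface_normal, 1.0, 1.376))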

  13. Shock Interaction with Random Spherical Particle Beds

    NASA Astrophysics Data System (ADS)

    Neal, Chris; Mehta, Yash; Salari, Kambiz; Jackson, Thomas L.; Balachandar, S. "Bala"; Thakur, Siddharth

    2016-11-01

    In this talk we present results on fully resolved simulations of shock interaction with a randomly distributed bed of particles. Multiple simulations were carried out by varying the number of particles to isolate the effect of volume fraction. The major focus of these simulations was to understand 1) the effect of the shockwave and volume fraction on the forces experienced by the particles, 2) the effect of particles on the shock wave, and 3) fluid-mediated particle-particle interactions. The peak drag force for particles at different volume fractions shows a downward trend as the depth of the bed increases. This can be attributed to the dissipation of energy as the shockwave travels through the bed of particles. One of the fascinating observations from these simulations was the fluctuations in different quantities due to the presence of multiple particles and their random distribution. These are large simulations with hundreds of particles, resulting in a large amount of data. We present a statistical analysis of the data and make relevant observations. The average pressure in the computational domain is computed to characterize the strengths of the reflected and transmitted waves. We also present flow field contour plots to support our observations. U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA0002378.

  14. Laboratory Directed Research and Development FY2011 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, W; Sketchley, J; Kotta, P

    2012-03-22

    A premier applied-science laboratory, Lawrence Livermore National Laboratory (LLNL) has earned a reputation as a leader in providing science and technology solutions to the most pressing national and global security problems. The LDRD Program, established by Congress at all DOE national laboratories in 1991, is LLNL's most important single resource for fostering excellent science and technology for today's needs and tomorrow's challenges. LDRD's internally directed research and development funding at LLNL enables high-risk, potentially high-payoff projects at the forefront of science and technology. The LDRD Program at Livermore serves to: (1) Support the Laboratory's missions, strategic plan, and foundational science; (2) Maintain the Laboratory's science and technology vitality; (3) Promote recruiting and retention; (4) Pursue collaborations; (5) Generate intellectual property; and (6) Strengthen the U.S. economy. Myriad LDRD projects over the years have made important contributions to every facet of the Laboratory's mission and strategic plan, including its commitment to nuclear, global, and energy and environmental security, as well as cutting-edge science and technology and engineering in high-energy-density matter, high-performance computing and simulation, materials and chemistry at the extremes, information systems, measurements and experimental science, and energy manipulation. A summary of each project was submitted by the principal investigator. Project summaries include the scope, motivation, goals, relevance to DOE/NNSA and LLNL mission areas, the technical progress achieved in FY11, and a list of publications that resulted from the research. The projects are: (1) Nuclear Threat Reduction; (2) Biosecurity; (3) High-Performance Computing and Simulation; (4) Intelligence; (5) Cybersecurity; (6) Energy Security; (7) Carbon Capture; (8) Material Properties, Theory, and Design; (9) Radiochemistry; (10) High-Energy-Density Science; (11) Laser Inertial-Fusion Energy; (12) Advanced Laser Optical Systems and Applications; (13) Space Security; (14) Stockpile Stewardship Science; (15) National Security; (16) Alternative Energy; and (17) Climatic Change.

  15. Practical Unitary Simulator for Non-Markovian Complex Processes

    NASA Astrophysics Data System (ADS)

    Binder, Felix C.; Thompson, Jayne; Gu, Mile

    2018-06-01

    Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes that requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models. However, in contrast to previous models, it only requires a (small) finite-dimensional Hilbert space. Moreover, since the simulator operates unitarily throughout, it avoids any unnecessary information loss. We provide a stepwise construction of simulators for a large class of stochastic processes, hence directly opening the possibility for experimental implementations with current platforms for quantum computation. The results are illustrated for an example process.

  16. Extended Lagrangian Density Functional Tight-Binding Molecular Dynamics for Molecules and Solids.

    PubMed

    Aradi, Bálint; Niklasson, Anders M N; Frauenheim, Thomas

    2015-07-14

    A computationally fast quantum mechanical molecular dynamics scheme using an extended Lagrangian density functional tight-binding formulation has been developed and implemented in the DFTB+ electronic structure program package for simulations of solids and molecular systems. The scheme combines the computational speed of self-consistent density functional tight-binding theory with the efficiency and long-term accuracy of extended Lagrangian Born-Oppenheimer molecular dynamics. For systems without self-consistent charge instabilities, only a single diagonalization or construction of the single-particle density matrix is required in each time step. The molecular dynamics simulation scheme can be applied to a broad range of problems in materials science, chemistry, and biology.
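
    For readers unfamiliar with the extended Lagrangian idea, the toy sketch below propagates a scalar auxiliary "charge" time-reversibly alongside a velocity Verlet loop, so that only one inexpensive ground-state-like update is needed per step. It is a schematic illustration under simplifying assumptions (toy harmonic force, toy charge function, illustrative coupling constant) and is not the DFTB+ implementation.

      # Sketch: velocity Verlet MD with a toy extended Lagrangian auxiliary variable.
      import numpy as np

      def force(x):
          return -x                      # toy harmonic force (stand-in for DFTB forces)

      def ground_state_charge(x):
          return np.tanh(x)              # toy stand-in for the self-consistent charge

      dt, steps = 0.01, 1000
      x, v = 1.0, 0.0
      p_prev = p = ground_state_charge(x)   # auxiliary "charge" initialized at the ground state
      kappa = 1.8                           # coupling steering the auxiliary charge (illustrative value)

      for _ in range(steps):
          v += 0.5 * dt * force(x)
          x += dt * v
          v += 0.5 * dt * force(x)
          # Time-reversible propagation of the auxiliary charge toward the ground state; in a
          # real XL-BOMD run this propagated value seeds the electronic-structure step so that
          # a fully converged SCC cycle is not needed at every time step.
          q = ground_state_charge(x)
          p, p_prev = 2.0 * p - p_prev + kappa * (q - p), p

      print(f"position {x:.3f}, auxiliary charge {p:.3f}, ground-state charge {ground_state_charge(x):.3f}")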

  17. Applying GIS and high performance agent-based simulation for managing an Old World Screwworm fly invasion of Australia.

    PubMed

    Welch, M C; Kwan, P W; Sajeev, A S M

    2014-10-01

    Agent-based modelling has proven to be a promising approach for developing rich simulations for complex phenomena that provide decision support functions across a broad range of areas including biological, social and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and the dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries. Copyright © 2014 International Atomic Energy Agency. Published by Elsevier B.V. All rights reserved.
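
    As a stripped-down, CPU-only illustration of the kind of stochastic lifecycle-and-dispersal step described above, the sketch below advances a population of fly agents through survival, stage progression, and a random-walk dispersal of adults. All rates, stage counts, and dispersal distances are hypothetical placeholders, and the GPU (CUDA) and GIS layers of the actual system are omitted.

      # Sketch: one stochastic lifecycle + dispersal step for a population of fly agents.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000                                   # number of agents (hypothetical)

      x, y = rng.uniform(0, 100, n), rng.uniform(0, 100, n)   # positions in a toy 100 km domain
      stage = rng.integers(0, 4, n)                # 0=egg, 1=larva, 2=pupa, 3=gravid adult

      survival = np.array([0.7, 0.8, 0.9, 0.95])   # per-step survival by stage (placeholders)
      advance = np.array([0.5, 0.3, 0.4, 0.0])     # chance of progressing to the next stage

      def step(x, y, stage):
          alive = rng.random(stage.size) < survival[stage]
          x, y, stage = x[alive], y[alive], stage[alive]
          stage = stage + (rng.random(stage.size) < advance[stage])
          # Gravid adults disperse as a random walk (2 km standard deviation, placeholder).
          adults = stage == 3
          x[adults] += rng.normal(0.0, 2.0, adults.sum())
          y[adults] += rng.normal(0.0, 2.0, adults.sum())
          return x, y, stage

      for _ in range(20):
          x, y, stage = step(x, y, stage)
      print(f"surviving agents: {stage.size}, gravid adults: {(stage == 3).sum()}")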

  18. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need for a study assessing the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks for improving business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on the applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
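
    As a hedged illustration of the kind of Monte Carlo baseline-and-prediction exercise described above, the sketch below draws uncertain driver scores and propagates them through an assumed linear index model. The driver names, distributions, and weights are hypothetical placeholders and are not taken from the ACSI specification or the dissertation's dashboard.

      # Sketch: Monte Carlo propagation of uncertain drivers into a satisfaction index.
      import numpy as np

      rng = np.random.default_rng(42)
      n_trials = 100_000

      # Hypothetical driver scores on a 0-100 scale: (mean, standard deviation).
      drivers = {
          "perceived_quality": (78.0, 4.0),
          "customer_expectations": (72.0, 5.0),
          "perceived_value": (70.0, 6.0),
      }
      weights = {"perceived_quality": 0.5, "customer_expectations": 0.2, "perceived_value": 0.3}

      index = np.zeros(n_trials)
      for name, (mu, sigma) in drivers.items():
          samples = np.clip(rng.normal(mu, sigma, n_trials), 0.0, 100.0)
          index += weights[name] * samples

      lo, hi = np.percentile(index, [5, 95])
      print(f"predicted index: mean {index.mean():.1f}, 90% interval [{lo:.1f}, {hi:.1f}]")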

  19. Principles for Integrating Mars Analog Science, Operations, and Technology Research

    NASA Technical Reports Server (NTRS)

    Clancey, William J.

    2003-01-01

    During the Apollo program, the scientific community and NASA used terrestrial analog sites for understanding planetary features and for training astronauts to be scientists. Human factors studies (Harrison, Clearwater, & McKay 1991; Stuster 1996) have focused on the effects of isolation in extreme environments. More recently, with the advent of wireless computing, we have prototyped advanced EVA technologies for navigation, scheduling, and science data logging (Clancey 2002b; Clancey et al., in press). Combining these interests in a single expedition enables tremendous synergy and authenticity, as pioneered by Pascal Lee's Haughton-Mars Project (Lee 2001; Clancey 2000a) and the Mars Society's research stations on a crater rim on Devon Island in the High Canadian Arctic (Clancey 2000b; 2001b) and the Morrison Formation of southeast Utah (Clancey 2002a). Based on this experience, the following principles are proposed for conducting an integrated science, operations, and technology research program at analog sites: 1) Authentic work; 2) PI-based projects; 3) Unencumbered baseline studies; 4) Closed simulations; and 5) Observation and documentation. Following these principles, we have been integrating field science, operations research, and technology development at analog sites on Devon Island and in Utah over the past five years. Analytic methods include work practice simulation (Clancey 2002c; Sierhuis et al., 2000a,b), by which the interaction of human behavior, facilities, geography, tools, and procedures is formalized in computer models. These models are then converted into the runtime EVA system we call mobile agents (Clancey 2002b; Clancey et al., in press). Furthermore, we have found that the Apollo Lunar Surface Journal (Jones, 1999) provides a vast repository for understanding astronaut and CapCom interactions, serving as a baseline for Mars operations and quickly highlighting opportunities for computer automation (Clancey, in press).

  20. CFD - Mature Technology?

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2005-01-01

    Over the past 30 years, numerical methods and simulation tools for fluid dynamic problems have advanced as a new discipline, namely, computational fluid dynamics (CFD). Although a wide spectrum of flow regimes is encountered in many areas of science and engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to a large demand for predicting the aerodynamic performance characteristics of flight vehicles, such as commercial, military, and space vehicles. As flow analysis is required to be more accurate and computationally efficient for both commercial and mission-oriented applications (such as those encountered in meteorology, aerospace vehicle development, general fluid engineering, and biofluid analysis), CFD tools for engineering become increasingly important for predicting safety, performance, and cost. This paper presents the author's perspective on the maturity of CFD, especially from an aerospace engineering point of view.

  1. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.
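
    Two of the modeling applications mentioned above, simulation and model-fitting, can be illustrated outside the Distributome webapps with a few lines of SciPy. The sketch below is a generic example with an arbitrarily chosen Gamma family and parameter values; it does not use the Distributome interfaces themselves.

      # Sketch: simulate from a Gamma distribution, then re-fit its parameters from the sample.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      true_shape, true_scale = 2.5, 1.8

      sample = stats.gamma.rvs(true_shape, scale=true_scale, size=5_000, random_state=rng)

      # Maximum-likelihood fit (location fixed at zero for a two-parameter Gamma).
      shape_hat, _, scale_hat = stats.gamma.fit(sample, floc=0)
      print(f"fitted shape {shape_hat:.2f} (true {true_shape}), scale {scale_hat:.2f} (true {true_scale})")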

  2. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191

  3. Efficient Variational Quantum Simulator Incorporating Active Error Minimization

    NASA Astrophysics Data System (ADS)

    Li, Ying; Benjamin, Simon C.

    2017-04-01

    One of the key applications for quantum computers will be the simulation of other quantum systems that arise in chemistry, materials science, etc., in order to accelerate the process of discovery. It is important to ask the following question: Can this simulation be achieved using near-future quantum processors, of modest size and under imperfect control, or must it await the more distant era of large-scale fault-tolerant quantum computing? Here, we propose a variational method involving closely integrated classical and quantum coprocessors. We presume that all operations in the quantum coprocessor are prone to error. The impact of such errors is minimized by boosting them artificially and then extrapolating to the zero-error case. In comparison to a more conventional optimized Trotterization technique, we find that our protocol is efficient and appears to be fundamentally more robust against error accumulation.
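
    The error-boosting idea can be sketched classically: evaluate an observable at several artificially amplified noise strengths and extrapolate the trend back to zero noise. The toy example below uses synthetic data with an assumed linear noise model; it illustrates the extrapolation step only, not the variational quantum algorithm or any quantum hardware interface.

      # Sketch: Richardson-style zero-noise extrapolation with a toy linear noise model.
      import numpy as np

      rng = np.random.default_rng(7)
      exact_value = -1.137                       # hypothetical noiseless expectation value

      def noisy_measurement(boost):
          # Assumed model: bias grows linearly with the noise-boost factor, plus shot noise.
          return exact_value + 0.08 * boost + rng.normal(0.0, 0.002)

      boosts = np.array([1.0, 1.5, 2.0, 3.0])    # artificially amplified error rates
      values = np.array([noisy_measurement(b) for b in boosts])

      slope, intercept = np.polyfit(boosts, values, 1)
      print(f"raw (boost=1): {values[0]:.4f}, zero-noise extrapolation: {intercept:.4f}, exact: {exact_value}")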

  4. The Computational Science Education Reference Desk: A tool for increasing inquiry based learning in the science classroom

    NASA Astrophysics Data System (ADS)

    Joiner, D. A.; Stevenson, D. E.; Panoff, R. M.

    2000-12-01

    The Computational Science Reference Desk is an online tool designed to provide educators in math, physics, astronomy, biology, chemistry, and engineering with information on how to use computational science to enhance inquiry-based learning in the undergraduate and pre-college classroom. The Reference Desk features a showcase of original content exploration activities, including lesson plans and background materials; a catalog of websites that contain models, lesson plans, software, and instructional resources; and a forum to allow educators to communicate their ideas. Many of the recent advances in astronomy rely on the use of computer simulation, and tools are being developed by CSERD to allow students to experiment with some of the models that have guided scientific discovery. One of these models allows students to study how scientists use spectral information to determine the makeup of the interstellar medium by modeling the interstellar extinction curve using spherical grains of silicate, amorphous carbon, or graphite. Students can directly compare their model to the average interstellar extinction curve, and experiment with how small changes in their model alter the shape of the interstellar extinction curve. A simpler model allows students to visualize spatial relationships between the Earth, Moon, and Sun to understand the cause of the phases of the moon. A report on the usefulness of these models in two classes, the Computational Astrophysics workshop at The Shodor Education Foundation and the Conceptual Astronomy class at the University of North Carolina at Greensboro, will be presented.

  5. Computational simulations of the interaction of water waves with pitching flap-type ocean wave energy converters

    NASA Astrophysics Data System (ADS)

    Pathak, Ashish; Raessi, Mehdi

    2016-11-01

    Using an in-house computational framework, we have studied the interaction of water waves with pitching flap-type ocean wave energy converters (WECs). The computational framework solves the full 3D Navier-Stokes equations and captures important effects, including fluid-solid interaction and nonlinear and viscous effects. The results of the computational tool are first compared against experimental data on the response of a flap-type WEC in a wave tank, and excellent agreement is demonstrated. Further simulations at the model and prototype scales are presented to assess the validity of the Froude scaling. The simulations are used to address some important questions, such as the validity range of common WEC modeling approaches that rely heavily on the Froude scaling and the inviscid potential flow theory. Additionally, the simulations examine the role of the Keulegan-Carpenter (KC) number, which is often used as a measure of the relative importance of viscous drag on bodies exposed to oscillating flows. The performance of the flap-type WECs is investigated at various KC numbers to establish the relationship between the viscous drag and KC number for such a geometry. That is of significant importance because such a relationship only exists for simple geometries, e.g., a cylinder. Support from the National Science Foundation is gratefully acknowledged.
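
    The two non-dimensional ingredients mentioned above are simple to compute: the Keulegan-Carpenter number KC = U T / D and the Froude scaling of model quantities to prototype scale. The sketch below uses placeholder values for the flap width, wave period, velocity amplitude, and scale factor; it is illustrative and not taken from the paper.

      # Sketch: Keulegan-Carpenter number and Froude scaling from model to prototype.
      import math

      # Hypothetical model-scale quantities.
      U = 0.6        # velocity amplitude of the oscillating flow (m/s)
      T = 2.0        # wave period (s)
      D = 0.4        # characteristic flap width (m)
      s = 25.0       # geometric scale factor (prototype length / model length)

      KC = U * T / D
      print(f"Keulegan-Carpenter number (model): {KC:.2f}")

      # Froude scaling: lengths scale with s, times and velocities with sqrt(s), forces with s**3.
      # Note that KC = U*T/D is preserved under Froude scaling, whereas the Reynolds number is not.
      T_proto = T * math.sqrt(s)
      U_proto = U * math.sqrt(s)
      print(f"prototype period {T_proto:.1f} s, prototype velocity {U_proto:.2f} m/s")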

  6. Evaluation of the Community Multi-scale Air Quality (CMAQ) Model Version 5.1

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Pr...

  7. Overview and Evaluation of the Community Multiscale Air Quality Model Version 5.2

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Pr...

  8. Evaluation of the Community Multi-scale Air Quality (CMAQ) Model Version 5.2

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Pr...

  9. Rethinking Technology-Enhanced Physics Teacher Education: From Theory to Practice

    ERIC Educational Resources Information Center

    Milner-Bolotin, Marina

    2016-01-01

    This article discusses how modern technology, such as electronic response systems, PeerWise system, data collection and analysis tools, computer simulations, and modeling software can be used in physics methods courses to promote teacher-candidates' professional competencies and their positive attitudes about mathematics and science education. We…

  10. An On-Line Classroom for the Unix Environment.

    ERIC Educational Resources Information Center

    Scigliano, John A.; And Others

    This paper describes an electronic classroom (ECR) program that has been developed at Nova University to facilitate online real-time group instruction in graduate degree programs in information and computer science. The first section describes the educational uses of the program, including the simulation of a classroom-type educational setting…

  11. Virtual Reality: A Dream Come True or a Nightmare.

    ERIC Educational Resources Information Center

    Cornell, Richard; Bailey, Dan

    Virtual Reality (VR) is a new medium which allows total stimulation of one's senses through human/computer interfaces. VR has applications in training simulators, nano-science, medicine, entertainment, electronic technology, and manufacturing. This paper focuses on some current and potential problems of virtual reality and virtual environments…

  12. Collaborative Approach in Software Engineering Education: An Interdisciplinary Case

    ERIC Educational Resources Information Center

    Vicente, Aileen Joan; Tan, Tiffany Adelaine; Yu, Alvin Ray

    2018-01-01

    Aim/Purpose: This study was aimed at enhancing students' learning of software engineering methods. A collaboration between the Computer Science, Business Management, and Product Design programs was formed to work on actual projects with real clients. This interdisciplinary form of collaboration simulates the realities of a diverse Software…

  13. Molecular mechanics and dynamics characterization of an in silico mutated protein: a stand-alone lab module or support activity for in vivo and in vitro analyses of targeted proteins.

    PubMed

    Chiang, Harry; Robinson, Lucy C; Brame, Cynthia J; Messina, Troy C

    2013-01-01

    Over the past 20 years, the biological sciences have increasingly incorporated chemistry, physics, computer science, and mathematics to aid in the development and use of mathematical models. Such combined approaches have been used to address problems from protein structure-function relationships to the workings of complex biological systems. Computer simulations of molecular events can now be accomplished quickly and with standard computer technology. Also, simulation software is freely available for most computing platforms, and online support for the novice user is ample. We have therefore created a molecular dynamics laboratory module to enhance undergraduate student understanding of molecular events underlying organismal phenotype. This module builds on a previously described project in which students use site-directed mutagenesis to investigate functions of conserved sequence features in members of a eukaryotic protein kinase family. In this report, we detail the laboratory activities of a MD module that provide a complement to phenotypic outcomes by providing a hypothesis-driven and quantifiable measure of predicted structural changes caused by targeted mutations. We also present examples of analyses students may perform. These laboratory activities can be integrated with genetics or biochemistry experiments as described, but could also be used independently in any course that would benefit from a quantitative approach to protein structure-function relationships. Copyright © 2013 Wiley Periodicals, Inc.
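
    One quantifiable measure students commonly extract from such trajectories is the backbone RMSD of the mutant relative to a reference structure. The sketch below computes RMSD for coordinate arrays that are assumed to be already superposed; the synthetic data are placeholders, and the snippet is not tied to any particular simulation package or to the specific module described above.

      # Sketch: RMSD of trajectory frames against a reference (coordinates assumed pre-aligned).
      import numpy as np

      def rmsd(coords, reference):
          """Root-mean-square deviation between two (N, 3) coordinate arrays."""
          diff = coords - reference
          return np.sqrt((diff * diff).sum() / len(reference))

      # Synthetic stand-in data: a 100-atom reference and 50 "frames" that slowly drift away from it.
      rng = np.random.default_rng(3)
      reference = rng.normal(0.0, 10.0, (100, 3))
      trajectory = reference + np.cumsum(rng.normal(0.0, 0.05, (50, 100, 3)), axis=0)

      values = np.array([rmsd(frame, reference) for frame in trajectory])
      print(f"mean RMSD {values.mean():.2f}, final RMSD {values[-1]:.2f}")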

  14. Topics in computational physics

    NASA Astrophysics Data System (ADS)

    Monville, Maura Edelweiss

    Computational Physics spans a broad range of applied fields extending beyond the border of traditional physics tracks. Demonstrated flexibility and capability to switch to a new project, and pick up the basics of the new field quickly, are among the essential requirements for a computational physicist. In line with the above mentioned prerequisites, my thesis described the development and results of two computational projects belonging to two different applied science areas. The first project is a Materials Science application. It is a prescription for an innovative nano-fabrication technique that is built out of two other known techniques. The preliminary results of the simulation of this novel nano-patterning fabrication method show an average improvement, roughly equal to 18%, with respect to the single techniques it draws on. The second project is a Homeland Security application aimed at preventing smuggling of nuclear material at ports of entry. It is concerned with a simulation of an active material interrogation system based on the analysis of induced photo-nuclear reactions. This project consists of a preliminary evaluation of the photo-fission implementation in the more robust radiation transport Monte Carlo codes, followed by the customization and extension of MCNPX, a Monte Carlo code developed in Los Alamos National Laboratory, and MCNP-PoliMi. The final stage of the project consists of testing the interrogation system against some real world scenarios, for the purpose of determining the system's reliability, material discrimination power, and limitations.

  15. Computer Applications in Health Science Education.

    PubMed

    Juanes, Juan A; Ruisoto, Pablo

    2015-09-01

    In recent years, computer application development has experienced exponential growth, not only in the number of publications but also in the scope or contexts that have benefited from its use. In health science training, and medicine specifically, the gradual incorporation of technological developments has transformed the teaching and learning process, resulting in true "educational technology". The goal of this paper is to review the main features involved in these applications and highlight the main lines of research for the future. The results of peer-reviewed literature published recently indicate the following features shared by the key technological developments in the field of health science education: first, development of simulation and visualization systems for a more complete and realistic representation of learning material than the traditional paper format; second, portability and versatility of the applications, adapted for an increasing number of devices and operating systems; third, an increasing focus on open-source applications such as Massive Open Online Courses (MOOCs).

  16. The SEL Adapts to Meet Changing Times

    NASA Technical Reports Server (NTRS)

    Pajerski, Rose S.; Basili, Victor R.

    1997-01-01

    Since 1976, the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD) at Goddard Space Flight Center, develops, maintains, and manages complex flight dynamics systems. It has done this by developing and refining a continual process improvement approach that allows an organization such as the FDD to fine-tune its process for its particular domain. Experimental software engineering and measurement play a significant role in this approach. The SEL is a partnership of NASA Goddard, its major software contractor, Computer Sciences Corporation (CSC), and the University of Maryland's (UM) Department of Computer Science. The FDD primarily builds software systems that provide ground-based flight dynamics support for scientific satellites. They fall into two sets: ground systems and simulators. Ground systems are midsize systems that average around 250 thousand source lines of code (KSLOC). Ground system development projects typically last 1-2 years. Recent systems have been rehosted to workstations from IBM mainframes, and also contain significant new subsystems written in C and C++. The simulators are smaller systems averaging around 60 KSLOC that provide the test data for the ground systems. Simulator development lasts up to 1 year. Most of the simulators have been built in Ada on workstations. The SEL is responsible for the management and continual improvement of the software engineering processes used on these FDD projects.

  17. Toward 10-km mesh global climate simulations

    NASA Astrophysics Data System (ADS)

    Ohfuchi, W.; Enomoto, T.; Takaya, K.; Yoshioka, M. K.

    2002-12-01

    An atmospheric general circulation model (AGCM) that runs very efficiently on the Earth Simulator (ES) was developed. The ES is a gigantic vector-parallel computer with a peak performance of 40 Tflops. The AGCM, named AFES (AGCM for ES), was based on version 5.4.02 of an AGCM developed jointly by the Center for Climate System Research of the University of Tokyo and the Japanese National Institute for Environmental Studies. AFES was, however, totally rewritten in FORTRAN90 and MPI, while the original AGCM was written in FORTRAN77 and was not capable of parallel computing. AFES achieved 26 Tflops (about 65% of the peak performance of the ES) at a resolution of T1279L96 (10-km horizontal resolution and 500-m vertical resolution from the middle troposphere to the lower stratosphere). Some results of 10- to 20-day global simulations will be presented. At this moment, only short-term simulations are possible due to data storage limitations. Now that tens-of-teraflops computing has been achieved, petabyte data storage is necessary to conduct climate-type simulations at this super-high global resolution. Some possibilities for future research topics in global super-high resolution climate simulations will be discussed. Target topics include mesoscale structures and self-organization of the Baiu-Meiyu front over Japan, cyclogenesis over the North Pacific, and typhoons around the Japan area. Improvements in local precipitation with increasing horizontal resolution will also be demonstrated.
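
    The quoted 10-km horizontal resolution can be recovered from the spectral truncation with a back-of-the-envelope calculation. The sketch below assumes a quadratic (alias-free) Gaussian transform grid with roughly 3(T+1) longitudes; that grid choice is an assumption for illustration, not a statement about the actual AFES grid.

      # Sketch: rough equatorial grid spacing implied by a triangular spectral truncation.
      EARTH_CIRCUMFERENCE_KM = 40_075.0
      T = 1279                                   # triangular truncation (T1279)

      n_lon_quadratic = 3 * (T + 1)              # assumed quadratic (alias-free) transform grid
      dx_km = EARTH_CIRCUMFERENCE_KM / n_lon_quadratic
      print(f"{n_lon_quadratic} longitudes -> about {dx_km:.1f} km at the equator")
      # Prints roughly 10.4 km, consistent with the quoted 10-km horizontal resolution.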

  18. Scout: high-performance heterogeneous computing made simple

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablin, James; Mc Cormick, Patrick; Herlihy, Maurice

    2011-01-26

    Researchers must often write their own simulation and analysis software. During this process they simultaneously confront both computational and scientific problems. Current strategies for aiding the generation of performance-oriented programs do not abstract the software development from the science. Furthermore, the problem is becoming increasingly complex and pressing with the continued development of many-core and heterogeneous (CPU-GPU) architectures. To achieve high performance, scientists must expertly navigate both software and hardware. Co-design between computer scientists and research scientists can alleviate but not solve this problem. The science community requires better tools for developing, optimizing, and future-proofing codes, allowing scientists to focus on their research while still achieving high computational performance. Scout is a parallel programming language and extensible compiler framework targeting heterogeneous architectures. It provides the abstraction required to buffer scientists from the constantly-shifting details of hardware while still realizing high performance by encapsulating software and hardware optimization within a compiler framework.

  19. Virtual Observatory and Distributed Data Mining

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.

    2012-03-01

    New modes of discovery are enabled by the growth of data and computational resources (i.e., cyberinfrastructure) in the sciences. This cyberinfrastructure includes structured databases, virtual observatories (distributed data, as described in Section 20.2.1 of this chapter), high-performance computing (petascale machines), distributed computing (e.g., the Grid, the Cloud, and peer-to-peer networks), intelligent search and discovery tools, and innovative visualization environments. Data streams from experiments, sensors, and simulations are increasingly complex and growing in volume. This is true in most sciences, including astronomy, climate simulations, Earth observing systems, remote sensing data collections, and sensor networks. At the same time, we see an emerging confluence of new technologies and approaches to science, most clearly visible in the growing synergism of the four modes of scientific discovery: sensors-modeling-computing-data (Eastman et al. 2005). This has been driven by numerous developments, including the information explosion, development of large-array sensors, acceleration in high-performance computing (HPC) power, advances in algorithms, and efficient modeling techniques. Among these, the most extreme is the growth in new data. Specifically, the acquisition of data in all scientific disciplines is rapidly accelerating and causing a data glut (Bell et al. 2007). It has been estimated that data volumes double every year—for example, the NCSA (National Center for Supercomputing Applications) reported that their users cumulatively generated one petabyte of data over the first 19 years of NCSA operation, but they then generated their next one petabyte in the next year alone, and the data production has been growing by almost 100% each year after that (Butler 2008). The NCSA example is just one of many demonstrations of the exponential (annual data-doubling) growth in scientific data collections. In general, this putative data-doubling is an inevitable result of several compounding factors: the proliferation of data-generating devices, sensors, projects, and enterprises; the 18-month doubling of the digital capacity of these microprocessor-based sensors and devices (commonly referred to as "Moore’s law"); the move to digital for nearly all forms of information; the increase in human-generated data (both unstructured information on the web and structured data from experiments, models, and simulation); and the ever-expanding capability of higher density media to hold greater volumes of data (i.e., data production expands to fill the available storage space). These factors are consequently producing an exponential data growth rate, which will soon (if not already) become an insurmountable technical challenge even with the great advances in computation and algorithms. This technical challenge is compounded by the ever-increasing geographic dispersion of important data sources—the data collections are not stored uniformly at a single location, or with a single data model, or in uniform formats and modalities (e.g., images, databases, structured and unstructured files, and XML data sets)—the data are in fact large, distributed, heterogeneous, and complex. The greatest scientific research challenge with these massive distributed data collections is consequently extracting all of the rich information and knowledge content contained therein, thus requiring new approaches to scientific research. 
This emerging data-intensive and data-oriented approach to scientific research is sometimes called discovery informatics or X-informatics (where X can be any science, such as bio, geo, astro, chem, eco, or anything; Agresti 2003; Gray 2003; Borne 2010). This data-oriented approach to science is now recognized by some (e.g., Mahootian and Eastman 2009; Hey et al. 2009) as the fourth paradigm of research, following (historically) experiment/observation, modeling/analysis, and computational science.
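
    As a back-of-the-envelope illustration (not from the chapter), the snippet below contrasts the putative annual data-doubling with the roughly 18-month doubling commonly attributed to Moore's law; the starting volume of 1 TB is an arbitrary assumption.

        # Toy illustration: exponential growth under two doubling periods.
        def volume_after(years, start_tb=1.0, doubling_period_years=1.0):
            """Data volume (TB) after `years` of exponential growth."""
            return start_tb * 2 ** (years / doubling_period_years)

        for years in (1, 5, 10):
            annual = volume_after(years, doubling_period_years=1.0)   # annual doubling
            moore = volume_after(years, doubling_period_years=1.5)    # ~18-month doubling
            print(f"{years:2d} yr: annual doubling -> {annual:10.1f} TB, "
                  f"18-month doubling -> {moore:10.1f} TB")

    After a decade the annual-doubling curve is roughly an order of magnitude above the 18-month curve, which is the gap between data growth and device capacity that the abstract describes.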

  20. Making On-line Science Course Materials Easily Translatable and Accessible Worldwide: Challenges and Solutions

    NASA Astrophysics Data System (ADS)

    Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.

    2012-02-01

    The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations usable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET project team overcame this challenge by creating the Translation Utility. This tool allows a person fluent in both English and another language to easily translate any of the PhET simulations, and it requires minimal computer expertise. In this paper we discuss the technical issues involved in this software solution, as well as the issues involved in obtaining accurate translations. We share our solutions to many of the unexpected problems we encountered that would apply generally to making on-line scientific course materials available in many different languages, including working with languages written right-to-left, different character sets, and different conventions for expressing equations, variables, units, and scientific notation.
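
    The underlying pattern is a per-language catalog of translated strings looked up by key, with English as a fallback for untranslated entries. The sketch below is a hypothetical illustration of that pattern only; the keys, strings, and right-to-left handling are invented for this example and are not PhET's actual string files or API.

        # Hypothetical key-based string lookup with English fallback.
        STRINGS = {
            "en": {"energy.label": "Energy", "units.joule": "J"},
            "ar": {"energy.label": "\u0637\u0627\u0642\u0629"},   # Arabic, rendered right-to-left
        }
        RTL_LOCALES = {"ar", "he", "fa"}

        def tr(key, locale):
            """Look up a translated string, falling back to English if missing."""
            return STRINGS.get(locale, {}).get(key, STRINGS["en"][key])

        print(tr("energy.label", "ar"), tr("units.joule", "ar"))  # second key falls back to English
        print("right-to-left layout?", "ar" in RTL_LOCALES)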

  1. FACE-IT. A Science Gateway for Food Security Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montella, Raffaele; Kelly, David; Xiong, Wei

    Progress in sustainability science is hindered by challenges in creating and managing complex data acquisition, processing, simulation, post-processing, and intercomparison pipelines. To address these challenges, we developed the Framework to Advance Climate, Economic, and Impact Investigations with Information Technology (FACE-IT) for crop and climate impact assessments. This integrated data processing and simulation framework enables data ingest from geospatial archives; data regridding, aggregation, and other processing prior to simulation; large-scale climate impact simulations with agricultural and other models, leveraging high-performance and cloud computing; and post-processing to produce aggregated yields and ensemble variables needed for statistics, for model intercomparison, and to connect biophysical models to global and regional economic models. FACE-IT leverages the capabilities of the Globus Galaxies platform to enable the capture of workflows and outputs in well-defined, reusable, and comparable forms. We describe FACE-IT and applications within the Agricultural Model Intercomparison and Improvement Project and the Center for Robust Decision-making on Climate and Energy Policy.
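
    The pipeline stages named in this abstract (ingest, regrid, aggregate, simulate, post-process) can be thought of as composable functions applied in sequence. The sketch below uses invented stage names and toy data purely for illustration; it is not FACE-IT's or Globus Galaxies' actual API.

        # Illustrative pipeline composition with placeholder stages.
        from functools import reduce

        def ingest(source):      return {"source": source, "grid": "0.5deg", "values": [1.0, 2.0, 3.0]}
        def regrid(ds):          return {**ds, "grid": "1.0deg"}
        def aggregate(ds):       return {**ds, "mean": sum(ds["values"]) / len(ds["values"])}
        def simulate_yield(ds):  return {**ds, "yield": 2.5 * ds["mean"]}   # stand-in crop model
        def postprocess(ds):     return {"source": ds["source"], "yield": ds["yield"]}

        stages = [regrid, aggregate, simulate_yield, postprocess]
        result = reduce(lambda data, stage: stage(data), stages, ingest("geospatial-archive"))
        print(result)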

  2. BCTC for Windows: Abstract of Issue 9903W

    NASA Astrophysics Data System (ADS)

    Whisnant, David M.; McCormick, James A.

    1999-05-01

    BCTC for Windows was originally published by JCE Software in 1992 (1) in Series B for PC-compatible (MS-DOS) computers. JCE Software is now re-releasing BCTC for Windows as issue 9903W to make it more accessible to Windows users, especially those running Windows 95 and Windows 98, while we continue to phase out Series B (DOS) issues. Aside from a new Windows-compatible installation program, BCTC is unchanged. BCTC is an environmental simulation modeled after the dioxin controversy (2). In the simulation, students are involved in the investigation of a suspected carcinogen called BCTC, which has been found in a river below a chemical plant and above the water supply of a nearby city. The students have the option of taking water samples, analyzing the water (for BCTC, oxygen, metals, and pesticides), determining LD50s in an animal lab, visiting a library, making economic analyses, and conferring with colleagues, all using the computer. In the classroom, BCTC gives students experience with science in the context of a larger social and political problem. It can serve as the basis for a scientific report, class discussion, or a role-playing exercise (3). Because it requires no previous laboratory experience, this simulation can be used by students in middle and high school science classes, or in college courses for non-science majors. It also has been used in introductory chemistry courses for science majors. One of the intentions of BCTC is to involve students in an exercise (2) that closely approximates what scientists do. The realistic pictures, many of them captured with a video camera, create an atmosphere that furthers this goal. BCTC also reflects the comments of teachers who have used the program (4) and accounts of dioxin research (5). (Figure caption: Screen from BCTC showing the location where the effluent enters the river, the city, and the city water supply.)

    Acknowledgments: Support for this project was provided by NSF Grant USE-9151873 and by a BellSouth Foundation Grant.

    Literature Cited: 1. Whisnant, D. M.; McCormick, J. A. BCTC for Windows; J. Chem. Educ. Software 1992, 5B2. 2. Whisnant, D. M. J. Chem. Educ. 1984, 61, 627-629. 3. Whisnant, D. M. J. Chem. Educ. 1992, 69, 42. 4. Camille and Henry Dreyfus Institute on the Chemistry of Water, 1990; Institute for Chemical Education Summer Workshops, University of Wisconsin-Madison, 1991. 5. Roberts, L. Science 1991, 251, 624-626; ibid., 254, 377.

    Keywords: Computer Room; Simulation; High School; General; General Science; Environmental Chemistry; Chemistry and Society; Water Chemistry

    Hardware and Software Requirements for BCTC for Windows

  3. Integrating Xgrid into the HENP distributed computing model

    NASA Astrophysics Data System (ADS)

    Hajdu, L.; Kocoloski, A.; Lauret, J.; Miller, M.

    2008-07-01

    Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide users with a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), making task and job submission effortless for those users already using the tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.

  4. Virtual Disaster Simulation: Lesson Learned from an International Collaboration That Can Be Leveraged for Disaster Education in Iran.

    PubMed

    Ardalan, Ali; Balikuddembe, Joseph Kimuli; Ingrassia, Pier Luigi; Carenzo, Luca; Della Corte, Francesco; Akbarisari, Ali; Djalali, Ahmadreza

    2015-07-13

    Disaster education needs innovative methods that are more effective than traditional approaches; virtual simulation is one such method. This article presents an experience of using virtual simulation methods to teach health professionals about disaster medicine in Iran. The workshop on the "Application of New Technologies in Disaster Management Simulation" was held in Tehran in January 2015. It was co-organized by the Disaster and Emergency Health Academy of Tehran University of Medical Sciences and the Research Center in Emergency and Disaster Medicine and Computer Science Applied to Medicine (CRIMEDIM), Università del Piemonte Orientale. Different simulators were used by the participants, who were from the health system and other relevant fields, both inside and outside Iran. As a result of the workshop, all the concerned stakeholders are called on to support this new initiative of incorporating virtual training and exercise simulation into the field of disaster medicine, so that its professionals are equipped with field-based, practical skills in Iran and elsewhere. Virtual simulation technology is recommended for use in disaster management education. This requires building instructor capacity and providing the necessary technologies. International collaboration can facilitate this process.

  5. Thermodynamic forces in coarse-grained simulations

    NASA Astrophysics Data System (ADS)

    Noid, William

    Atomically detailed molecular dynamics simulations have profoundly advanced our understanding of the structure and interactions in soft condensed phases. Nevertheless, despite dramatic advances in the methodology and resources for simulating atomically detailed models, low-resolution coarse-grained (CG) models play a central and rapidly growing role in science. CG models empower researchers not only to investigate phenomena beyond the scope of atomically detailed simulations, but also to precisely tailor models for specific phenomena. However, in contrast to atomically detailed simulations, which evolve on a potential energy surface, CG simulations should evolve on a free energy surface. Therefore, the forces in CG models should reflect the thermodynamic information that has been eliminated from the CG configuration space. As a consequence of these thermodynamic forces, CG models often demonstrate limited transferability and, moreover, rarely provide an accurate description of both structural and thermodynamic properties. In this talk, I will present a framework that clarifies the origin and impact of these thermodynamic forces. Additionally, I will present computational methods for quantifying these forces and incorporating their effects into CG MD simulations. As time allows, I will demonstrate applications of this framework for liquids, polymers, and interfaces. We gratefully acknowledge the support of the National Science Foundation via CHE 1565631.
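
    For context, the free energy surface referred to here is conventionally written as the many-body potential of mean force over the CG configuration R, and the thermodynamic force on a CG site is its negative gradient. A standard form (notation assumed: u(r) is the atomistic potential, M the mapping from atomistic configuration r to CG configuration R, and the constant collects configuration-independent terms):

        W(\mathbf{R}) = -k_{B}T \,\ln \int \mathrm{d}\mathbf{r}\;
            \delta\!\left(M(\mathbf{r}) - \mathbf{R}\right)\, e^{-u(\mathbf{r})/k_{B}T} + \text{const},
        \qquad
        \mathbf{F}_{I}(\mathbf{R}) = -\,\frac{\partial W(\mathbf{R})}{\partial \mathbf{R}_{I}}

    Because W depends on temperature and on the degrees of freedom that were integrated out, forces derived from it need not transfer across state points, which is the transferability limitation the abstract notes.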

  6. Local Aqueous Solvation Structure Around Ca2+ During Ca2+---Cl– Pair Formation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, Marcel D.; Mundy, Christopher J.

    2016-03-03

    The molecular details of single ion solvation around Ca2+ and ion-pairing of Ca2+-Cl- are investigated using ab initio molecular dynamics. The use of empirical dispersion corrections to the BLYP functional is investigated by comparison to experimentally available extended X-ray absorption fine structure (EXAFS) measurements, which probe the first solvation shell in great detail. Besides finding differences in the free energy for both ion-pairing and the coordination number of ion solvation between the quantum and classical descriptions of interaction, there were important differences found between dispersion-corrected and uncorrected density functional theory (DFT). Specifically, we show significantly different free-energy landscapes for both the coordination number of Ca2+ and its ion-pairing with Cl-, depending on the DFT simulation protocol. Our findings provide a self-consistent treatment of the short-range solvent response to the ion and the intermediate- to long-range collective electrostatic response of the ion-ion interaction, producing a detailed picture of ion-pairing that is consistent with experiment. MDB is supported by the MS3 (Materials Synthesis and Simulation Across Scales) Initiative at Pacific Northwest National Laboratory. It was conducted under the Laboratory Directed Research and Development Program at PNNL, a multiprogram national laboratory operated by Battelle for the U.S. Department of Energy. CJM acknowledges support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. This research used resources of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. Additional computing resources were generously allocated by PNNL's Institutional Computing program. The authors thank Prof. Tom Beck for discussions regarding QCT, and Drs. Greg Schenter and Shawn Kathmann for insightful comments.

  7. Bringing history to life: simulating landmark experiments in psychology.

    PubMed

    Boynton, David M; Smith, Laurence D

    2006-05-01

    The course in history of psychology can be challenging for students, many of whom enter it with little background in history and are faced with unfamiliar names and concepts. The sheer volume of material can encourage passive memorization unless efforts are made to increase student involvement. As part of a trend toward experiential history, historians of science have begun to supplement their lectures with demonstrations of classic physics experiments as a way to bring the history of science to life. Here, the authors report on computer simulations of five landmark experiments from early experimental psychology in the areas of reaction time, span of attention, and apparent motion. The simulations are designed not only to permit hands-on replication of historically important results but also to reproduce the experimental procedures closely enough that students can gain a feel for the nature of early research and the psychological processes being studied.

  8. FLASH Interface; a GUI for managing runtime parameters in FLASH simulations

    NASA Astrophysics Data System (ADS)

    Walker, Christopher; Tzeferacos, Petros; Weide, Klaus; Lamb, Donald; Flocke, Norbert; Feister, Scott

    2017-10-01

    We present FLASH Interface, a novel graphical user interface (GUI) for managing runtime parameters in simulations performed with the FLASH code. FLASH Interface supports full text search of available parameters; provides descriptions of each parameter's role and function; allows for the filtering of parameters based on categories; performs input validation; and maintains all comments and non-parameter information already present in existing parameter files. The GUI can be used to edit existing parameter files or generate new ones. FLASH Interface is open source and was implemented with the Electron framework, making it available on Mac OSX, Windows, and Linux operating systems. The new interface lowers the entry barrier for new FLASH users and provides an easy-to-use tool for experienced FLASH simulators. U.S. Department of Energy (DOE), NNSA ASC/Alliances Center for Astrophysical Thermonuclear Flashes, U.S. DOE NNSA ASC through the Argonne Institute for Computing in Science, U.S. National Science Foundation.
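
    FLASH runtime parameters live in plain-text parameter files of name = value lines with # comments (the format is assumed here in simplified form). A minimal sketch of the parse step such an interface must perform, keeping comments and non-parameter lines intact so they can be written back unchanged, might look as follows; the file contents and parameter names are illustrative only.

        import re

        def parse_par(text):
            """Parse 'name = value' lines; keep comments and blank lines verbatim."""
            params, passthrough = {}, []
            for line in text.splitlines():
                m = re.match(r"\s*(\w+)\s*=\s*(.+?)\s*$", line)
                if m and not line.lstrip().startswith("#"):
                    params[m.group(1)] = m.group(2)
                else:
                    passthrough.append(line)    # comments / non-parameter information
            return params, passthrough

        sample = '# Sedov test\nbasenm = "sedov_"\nlrefine_max = 6\n'
        params, other = parse_par(sample)
        assert params["lrefine_max"] == "6"
        print(params, other)

    Input validation and category filtering, as described above, would then operate on the params dictionary against a table of known parameter names, types, and descriptions.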

  9. Trinity Phase 2 Open Science: CTH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruggirello, Kevin Patrick; Vogler, Tracy

    CTH is an Eulerian hydrocode developed by Sandia National Laboratories (SNL) to solve a wide range of shock wave propagation and material deformation problems. Adaptive mesh refinement is also used to improve efficiency for problems with a wide range of spatial scales. The code has a history of running on a variety of computing platforms ranging from desktops to massively parallel distributed-data systems. For the Trinity Phase 2 Open Science campaign, CTH was used to study mesoscale simulations of the hypervelocity penetration of granular SiC powders. The simulations were compared to experimental data. A scaling study of CTH up to 8192 KNL nodes was also performed, and several improvements were made to the code to improve the scalability.

  10. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    NASA Astrophysics Data System (ADS)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks presents a myriad of challenges when running on a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large-scale simulation and analysis work are commonplace and provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
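
    A minimal sketch of the task-parallel MPI pattern mentioned above, with hypothetical file names and a stand-in for the per-file analysis; this is illustrative only and is not the CASCADE tooling itself.

        # Each MPI rank processes its share of input files independently.
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        files = [f"cam5_run_{i:03d}.nc" for i in range(256)]   # hypothetical inputs
        my_files = files[rank::size]                           # round-robin task assignment

        local_count = len(my_files)   # stand-in for per-file analysis (e.g., extremes detection)
        total = comm.reduce(local_count, op=MPI.SUM, root=0)
        if rank == 0:
            print(f"analyzed {total} files across {size} ranks")

    Launched with, for example, mpirun -np 64 python analyze.py, the same pattern extends to wrapping data staging and publication steps as per-rank tasks.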

  11. Atomistic Simulations of High-intensity XFEL Pulses on Diffractive Imaging of Nano-sized System Dynamics

    NASA Astrophysics Data System (ADS)

    Ho, Phay; Knight, Christopher; Bostedt, Christoph; Young, Linda; Tegze, Miklos; Faigel, Gyula

    2016-05-01

    We have developed a large-scale atomistic computational method based on a combined Monte Carlo and Molecular Dynamics (MC/MD) method to simulate XFEL-induced radiation damage dynamics of complex materials. The MD algorithm is used to propagate the trajectories of electrons, ions, and atoms forward in time, and the quantum nature of interactions with an XFEL pulse is accounted for by an MC method to calculate probabilities of electronic transitions. Our code has good scalability with MPI/OpenMP parallelization, and it has been run on Mira, a petascale system at the Argonne Leadership Computing Facility, with particle number >50 million. Using this code, we have examined the impact of high-intensity 8-keV XFEL pulses on the x-ray diffraction patterns of argon clusters. The obtained patterns show strong pulse parameter dependence, providing evidence of significant lattice rearrangement and diffuse scattering. Real-space electronic reconstruction was performed using phase retrieval methods. We found that the structure of the argon cluster can be recovered with atomic resolution even in the presence of considerable radiation damage. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Chemical Sciences, Geosciences, and Biosciences Division.
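
    The general structure of such a hybrid scheme is a classical propagation step interleaved with a stochastic draw that decides whether a quantum transition fires during that step. The sketch below is schematic only; the transition rate, time step, and simplified integrator are assumptions for illustration, not the authors' code.

        import math, random

        def transition_probability(rate_per_fs, dt_fs):
            """Probability that a transition fires during one MD step of length dt."""
            return 1.0 - math.exp(-rate_per_fs * dt_fs)

        def md_mc_step(x, v, f, mass, dt_fs, rate_per_fs, rng=random):
            """One illustrative step: classical kick/drift plus a Monte Carlo draw."""
            v = [vi + fi / mass * dt_fs for vi, fi in zip(v, f)]   # kick (simplified Euler)
            x = [xi + vi * dt_fs for xi, vi in zip(x, v)]          # drift
            fired = [rng.random() < transition_probability(rate_per_fs, dt_fs) for _ in x]
            return x, v, fired

        x, v, f = [0.0, 1.0], [0.0, 0.0], [0.1, -0.1]
        print(md_mc_step(x, v, f, mass=1.0, dt_fs=0.5, rate_per_fs=0.02))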

  12. Research Reports: 1988 NASA/ASEE Summer Faculty Fellowship Program

    NASA Technical Reports Server (NTRS)

    Freeman, L. Michael (Editor); Chappell, Charles R. (Editor); Cothran, Ernestine K. (Editor); Karr, Gerald R. (Editor)

    1988-01-01

    The basic objectives are to further the professional knowledge of qualified engineering and science faculty members; to stimulate an exchange of ideas between participants and NASA; to enrich and refresh the research and teaching activities of the participants' institutions; and to contribute to the research objectives of the NASA centers. Topics addressed include: cryogenics; thunderstorm simulation; computer techniques; computer-assisted instruction; system analysis; weather forecasting; rocket engine design; crystal growth; control systems design; turbine pumps for the Space Shuttle Main engine; electron mobility; heat transfer predictions; rotor dynamics; mathematical models; computational fluid dynamics; and structural analysis.

  13. Cellular automaton supercomputing

    NASA Technical Reports Server (NTRS)

    Wolfram, Stephen

    1987-01-01

    Many of the models now used in science and engineering are over a century old. And most of them can be implemented on modern digital computers only with considerable difficulty. Some new basic models are discussed which are much more directly suitable for digital computer simulation. The fundamental principle is that the models considered herein are as suitable as possible for implementation on digital computers. It is then a matter of scientific analysis to determine whether such models can reproduce the behavior seen in physical and other systems. Such analysis was carried out in several cases, and the results are very encouraging.
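
    A one-dimensional, two-state cellular automaton is about as directly implementable on a digital computer as a model can be. The minimal sketch below (rule 30, chosen here only as an illustration) shows a complete update rule in a few lines.

        def step(cells, rule=30):
            """One synchronous update of a 1-D, two-state, nearest-neighbour CA."""
            n = len(cells)
            return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
                    for i in range(n)]

        row = [0] * 31
        row[15] = 1                                   # single seed cell
        for _ in range(15):
            print("".join("#" if c else "." for c in row))
            row = step(row)

    The neighbourhood of each cell is read as a 3-bit index into the rule number, so the entire dynamics is exact integer arithmetic with no discretization error, which is what makes such models natively suited to digital simulation.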

  14. SciDAC GSEP: Gyrokinetic Simulation of Energetic Particle Turbulence and Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Zhihong

    Energetic particle (EP) confinement is a key physics issue for the burning plasma experiment ITER, the crucial next step in the quest for clean and abundant energy, since ignition relies on self-heating by energetic fusion products (α-particles). Due to the strong coupling of EP with burning thermal plasmas, the plasma confinement property in the ignition regime is one of the most uncertain factors when extrapolating from existing fusion devices to the ITER tokamak. EP populations in current tokamaks are mostly produced by auxiliary heating such as neutral beam injection (NBI) and radio frequency (RF) heating. Remarkable progress in developing comprehensive EP simulation codes and understanding basic EP physics has been made by two concurrent SciDAC EP projects, GSEP, funded by the Department of Energy (DOE) Office of Fusion Energy Science (OFES), which have successfully established gyrokinetic turbulence simulation as a necessary paradigm shift for studying the EP confinement in burning plasmas. Verification and validation have rapidly advanced through close collaborations between simulation, theory, and experiment. Furthermore, productive collaborations with computational scientists have enabled EP simulation codes to effectively utilize current petascale computers and emerging exascale computers. We review here key physics progress in the GSEP projects regarding verification and validation of gyrokinetic simulations, nonlinear EP physics, EP coupling with thermal plasmas, and reduced EP transport models. Advances in high performance computing through collaborations with computational scientists that enable these large scale electromagnetic simulations are also highlighted. These results have been widely disseminated in numerous peer-reviewed publications including many Phys. Rev. Lett. papers and many invited presentations at prominent fusion conferences such as the biennial International Atomic Energy Agency (IAEA) Fusion Energy Conference and the annual meeting of the American Physics Society, Division of Plasma Physics (APS-DPP).

  15. Simulating Earthquakes for Science and Society: New Earthquake Visualizations Ideal for Use in Science Communication

    NASA Astrophysics Data System (ADS)

    de Groot, R. M.; Benthien, M. L.

    2006-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. These types of visualizations are becoming pervasive in the teaching and learning of concepts related to earth science. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). Earthquakes are ideal candidates for visualization products: they cannot be predicted, are completed in a matter of seconds, occur deep in the earth, and the time between events can be on a geologic time scale. For example, the southern part of the San Andreas fault has not seen a major earthquake since about 1690, setting the stage for an earthquake as large as magnitude 7.7, the "big one." Since no one has experienced such an earthquake, visualizations can help people understand the scale of such an event. Accordingly, SCEC has developed a revolutionary simulation of this earthquake, with breathtaking visualizations that are now being distributed. According to Gordin and Pea (1995), visualization should, in theory, make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  16. Autonomous Driver Based on an Intelligent System of Decision-Making.

    PubMed

    Czubenko, Michał; Kowalczuk, Zdzisław; Ordys, Andrew

    The paper presents and discusses a system (xDriver) that uses an Intelligent System of Decision-making (ISD) for the task of car driving. The principal subject is the implementation, simulation and testing of the ISD system described earlier in our publications (Kowalczuk and Czubenko in Artificial Intelligence and Soft Computing, Lecture Notes in Computer Science/Lecture Notes in Artificial Intelligence, Springer, Berlin, 2010; Int J Appl Math Comput Sci 21(4):621-635, 2011; Pomiary Autom Robot 2(17):60-65, 2013) for the task of autonomous driving. The design of the whole ISD system is a result of a thorough modelling of human psychology based on an extensive literature study. Concepts somewhat similar to the ISD system can be found in the literature (Muhlestein in Cognit Comput 5(1):99-105, 2012; Wiggins in Cognit Comput 4(3):306-319, 2012), but there are no reports of a system which would model human psychology for the purpose of autonomously driving a car. The paper describes assumptions for simulation, the set of needs and reactions (characterizing the ISD system), the road model and the vehicle model, as well as presents some results of simulation. It shows that the xDriver system may behave on the road as a very inexperienced driver.

  17. Computer simulations for lab experiences in secondary physics

    NASA Astrophysics Data System (ADS)

    Murphy, David Shannon

    Physical science instruction often involves modeling natural systems, such as electricity, whose constituent particles are invisible to the unaided eye. The effect of these particles' motion is observable, but the particles themselves are not directly observable. Simulations have been developed in physics, chemistry, and biology that, under certain circumstances, have been found to allow students to gain insight into the operation of the systems they model. This study compared a DC circuit simulation, a modified simulation, static graphics, and traditional bulbs and wires with respect to gains in DC circuit knowledge as measured by the DIRECT instrument, a multiple-choice instrument previously developed to assess DC circuit knowledge. Gender, prior DC circuit knowledge, and subsets of DC circuit knowledge were also compared. The population (n=166) comprised high school freshman students from an eastern Kentucky public school with a population of 1100 students, and the study followed a quantitative quasi-experimental research design. Differences between treatment groups were not statistically significant. Keywords: Simulations, Static Images, Science Education, DC Circuit Instruction, PhET.

  18. Kennedy Space Center ITC-1 Internship Overview

    NASA Technical Reports Server (NTRS)

    Ni, Marcus

    2011-01-01

    As an intern for Priscilla Elfrey in the ITC-1 department, I was involved in many activities that helped me develop new skills. I supported four different projects during my internship: the Center for Life Cycle Design (CfLCD), the SISO Space Interoperability Smackdown, the RTI Teacher Mentor Program, and the Discrete Event Simulation Integrated Visualization Environment (DIVE) team. I provided the CfLCD with web-based research on cyber security initiatives involving simulation, education for young children, cloud computing, Otronicon, and Science, Technology, Engineering, and Mathematics (STEM) education initiatives. I also attended STEM meetings regarding simulation courses and educational course enhancements. To further improve the SISO simulation event, I provided observation feedback to the technical advisory board. I also helped to set up a chat federation for HLA. The third project involved the RTI Teacher Mentor program, which I helped to organize. Last but not least, I worked with the DIVE team to develop new software to help visualize discrete event simulations. All of these projects have provided experience on an interdisciplinary level, ranging from speech and communication to solving complex problems using math and science.

  19. Performance of hybrid programming models for multiscale cardiac simulations: preparing for petascale computation.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-10-01

    Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than were used in this study, we believe that faster-than-real-time multiscale cardiac simulations can be achieved on these systems.

  20. Parallel Tensor Compression for Large-Scale Scientific Data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolda, Tamara G.; Ballard, Grey; Austin, Woody Nathan

    As parallel computing trends towards the exascale, scientific data produced by high-fidelity simulations are growing increasingly massive. For instance, a simulation on a three-dimensional spatial grid with 512 points per dimension that tracks 64 variables per grid point for 128 time steps yields 8 TB of data. By viewing the data as a dense five-way tensor, we can compute a Tucker decomposition to find inherent low-dimensional multilinear structure, achieving compression ratios of up to 10000 on real-world data sets with negligible loss in accuracy. So that we can operate on such massive data, we present the first-ever distributed-memory parallel implementation for the Tucker decomposition, whose key computations correspond to parallel linear algebra operations, albeit with nonstandard data layouts. Our approach specifies a data distribution for tensors that avoids any tensor data redistribution, either locally or in parallel. We provide accompanying analysis of the computation and communication costs of the algorithms. To demonstrate the compression and accuracy of the method, we apply our approach to real-world data sets from combustion science simulations. We also provide detailed performance results, including parallel performance in both weak and strong scaling experiments.
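
    As a serial, single-node illustration of the underlying idea (not the paper's distributed-memory implementation), a truncated higher-order SVD computes one factor matrix per mode plus a small core tensor; the compression ratio is the ratio of original entries to stored entries. Tensor sizes and ranks below are toy values chosen for illustration.

        import numpy as np

        def unfold(T, mode):
            """Matricize tensor T along `mode` (mode-n unfolding)."""
            return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

        def hosvd(T, ranks):
            """Truncated higher-order SVD: per-mode factor matrices and a core tensor."""
            factors = []
            for mode, r in enumerate(ranks):
                U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
                factors.append(U[:, :r])
            core = T
            for mode, U in enumerate(factors):
                core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
            return core, factors

        T = np.random.rand(32, 32, 32)          # toy stand-in for a simulation field
        core, factors = hosvd(T, ranks=(8, 8, 8))
        stored = core.size + sum(U.size for U in factors)
        print("compression ratio:", T.size / stored)

    With these toy sizes the ratio is roughly 26; the far larger ratios reported above come from much bigger tensors whose multilinear structure permits aggressive rank truncation.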
