Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
Programmers, professors, and parasites: credit and co-authorship in computer science.
Solomon, Justin
2009-12-01
This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.
ERIC Educational Resources Information Center
Qian, Yizhou; Hambrusch, Susanne; Yadav, Aman; Gretter, Sarah
2018-01-01
The new Advanced Placement (AP) Computer Science (CS) Principles course increases the need for quality CS teachers and thus the need for professional development (PD). This article presents the results of a 2-year study investigating how teachers teaching the AP CS Principles course for the first time used online PD material. Our results showed…
Semiotics, Information Science, Documents and Computers.
ERIC Educational Resources Information Center
Warner, Julian
1990-01-01
Discusses the relationship and value of semiotics to the established domains of information science. Highlights include documentation; computer operations; the language of computing; automata theory; linguistics; speech and writing; and the written language as a unifying principle for the document and the computer. (93 references) (LRW)
The Quantitative Analysis of User Behavior Online - Data, Models and Algorithms
NASA Astrophysics Data System (ADS)
Raghavan, Prabhakar
By blending principles from mechanism design, algorithms, machine learning and massive distributed computing, the search industry has become good at optimizing monetization on sound scientific principles. This represents a successful and growing partnership between computer science and microeconomics. When it comes to understanding how online users respond to the content and experiences presented to them, we have more of a lacuna in the collaboration between computer science and certain social sciences. We will use a concrete technical example from image search results presentation, developing in the process some algorithmic and machine learning problems of interest in their own right. We then use this example to motivate the kinds of studies that need to grow between computer science and the social sciences; a critical element of this is the need to blend large-scale data analysis with smaller-scale eye-tracking and "individualized" lab studies.
Computing Your Way through Science.
ERIC Educational Resources Information Center
Allen, Denise
1994-01-01
Reviews three computer software programs focusing on teaching science to middle school students: (1) Encarta, a multimedia encyclopedia; (2) Gizmos and Gadgets, which allows students to explore physical science principles; and (3) BodyScope, which allows students to examine the systems of the human body. (BB)
Is Computer Science Compatible with Technological Literacy?
ERIC Educational Resources Information Center
Buckler, Chris; Koperski, Kevin; Loveland, Thomas R.
2018-01-01
Although technology education has evolved over time, and pressure has increased to infuse more engineering principles and increase links to STEM (science, technology, engineering, and mathematics) initiatives, there has never been an official alignment between technology and engineering education and computer science. There is movement at the federal level…
ERIC Educational Resources Information Center
Wielard, Valerie Michelle
2013-01-01
The primary objective of this project was to learn what effect a computer program would have on academic achievement and attitude toward science of college students enrolled in a biology class for non-science majors. It became apparent that the instructor also had an effect on attitudes toward science. The researcher designed a computer program,…
Graphical User Interface Programming in Introductory Computer Science.
ERIC Educational Resources Information Center
Skolnick, Michael M.; Spooner, David L.
Modern computing systems exploit graphical user interfaces for interaction with users; as a result, introductory computer science courses must begin to teach the principles underlying such interfaces. This paper presents an approach to graphical user interface (GUI) implementation that is simple enough for beginning students to understand, yet…
NASA Astrophysics Data System (ADS)
Eisenbach, Markus
The Locally Self-consistent Multiple Scattering (LSMS) code solves the first-principles Density Functional Theory Kohn-Sham equation for a wide range of materials, with a special focus on metals, alloys, and metallic nanostructures. It has traditionally exhibited near-perfect scalability on massively parallel high-performance computer architectures. We present our efforts to exploit GPUs to accelerate the LSMS code to enable first-principles calculations of O(100,000) atoms and statistical physics sampling of finite-temperature properties. Using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility, we achieve a sustained performance of 14.5 PFlop/s and a speedup of 8.6 compared to the CPU-only code. This work has been sponsored by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Material Sciences and Engineering Division and by the Office of Advanced Scientific Computing. This work used resources of the Oak Ridge Leadership Computing Facility, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.
NASA Astrophysics Data System (ADS)
Puligheddu, Marcello; Gygi, Francois; Galli, Giulia
The prediction of the thermal properties of solids and liquids is central to numerous problems in condensed matter physics and materials science, including the study of thermal management of opto-electronic and energy conversion devices. We present a method to compute the thermal conductivity of solids by performing ab initio molecular dynamics at non equilibrium conditions. Our formulation is based on a generalization of the approach to equilibrium technique, using sinusoidal temperature gradients, and it only requires calculations of first principles trajectories and atomic forces. We discuss results and computational requirements for a representative, simple oxide, MgO, and compare with experiments and data obtained with classical potentials. This work was supported by MICCoM as part of the Computational Materials Science Program funded by the U.S. Department of Energy (DOE), Office of Science , Basic Energy Sciences (BES), Materials Sciences and Engineering Division under Grant DOE/BES 5J-30.
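The approach-to-equilibrium idea behind the authors' method can be illustrated classically: impose a sinusoidal temperature perturbation, let the heat equation relax it, and recover the diffusivity from the exponential decay exp(-alpha * k^2 * t) of the sinusoidal mode. A minimal finite-difference sketch with dimensionless toy parameters (an illustration of the technique, not the paper's ab initio setup):

```python
import math

# Toy parameters (dimensionless): domain length, grid, true diffusivity.
L, nx = 1.0, 64
alpha_true = 0.01
dx = L / nx
dt = 0.2 * dx**2 / alpha_true      # well inside the explicit stability limit
k = 2 * math.pi / L                # wavenumber of the imposed perturbation

x = [i * dx for i in range(nx)]
T = [300.0 + math.sin(k * xi) for xi in x]   # background + sinusoidal bump

def mode_amplitude(T):
    # Project the temperature field onto the sin(kx) mode.
    mean = sum(T) / nx
    return 2.0 / nx * sum((Ti - mean) * math.sin(k * xi)
                          for Ti, xi in zip(T, x))

a0 = mode_amplitude(T)
t, t_end = 0.0, 2.0
while t < t_end:
    # One explicit finite-difference step of dT/dt = alpha * d2T/dx2
    # with periodic boundaries.
    lap = [(T[i - 1] - 2 * T[i] + T[(i + 1) % nx]) / dx**2
           for i in range(nx)]
    T = [Ti + alpha_true * dt * li for Ti, li in zip(T, lap)]
    t += dt

# The sinusoidal mode decays as exp(-alpha * k**2 * t); invert the
# measured decay to estimate the diffusivity from the trajectory alone.
alpha_est = -math.log(mode_amplitude(T) / a0) / (k**2 * t)
```

The estimate lands within a fraction of a percent of the input diffusivity; in the authors' version the temperature field evolves under first-principles molecular dynamics rather than a model equation.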
Bernoulli's Principle: Science as a Human Endeavor
ERIC Educational Resources Information Center
McCarthy, Deborah
2008-01-01
What do the ideas of Daniel Bernoulli--an 18th-century Swiss mathematician, physicist, natural scientist, and professor--and your students' next landing of the space shuttle via computer simulation have in common? Because of his contribution, referred to in physical science as Bernoulli's principle, modern flight is possible. The mini learning-cycle…
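The principle the article builds on fits in one line of arithmetic: along a horizontal streamline, p + 0.5*rho*v^2 is constant, so faster flow means lower pressure. A classroom-style sketch with illustrative numbers (not taken from the article):

```python
RHO_AIR = 1.225  # kg/m^3, standard sea-level air density

def bernoulli_pressure_drop(v_slow, v_fast, rho=RHO_AIR):
    """Pressure difference between two points at the same height on a
    streamline, from Bernoulli: p + 0.5*rho*v**2 = constant."""
    return 0.5 * rho * (v_fast**2 - v_slow**2)

# Air at 70 m/s over a wing section vs 60 m/s under it:
dp = bernoulli_pressure_drop(60.0, 70.0)  # pressure ~796 Pa lower on top
```

That pressure deficit over the wing's upper surface, integrated over its area, is the lift that makes the simulated landing possible.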
Computational techniques in tribology and material science at the atomic level
NASA Technical Reports Server (NTRS)
Ferrante, J.; Bozzolo, G. H.
1992-01-01
Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical, and their limitations, are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, applications of the methods to the calculation of adhesion and friction are presented.
Integrating Data Base into the Elementary School Science Program.
ERIC Educational Resources Information Center
Schlenker, Richard M.
This document describes seven science activities that combine scientific principles and computers. The objectives for the activities are to show students how the computer can be used as a tool to store and arrange scientific data, provide students with experience using the computer as a tool to manage scientific data, and provide students with…
ERIC Educational Resources Information Center
Good, Jonathon; Keenan, Sarah; Mishra, Punya
2016-01-01
The popular press is rife with examples of how students in the United States and around the globe are learning to program, make, and tinker. The Hour of Code, maker-education, and similar efforts are advocating that more students be exposed to principles found within computer science. We propose an expansion beyond simply teaching computational…
ERIC Educational Resources Information Center
Orey, Michael A.; Nelson, Wayne A.
Arguing that the evolution of intelligent tutoring systems better reflects the recent theoretical developments of cognitive science than traditional computer-based instruction (CBI), this paper describes a general model for an intelligent tutoring system and suggests ways to improve CBI using design principles derived from research in cognitive…
Imprinting Community College Computer Science Education with Software Engineering Principles
ERIC Educational Resources Information Center
Hundley, Jacqueline Holliday
2012-01-01
Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and…
The journey from forensic to predictive materials science using density functional theory
Schultz, Peter A.
2017-09-12
Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.
NASA Technical Reports Server (NTRS)
Smith, Paul H.
1988-01-01
The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.
ERIC Educational Resources Information Center
Prosise, Jeff
This document presents the principles behind modern computer graphics without straying into the arcane languages of mathematics and computer science. Illustrations accompany the clear, step-by-step explanations that describe how computers draw pictures. The 22 chapters of the book are organized into 5 sections. "Part 1: Computer Graphics in…
Design Principles for "Thriving in Our Digital World": A High School Computer Science Course
ERIC Educational Resources Information Center
Veletsianos, George; Beth, Bradley; Lin, Calvin; Russell, Gregory
2016-01-01
"Thriving in Our Digital World" is a technology-enhanced dual enrollment course introducing high school students to computer science through project- and problem-based learning. This article describes the evolution of the course and five lessons learned during the design, development, implementation, and iteration of the course from its…
Principles versus Artifacts in Computer Science Curriculum Design
ERIC Educational Resources Information Center
Machanick, Philip
2003-01-01
Computer Science is a subject which has difficulty in marketing itself. Further, pinning down a standard curriculum is difficult--there are many preferences which are hard to accommodate. This paper argues the case that part of the problem is the fact that, unlike more established disciplines, the subject does not clearly distinguish the study of…
The Difficult Bridge between University and Industry: A Case Study in Computer Science Teaching
ERIC Educational Resources Information Center
Schilling, Jan; Klamma, Ralf
2010-01-01
Recently, there has been increasing criticism concerning academic computer science education. This paper presents a new approach based on the principles of constructivist learning design as well as the ideas of knowledge transfer in communities of practice. The course "High-tech Entrepreneurship and New Media" was introduced as an…
The role of physicality in rich programming environments
NASA Astrophysics Data System (ADS)
Liu, Allison S.; Schunn, Christian D.; Flot, Jesse; Shoop, Robin
2013-12-01
Computer science proficiency continues to grow in importance, while the number of students entering computer science-related fields declines. Many rich programming environments have been created to motivate student interest and expertise in computer science. In the current study, we investigated whether a recently created environment, Robot Virtual Worlds (RVWs), can be used to teach computer science principles within a robotics context by examining its use in high-school classrooms. We also investigated whether the lack of physicality in these environments impacts student learning by comparing classrooms that used either virtual or physical robots for the RVW curriculum. Results suggest that the RVW environment leads to significant gains in computer science knowledge, that virtual robots lead to faster learning, and that physical robots may have some influence on algorithmic thinking. We discuss the implications of physicality in these programming environments for learning computer science.
SEED: A Suite of Instructional Laboratories for Computer Security Education
ERIC Educational Resources Information Center
Du, Wenliang; Wang, Ronghua
2008-01-01
The security and assurance of our computing infrastructure has become a national priority. To address this priority, higher education has gradually incorporated the principles of computer and information security into the mainstream undergraduate and graduate computer science curricula. To achieve effective education, learning security principles…
Evaluation of the Effectiveness of a Web-Based Learning Design for Adult Computer Science Courses
ERIC Educational Resources Information Center
Antonis, Konstantinos; Daradoumis, Thanasis; Papadakis, Spyros; Simos, Christos
2011-01-01
This paper reports on work undertaken within a pilot study concerned with the design, development, and evaluation of online computer science training courses. Drawing on recent developments in e-learning technology, these courses were structured around the principles of a learner-oriented approach for use with adult learners. The paper describes a…
Astrobiology for the 21st Century
NASA Astrophysics Data System (ADS)
Oliveira, C.
2008-02-01
We live in a scientific world. Science is all around us. We take scientific principles for granted every time we use a piece of technological apparatus, such as a car, a computer, or a cellphone. In today's world, citizens frequently have to make decisions that require them to have some basic scientific knowledge. To be a contributing citizen in a modern democracy, a person needs to understand the general principles of science.
ERIC Educational Resources Information Center
Jackett, Dwane
1990-01-01
Described is a science activity which illustrates the principle of uncertainty using a computer simulation of bacterial reproduction. Procedures and results are discussed. Several illustrations of results are provided. The availability of a computer program is noted. (CW)
Computer Animations a Science Teaching Aid: Contemplating an Effective Methodology
ERIC Educational Resources Information Center
Tannu, Kirti
2008-01-01
To improve the quality of science education, the author suggests the use of entertaining and exciting animation techniques for better understanding of scientific principles. The latest technologies are being used with more vigour to spread venomous superstitions; a better understanding of science may help students better their scientific temper. Keeping…
NASA Technical Reports Server (NTRS)
Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.
1987-01-01
This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
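The paper's point about data structures for efficiently retrieving spatially-indexed data can be made concrete with the simplest such structure, a uniform grid index that buckets points by cell and prunes range queries to the overlapping cells. This generic sketch is illustrative, not the GIS design the authors propose:

```python
from collections import defaultdict

class GridIndex:
    """A minimal uniform-grid spatial index for labeled point data."""

    def __init__(self, cell_size):
        self.cell = cell_size
        self.buckets = defaultdict(list)

    def _key(self, x, y):
        # Map a coordinate to its integer grid-cell coordinates.
        return (int(x // self.cell), int(y // self.cell))

    def insert(self, x, y, payload):
        self.buckets[self._key(x, y)].append((x, y, payload))

    def query(self, xmin, ymin, xmax, ymax):
        # Visit only the cells overlapping the query window, then
        # filter the candidates by the exact rectangle test.
        out = []
        i0, j0 = self._key(xmin, ymin)
        i1, j1 = self._key(xmax, ymax)
        for i in range(i0, i1 + 1):
            for j in range(j0, j1 + 1):
                for (x, y, p) in self.buckets.get((i, j), []):
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        out.append(p)
        return out

# Demo: index a few labeled points, then run a window query.
idx = GridIndex(cell_size=10.0)
for x, y, name in [(3, 4, "a"), (15, 2, "b"), (40, 40, "c")]:
    idx.insert(x, y, name)
hits = idx.query(0, 0, 20, 10)  # finds "a" and "b" without scanning "c"
```

Real GIS engines use adaptive structures (quadtrees, R-trees) for skewed data, but the pruning idea — spatial locality in the index mirrors spatial locality in the queries — is the same.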
Computer Science in K-12 School Curricula of the 2lst Century: Why, What and When?
ERIC Educational Resources Information Center
Webb, Mary; Davis, Niki; Bell, Tim; Katz, Yaacov J.; Reynolds, Nicholas; Chambers, Dianne P.; Syslo, Maciej M.
2017-01-01
In this paper we have examined the position and roles of Computer Science in curricula in the light of recent calls for curriculum change and we have proposed principles and issues to consider in curriculum design as well as identifying priority areas for further research. The paper is based on discussions within and beyond the International…
ERIC Educational Resources Information Center
Lévano, Marcos; Albornoz, Andrea
2016-01-01
This paper proposes a framework to improve the quality of teaching and learning in order to develop good practices for training professionals in the computer engineering science career. To demonstrate the progress and achievements, our work is based on two principles for the formation of professionals, one based on the model of learning…
ERIC Educational Resources Information Center
School Science Review, 1985
1985-01-01
Presents 23 experiments, demonstrations, activities, and computer programs in biology, chemistry, and physics. Topics include lead in petrol, production of organic chemicals, reduction of water, enthalpy, X-ray diffraction model, nuclear magnetic resonance spectroscopy, computer simulation for additive mixing of colors, Archimedes Principle, and…
ERIC Educational Resources Information Center
National Center for Education Statistics, 2012
2012-01-01
Science education is not just about learning facts in a classroom--it's about doing activities where students put their understanding of science principles into action. That's why two unique types of activity-based tasks were administered as part of the 2009 National Assessment of Educational Progress (NAEP) science assessment. In addition to the…
Retrospective Evaluation of a Collaborative mLearning Science Module: The Users' Perspective
ERIC Educational Resources Information Center
DeWitt, Dorothy; Siraj, Saedah; Alias, Norlidah; Leng, Chin Hai
2013-01-01
This study focuses on the retrospective evaluation of a collaborative mLearning (CmL) Science module for teaching secondary school science, which was designed based on social constructivist learning theories and Merrill's First Principles of Instruction. This study is part of a developmental research effort in which computer-mediated communication (CMC)…
This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...
First Principles Approach to the Magneto Caloric Effect: Application to Ni2MnGa
NASA Astrophysics Data System (ADS)
Odbadrakh, Khorgolkhuu; Nicholson, Don; Rusanu, Aurelian; Eisenbach, Markus; Brown, Gregory; Evans, Boyd, III
2011-03-01
The magneto-caloric effect (MCE) has potential application in heating and cooling technologies. In this work, we present the calculated magnetic structure of a candidate MCE material, Ni2MnGa. The magnetic configurations of a 144-atom supercell are first explored using first-principles calculations; the results are then used to fit exchange parameters of a Heisenberg Hamiltonian. The Wang-Landau method is used to calculate the magnetic density of states of the Heisenberg Hamiltonian. Based on this classical estimate, the magnetic density of states is calculated using the Wang-Landau method with energies obtained from the first-principles method. The Curie temperature and other thermodynamic properties are calculated using the density of states. The relationships between the density of magnetic states and the field-induced adiabatic temperature change and isothermal entropy change are discussed. This work was sponsored by the Laboratory Directed Research and Development Program (ORNL), by the Mathematical, Information, and Computational Sciences Division, Office of Advanced Scientific Computing Research (US DOE), and by the Materials Sciences and Engineering Division, Office of Basic Energy Sciences (US DOE).
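The Wang-Landau flat-histogram method named in this abstract can be sketched on a much smaller toy problem. The following illustrative Python sketch estimates the density of states g(E) of an 8-spin periodic 1D Ising chain — not the Heisenberg Hamiltonian or 144-atom supercell of the paper — and every parameter (system size, flatness threshold, final modification factor) is an arbitrary demonstration choice:

```python
import math
import random

random.seed(7)
N = 8  # spins in a periodic 1D Ising chain, coupling J = 1

def energy(spins):
    # E = -sum_i s_i * s_{i+1}, periodic boundaries.
    return -sum(spins[i] * spins[(i + 1) % N] for i in range(N))

spins = [random.choice((-1, 1)) for _ in range(N)]
E = energy(spins)
ln_g = {}    # running estimate of ln g(E)
hist = {}    # visit histogram for the flatness check
ln_f = 1.0   # modification factor, halved until it is tiny

while ln_f > 1e-7:
    for _ in range(10000):
        i = random.randrange(N)
        # Flipping spin i touches exactly two bonds.
        dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % N])
        E_new = E + dE
        # Accept with probability min(1, g(E)/g(E_new)): a random walk
        # that becomes asymptotically flat in energy.
        if math.log(random.random()) < ln_g.get(E, 0.0) - ln_g.get(E_new, 0.0):
            spins[i], E = -spins[i], E_new
        ln_g[E] = ln_g.get(E, 0.0) + ln_f
        hist[E] = hist.get(E, 0) + 1
    # Flatness check: every visited energy near the mean visit count.
    if min(hist.values()) > 0.8 * (sum(hist.values()) / len(hist)):
        ln_f /= 2.0
        hist = {}

# Normalize so the doubly degenerate ground state has g(-N) = 2.
g_rel = {e: 2.0 * math.exp(v - ln_g[-N]) for e, v in ln_g.items()}
```

For this chain the exact degeneracies are 2, 56, 140, 56, 2 at E = -8, -4, 0, 4, 8, so the estimate can be checked directly; in the paper's setting the same random walk runs over energies supplied by first-principles calculations instead.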
How Science Students Can Learn about Unobservable Phenomena Using Computer-Based Analogies
ERIC Educational Resources Information Center
Trey, L.; Khan, S.
2008-01-01
A novel instructional computer simulation that incorporates a dynamic analogy to represent Le Chatelier's Principle was designed to investigate the contribution of this feature to students' understanding. Two groups of 12th grade Chemistry students (n=15) interacted with the computer simulation during the study. Both groups did the same…
ERIC Educational Resources Information Center
da Silveira, Pedro Rodrigo Castro
2014-01-01
This thesis describes the development and deployment of a cyberinfrastructure for distributed high-throughput computations of materials properties at high pressures and/or temperatures--the Virtual Laboratory for Earth and Planetary Materials--VLab. VLab was developed to leverage the aggregated computational power of grid systems to solve…
2015 Army Science Planning and Strategy Meeting Series: Outcomes and Conclusions
2017-12-21
…modeling and nanoscale characterization tools to enable efficient design of hybridized manufacturing; real-time, multiscale computational capability...to enable predictive analytics for expeditionary on-demand manufacturing • Discovery of design principles to enable programming advanced genetic...goals, significant research is needed to mature the fundamental materials science, processing and manufacturing sciences, design methodologies, data…
An Educational Development Tool Based on Principles of Formal Ontology
ERIC Educational Resources Information Center
Guzzi, Rodolfo; Scarpanti, Stefano; Ballista, Giovanni; Di Nicolantonio, Walter
2005-01-01
Computer science provides with virtual laboratories, places where one can merge real experiments with the formalism of algorithms and mathematics and where, with the advent of multimedia, sounds and movies can also be added. In this paper we present a method, based on principles of formal ontology, allowing one to develop interactive educational…
Cognitive Computational Neuroscience: A New Conference for an Emerging Discipline.
Naselaris, Thomas; Bassett, Danielle S; Fletcher, Alyson K; Kording, Konrad; Kriegeskorte, Nikolaus; Nienborg, Hendrikje; Poldrack, Russell A; Shohamy, Daphna; Kay, Kendrick
2018-05-01
Understanding the computational principles that underlie complex behavior is a central goal in cognitive science, artificial intelligence, and neuroscience. In an attempt to unify these disconnected communities, we created a new conference called Cognitive Computational Neuroscience (CCN). The inaugural meeting revealed considerable enthusiasm but significant obstacles remain. Copyright © 2018 Elsevier Ltd. All rights reserved.
Computer Science Research Funding: How Much Is Too Little?
2009-06-01
Bioinformatics; parallel computing; computational biology; principles of programming; computational neuroscience; real-time and embedded systems; scientific...National Security Agency (NSA) • Missile Defense Agency (MDA) and others. The various research programs have been coordinated through the DDR&E...DOD funding included only DARPA and OSD programs. FY07 and FY08 PBR funding included DARPA, NSA, some of the Services' basic and applied research…
The simplicity principle in perception and cognition
Feldman, Jacob
2016-01-01
The simplicity principle, traditionally referred to as Occam’s razor, is the idea that simpler explanations of observations should be preferred to more complex ones. In recent decades the principle has been clarified via the incorporation of modern notions of computation and probability, allowing a more precise understanding of how exactly complexity minimization facilitates inference. The simplicity principle has found many applications in modern cognitive science, in contexts as diverse as perception, categorization, reasoning, and neuroscience. In all these areas, the common idea is that the mind seeks the simplest available interpretation of observations— or, more precisely, that it balances a bias towards simplicity with a somewhat opposed constraint to choose models consistent with perceptual or cognitive observations. This brief tutorial surveys some of the uses of the simplicity principle across cognitive science, emphasizing how complexity minimization in a number of forms has been incorporated into probabilistic models of inference. PMID:27470193
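The complexity-minimization idea the tutorial surveys can be made concrete with a two-part code: describe a model first, then the data given the model, and prefer whichever total description is shortest. A small deterministic sketch (the encoding scheme is an illustrative choice, not from the paper):

```python
import math

def literal_codelength(s):
    # Baseline: one bit per symbol, no model at all.
    return len(s)

def two_part_codelength(s):
    # Part 1 (model): the number of ones, an integer in 0..n,
    # costs log2(n + 1) bits.
    # Part 2 (data given model): which of the C(n, k) strings with
    # exactly k ones occurred, costs log2(C(n, k)) bits.
    n, k = len(s), s.count("1")
    return math.log2(n + 1) + math.log2(math.comb(n, k))

s = "0" * 95 + "1" * 5          # highly regular: only five ones
lit = literal_codelength(s)     # 100 bits
mdl = two_part_codelength(s)    # ~33 bits: the simple model wins
```

For a maximally irregular string (about half ones), the model buys nothing and the two-part code exceeds the literal 100 bits, so the literal description wins; simplicity is preferred only when the data actually have exploitable structure.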
A CS1 Pedagogical Approach to Parallel Thinking
ERIC Educational Resources Information Center
Rague, Brian William
2010-01-01
Almost all collegiate programs in Computer Science offer an introductory course in programming primarily devoted to communicating the foundational principles of software design and development. The ACM designates this introduction to computer programming course for first-year students as CS1, during which methodologies for solving problems within…
Making Construals as a New Digital Skill for Learning
ERIC Educational Resources Information Center
Beynon, Meurig; Boyatt, Russell; Foss, Jonathan; Hall, Chris; Hudnott, Elizabeth; Russ, Steve; Sutinen, Erkki; Macleod, Hamish; Kommers, Piet
2015-01-01
Making construals is a practical approach to computing that was originally developed for and by computer science undergraduates. It is the central theme of an EU project aimed at disseminating the relevant principles to a broader audience. This involves bringing together technical experts in making construals and international experts in…
Mass storage system experiences and future needs at the National Center for Atmospheric Research
NASA Technical Reports Server (NTRS)
Olear, Bernard T.
1991-01-01
A summary and viewgraphs of a discussion presented at the National Space Science Data Center (NSSDC) Mass Storage Workshop are included. Some of the experiences of the Scientific Computing Division at the National Center for Atmospheric Research (NCAR) in dealing with the 'data problem' are discussed. A brief history and a development of some basic mass storage system (MSS) principles are given. An attempt is made to show how these principles apply to the integration of various components into NCAR's MSS. MSS needs for future computing environments are discussed.
Bioinspired principles for large-scale networked sensor systems: an overview.
Jacobsen, Rune Hylsberg; Zhang, Qi; Toftegaard, Thomas Skjødeberg
2011-01-01
Biology has often been used as a source of inspiration in computer science and engineering. Bioinspired principles have found their way into network node design and research due to the appealing analogies between biological systems and large networks of small sensors. This paper provides an overview of bioinspired principles and methods such as swarm intelligence, natural time synchronization, artificial immune system and intercellular information exchange applicable for sensor network design. Bioinspired principles and methods are discussed in the context of routing, clustering, time synchronization, optimal node deployment, localization and security and privacy.
2000 FIRST Robotics Competition
NASA Technical Reports Server (NTRS)
Purman, Richard
2000-01-01
The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 2000 FIRST Robotics competition. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2000-02-01
DOE support for a broad research program in the sciences of complexity permitted the Santa Fe Institute to initiate new collaborative research within its integrative core activities as well as to host visitors to participate in research on specific topics that serve as motivation and testing ground for the study of the general principles of complex systems. Results are presented on computational biology, biodiversity and ecosystem research, and advanced computing and simulation.
The Phenomenal World of Physics. The Science Club. Ages 10-14. [CD-ROM].
ERIC Educational Resources Information Center
1999
This CD-ROM allows students to learn about physics principles and the scientists who discovered them through genius or luck. The simplicity of these physical laws and how the discovery of these laws has improved the daily lives of humans is discussed. The computer program explores the physics behind the earth's rotation, Archimedes' Principles,…
Synthetic Biology: Knowledge Accessed by Everyone (Open Sources)
ERIC Educational Resources Information Center
Sánchez Reyes, Patricia Margarita
2016-01-01
Using the principles of biology, along with engineering and the help of computers, scientists manage to copy DNA sequences from nature and use them to create new organisms. Through engineering and computer science, DNA is created, making it possible to create life inside a laboratory. We cannot dismiss the role that synthetic biology could play in…
Principles for Integrating Mars Analog Science, Operations, and Technology Research
NASA Technical Reports Server (NTRS)
Clancey, William J.
2003-01-01
During the Apollo program, the scientific community and NASA used terrestrial analog sites for understanding planetary features and for training astronauts to be scientists. Human factors studies (Harrison, Clearwater, & McKay 1991; Stuster 1996) have focused on the effects of isolation in extreme environments. More recently, with the advent of wireless computing, we have prototyped advanced EVA technologies for navigation, scheduling, and science data logging (Clancey 2002b; Clancey et al., in press). Combining these interests in a single expedition enables tremendous synergy and authenticity, as pioneered by Pascal Lee's Haughton-Mars Project (Lee 2001; Clancey 2000a) and the Mars Society's research stations on a crater rim on Devon Island in the High Canadian Arctic (Clancey 2000b; 2001b) and the Morrison Formation of southeast Utah (Clancey 2002a). Based on this experience, the following principles are proposed for conducting an integrated science, operations, and technology research program at analog sites: 1) Authentic work; 2) PI-based projects; 3) Unencumbered baseline studies; 4) Closed simulations; and 5) Observation and documentation. Following these principles, we have been integrating field science, operations research, and technology development at analog sites on Devon Island and in Utah over the past five years. Analytic methods include work practice simulation (Clancey 2002c; Sierhuis et al., 2000a; b), by which the interaction of human behavior, facilities, geography, tools, and procedures is formalized in computer models. These models are then converted into the runtime EVA system we call mobile agents (Clancey 2002b; Clancey et al., in press). Furthermore, we have found that the Apollo Lunar Surface Journal (Jones, 1999) provides a vast repository for understanding astronaut and CapCom interactions, serving as a baseline for Mars operations and quickly highlighting opportunities for computer automation (Clancey, in press).
Bioinspired Principles for Large-Scale Networked Sensor Systems: An Overview
Jacobsen, Rune Hylsberg; Zhang, Qi; Toftegaard, Thomas Skjødeberg
2011-01-01
Biology has often been used as a source of inspiration in computer science and engineering. Bioinspired principles have found their way into network node design and research due to the appealing analogies between biological systems and large networks of small sensors. This paper provides an overview of bioinspired principles and methods such as swarm intelligence, natural time synchronization, artificial immune system and intercellular information exchange applicable for sensor network design. Bioinspired principles and methods are discussed in the context of routing, clustering, time synchronization, optimal node deployment, localization and security and privacy. PMID:22163841
Computational materials science: Think locally, act globally
NASA Astrophysics Data System (ADS)
Rabe, Karin M.
2002-11-01
New first-principles calculations reveal the range of atomic arrangements underlying the average crystallographic structure of a perovskite oxide, PZT. This work opens the door to understanding the exceptional physical behaviour of PZT and related systems.
2015-04-27
MODELING OF C-S-H Material chemistry level modeling following the principles and techniques commonly grouped under Computational Material Science is...Henmi, C. and Kusachi, I. Monoclinic tobermorite from Fuka, Bitchu-cho, Okayama Prefecture, Japan. J. Min. Petr. Econ. Geol. (1989) 84:374-379. [22...31] Liu, Y. et al. First principles study of the stability and mechanical properties of MC (M=Ti, V, Zr, Nb, Hf and Ta) compounds. Journal of Alloys and Compounds. (2014) 582:500-504. 10
Power Monitoring Using the Raspberry Pi
ERIC Educational Resources Information Center
Snyder, Robin M.
2014-01-01
The Raspberry Pi is a credit-card-sized, low-powered computer board with an Ethernet connection, HDMI video output, audio, a full Linux operating system run from an SD card, and more, all for $45. With cables, SD card, etc., the cost is about $70. Originally designed to help teach computer science principles to low-income children and students, the Pi has…
ERIC Educational Resources Information Center
Her Many Horses, Ian
2016-01-01
The world, and especially our own country, is in dire need of a larger and more diverse population of computer scientists. While many organizations have approached this problem of too few computer scientists in various ways, a promising, and I believe necessary, path is to expose elementary students to authentic practices of the discipline.…
New Horizons Regional Education Center 1999 FIRST Robotics Competition
NASA Technical Reports Server (NTRS)
Purman, Richard I.
1999-01-01
The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 1999 FIRST Robotics competition. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.
New Horizons Regional Education Center 2001 FIRST Robotics Competition
NASA Technical Reports Server (NTRS)
2001-01-01
The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 2001 FIRST Robotics competition. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.
FIRST 2002, 2003, 2004 Robotics Competition(s)
NASA Technical Reports Server (NTRS)
Purman, Richard
2004-01-01
The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 2002, 2003, and 2004 FIRST Robotics Competitions. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.
First-Principles Study of Superconductivity in Ultra- thin Pb Films
NASA Astrophysics Data System (ADS)
Noffsinger, Jesse; Cohen, Marvin L.
2010-03-01
Recently, superconductivity in ultrathin layered Pb has been confirmed in samples with as few as two atomic layers [S. Qin, J. Kim, Q. Niu, and C.-K. Shih, Science 2009]. Interestingly, the prototypical strong-coupling superconductor exhibits different Tc's for differing surface reconstructions in samples with only two monolayers. Additionally, Tc is seen to oscillate as the number of atomic layers is increased. Using first principles techniques based on Wannier functions, we analyze the electronic structure, lattice dynamics and electron-phonon coupling for varying thicknesses and surface reconstructions of layered Pb. We discuss results as they relate to superconductivity in the bulk, for which accurate calculations of superconducting properties can be compared to experiment [W. L. McMillan and J.M. Rowell, PRL 1965]. This work was supported by National Science Foundation Grant No. DMR07-05941, the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. Computational resources have been provided by the Lawrencium computational cluster resource provided by the IT Division at the Lawrence Berkeley National Laboratory (Supported by the Director, Office of Science, Office of Basic Energy Sciences, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231)
Human-computer interaction: psychological aspects of the human use of computing.
Olson, Gary M; Olson, Judith S
2003-01-01
Human-computer interaction (HCI) is a multidisciplinary field in which psychology and other social sciences unite with computer science and related technical fields with the goal of making computing systems that are both useful and usable. It is a blend of applied and basic research, both drawing from psychological research and contributing new ideas to it. New technologies continuously challenge HCI researchers with new options, as do the demands of new audiences and uses. A variety of usability methods have been developed that draw upon psychological principles. HCI research has expanded beyond its roots in the cognitive processes of individual users to include social and organizational processes involved in computer usage in real environments as well as the use of computers in collaboration. HCI researchers need to be mindful of the longer-term changes brought about by the use of computing in a variety of venues.
Spirov, Alexander; Holloway, David
2013-07-15
This paper surveys modeling approaches for studying the evolution of gene regulatory networks (GRNs). Modeling of the design or 'wiring' of GRNs has become increasingly common in developmental and medical biology, as a means of quantifying gene-gene interactions, the response to perturbations, and the overall dynamic motifs of networks. Drawing from developments in GRN 'design' modeling, a number of groups are now using simulations to study how GRNs evolve, both for comparative genomics and to uncover general principles of evolutionary processes. Such work can generally be termed evolution in silico. Complementary to these biologically-focused approaches, a now well-established field of computer science is Evolutionary Computations (ECs), in which highly efficient optimization techniques are inspired from evolutionary principles. In surveying biological simulation approaches, we discuss the considerations that must be taken with respect to: (a) the precision and completeness of the data (e.g. are the simulations for very close matches to anatomical data, or are they for more general exploration of evolutionary principles); (b) the level of detail to model (we proceed from 'coarse-grained' evolution of simple gene-gene interactions to 'fine-grained' evolution at the DNA sequence level); (c) to what degree is it important to include the genome's cellular context; and (d) the efficiency of computation. With respect to the latter, we argue that developments in computer science EC offer the means to perform more complete simulation searches, and will lead to more comprehensive biological predictions. Copyright © 2013 Elsevier Inc. All rights reserved.
Entanglement-Based Machine Learning on a Quantum Computer
NASA Astrophysics Data System (ADS)
Cai, X.-D.; Wu, D.; Su, Z.-E.; Chen, M.-C.; Wang, X.-L.; Li, Li; Liu, N.-L.; Lu, C.-Y.; Pan, J.-W.
2015-03-01
Machine learning, a branch of artificial intelligence, learns from previous experience to optimize performance, which is ubiquitous in various fields such as computer sciences, financial analysis, robotics, and bioinformatics. A challenge is that machine learning with the rapidly growing "big data" could become intractable for classical computers. Recently, quantum machine learning algorithms [Lloyd, Mohseni, and Rebentrost, arXiv.1307.0411] were proposed which could offer an exponential speedup over classical algorithms. Here, we report the first experimental entanglement-based classification of two-, four-, and eight-dimensional vectors to different clusters using a small-scale photonic quantum computer, which are then used to implement supervised and unsupervised machine learning. The results demonstrate the working principle of using quantum computers to manipulate and classify high-dimensional vectors, the core mathematical routine in machine learning. The method can, in principle, be scaled to larger numbers of qubits, and may provide a new route to accelerate machine learning.
ERIC Educational Resources Information Center
Hsu, Chung-Yuan; Tsai, Chin-Chung
2013-01-01
Educational researchers have indicated that although computer games have the potential to promote students' motivation and engagement, the work on how to design effective games that fulfil educational purposes is still in its infancy. This study aimed to examine how integration of self-explanation into a computer game affected primary schoolers'…
Enabling campus grids with open science grid technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weitzel, Derek; Bockelman, Brian; Swanson, David
2011-01-01
The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.
Analogy Mapping Development for Learning Programming
NASA Astrophysics Data System (ADS)
Sukamto, R. A.; Prabawa, H. W.; Kurniawati, S.
2017-02-01
Programming skill is an important skill for computer science students, yet many computer science students in Indonesia currently lack programming skills and information technology knowledge. This is at odds with the implementation of the ASEAN Economic Community (AEC) since the end of 2015, which demands qualified workers. This study supported the development of programming skills by mapping program code to visual analogies as learning media. The developed media was based on state machine and compiler principles and was implemented in the C programming language. The state of every basic construct in programming was successfully mapped to an analogy visualization.
The simplicity principle in perception and cognition.
Feldman, Jacob
2016-09-01
The simplicity principle, traditionally referred to as Occam's razor, is the idea that simpler explanations of observations should be preferred to more complex ones. In recent decades the principle has been clarified via the incorporation of modern notions of computation and probability, allowing a more precise understanding of how exactly complexity minimization facilitates inference. The simplicity principle has found many applications in modern cognitive science, in contexts as diverse as perception, categorization, reasoning, and neuroscience. In all these areas, the common idea is that the mind seeks the simplest available interpretation of observations, or, more precisely, that it balances a bias toward simplicity with a somewhat opposed constraint to choose models consistent with perceptual or cognitive observations. This brief tutorial surveys some of the uses of the simplicity principle across cognitive science, emphasizing how complexity minimization in a number of forms has been incorporated into probabilistic models of inference. WIREs Cogn Sci 2016, 7:330-340. doi: 10.1002/wcs.1406 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
Optimized Materials From First Principles Simulations: Are We There Yet?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galli, G; Gygi, F
2005-07-26
In the past thirty years, the use of scientific computing has become pervasive in all disciplines: collection and interpretation of most experimental data is carried out using computers, and physical models in computable form, with various degrees of complexity and sophistication, are utilized in all fields of science. However, full prediction of physical and chemical phenomena based on the basic laws of Nature, using computer simulations, is a revolution still in the making, and it involves some formidable theoretical and computational challenges. We illustrate the progress and successes obtained in recent years in predicting fundamental properties of materials in condensed phases and at the nanoscale, using ab-initio, quantum simulations. We also discuss open issues related to the validation of the approximate, first principles theories used in large scale simulations, and the resulting complex interplay between computation and experiment. Finally, we describe some applications, with focus on nanostructures and liquids, both at ambient and under extreme conditions.
Zanin, Massimiliano; Chorbev, Ivan; Stres, Blaz; Stalidzans, Egils; Vera, Julio; Tieri, Paolo; Castiglione, Filippo; Groen, Derek; Zheng, Huiru; Baumbach, Jan; Schmid, Johannes A; Basilio, José; Klimek, Peter; Debeljak, Nataša; Rozman, Damjana; Schmidt, Harald H H W
2017-12-05
Systems medicine holds many promises, but has so far provided only a limited number of proofs of principle. To address this road block, possible barriers and challenges of translating systems medicine into clinical practice need to be identified and addressed. The members of the European Cooperation in Science and Technology (COST) Action CA15120 Open Multiscale Systems Medicine (OpenMultiMed) wish to engage the scientific community of systems medicine and multiscale modelling, data science and computing, to provide their feedback in a structured manner. This will result in follow-up white papers and open access resources to accelerate the clinical translation of systems medicine. © The Author 2017. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Fouda Ndjodo, Marcel; Ngah, Virginie Blanche; Zobo, Erick Patrick
2013-07-01
A competency profile for teachers of Computer Science in Cameroonian secondary education - In 1998, the Cameroonian government decided to introduce Computer Science as a school subject. To implement this decision, it began to train teachers of Computer Science according to the same training model used for teachers of other disciplines. Despite the consensus that seems to be emerging from the scientific community regarding the need to give priority to a cross-disciplinary use of information and communication technologies (ICT) in primary and secondary education, some countries, such as Cameroon, have opted to teach Computer Science. While such a political choice might in principle appear to be inappropriate for the development of students' ICT skills, the article shows that it nevertheless introduces teachers into the system who have a predisposition to act as catalysts for the pedagogical integration of ICT. Such a development could occur provided these teachers are trained in a range of additional skills - those proposed in the article - which would enable them to contribute effectively. If this approach were implemented, sub-Saharan countries such as Cameroon would, in their Computer Science teachers, have access to human resources capable of quickly generalising the cross-disciplinary use of ICT in the education system.
Computation material science of structural-phase transformation in casting aluminium alloys
NASA Astrophysics Data System (ADS)
Golod, V. M.; Dobosh, L. Yu
2017-04-01
Successive stages of computer simulation of the formation of the casting microstructure under non-equilibrium conditions of crystallization of multicomponent aluminum alloys are presented. On the basis of computational thermodynamics and heat transfer during solidification of macroscale shaped castings, the boundary conditions of local heat exchange are specified for mesoscale modeling of the non-equilibrium formation of the solid phase and of the component redistribution between phases during coalescence of secondary dendrite branches. Computer analysis of structural-phase transitions is based on the principle of the additive physico-chemical effect of the alloy components in the process of diffusional-capillary morphological evolution of the dendrite structure, and on the local dendrite heterogeneity, whose stochastic nature and extent are revealed by metallographic study and by modeling with the Monte Carlo method. The integrated computational materials science tools are focused on, and implemented for, analysis of the multiple-factor system of casting processes and prediction of casting microstructure.
The Handicap Principle for Trust in Computer Security, the Semantic Web and Social Networking
NASA Astrophysics Data System (ADS)
Ma, Zhanshan (Sam); Krings, Axel W.; Hung, Chih-Cheng
Communication is a fundamental function of life, and it exists in almost all living things: from single-cell bacteria to human beings. Communication, together with competition and cooperation, are three fundamental processes in nature. Computer scientists are familiar with the study of competition or 'struggle for life' through Darwin's evolutionary theory, or even evolutionary computing. They may be equally familiar with the study of cooperation or altruism through the Prisoner's Dilemma (PD) game. However, they are likely to be less familiar with the theory of animal communication. The objective of this article is three-fold: (i) To suggest that the study of animal communication, especially the honesty (reliability) of animal communication, in which some significant advances in behavioral biology have been achieved in the last three decades, should be on the verge of spawning important cross-disciplinary research similar to that generated by the study of cooperation with the PD game. One of the far-reaching advances in the field is marked by the publication of "The Handicap Principle: a Missing Piece of Darwin's Puzzle" by Zahavi (1997). The 'Handicap' principle [34][35], which states that communication signals must be costly in some proper way to be reliable (honest), is best elucidated with evolutionary games, e.g., the Sir Philip Sidney (SPS) game [23]. Accordingly, we suggest that the Handicap principle may serve as a fundamental paradigm for trust research in computer science. (ii) To suggest to computer scientists that their expertise in modeling computer networks may help behavioral biologists in their study of the reliability of animal communication networks. This is largely due to the historical reason that, until the last decade, animal communication was studied with the dyadic paradigm (sender-receiver) rather than with the network paradigm.
(iii) To pose several open questions, the answers to which may bear some refreshing insights to trust research in computer science, especially secure and resilient computing, the semantic web, and social networking. One important thread unifying the three aspects is the evolutionary game theory modeling or its extensions with survival analysis and agreement algorithms [19][20], which offer powerful game models for describing time-, space-, and covariate-dependent frailty (uncertainty and vulnerability) and deception (honesty).
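The Prisoner's Dilemma framing the abstract references can be illustrated with a toy replicator-dynamics simulation. This is an illustrative sketch, not code from the article; the payoff values T, R, P, S are the conventional ordering T > R > P > S, under which defection dominates even though mutual cooperation pays more:

```python
# Conventional one-shot Prisoner's Dilemma payoffs: T > R > P > S
T, R, P, S = 5.0, 3.0, 1.0, 0.0

def cooperator_fraction(x0=0.9, steps=2000, dt=0.1):
    """Replicator dynamics for the fraction x of cooperators in a
    well-mixed population repeatedly playing the one-shot PD."""
    x = x0
    for _ in range(steps):
        f_coop = x * R + (1 - x) * S      # expected payoff of a cooperator
        f_defect = x * T + (1 - x) * P    # expected payoff of a defector
        # strategies grow in proportion to their payoff advantage
        x += dt * x * (1 - x) * (f_coop - f_defect)
        x = min(1.0, max(0.0, x))
    return x
```

Because f_coop - f_defect is negative for every mixture, cooperators are driven to extinction even from a 90% majority, which is exactly the puzzle that signaling-cost arguments like the Handicap principle were developed to address in richer game settings.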
First principles statistical mechanics of alloys and magnetism
NASA Astrophysics Data System (ADS)
Eisenbach, Markus; Khan, Suffian N.; Li, Ying Wai
Modern high performance computing resources are enabling the exploration of the statistical physics of phase spaces with increasing size and higher fidelity of the Hamiltonian of the systems. For selected systems, this now allows the combination of Density Functional based first principles calculations with classical Monte Carlo methods for parameter-free, predictive thermodynamics of materials. We combine our locally self-consistent real space multiple scattering method for solving the Kohn-Sham equation with Wang-Landau Monte Carlo calculations (WL-LSMS). In the past we have applied this method to the calculation of Curie temperatures in magnetic materials. Here we will present direct calculations of the chemical order-disorder transitions in alloys. We present our calculated transition temperature for the chemical ordering in CuZn and the temperature dependence of the short-range order parameter and specific heat. Finally we will present the extension of the WL-LSMS method to magnetic alloys, thus allowing the investigation of the interplay of magnetism, structure and chemical order in ferrous alloys. This research was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Science and Engineering Division and it used Oak Ridge Leadership Computing Facility resources at Oak Ridge National Laboratory.
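The Wang-Landau method named in the abstract can be illustrated on a toy system where the density of states is known exactly: the "energy" of an n-bit string is its number of set bits, so the exact density of states is the binomial coefficient C(n, E). This is an illustrative sketch of plain Wang-Landau sampling, not the WL-LSMS code:

```python
import math
import random

def wang_landau_bitcount(n_bits=10, ln_f_final=1e-4, flatness=0.8, seed=1):
    """Wang-Landau estimate of ln g(E), where E = number of set bits in an
    n_bits-long bit string (exact answer: ln C(n_bits, E))."""
    rng = random.Random(seed)
    n_levels = n_bits + 1
    ln_g = [0.0] * n_levels       # running estimate of ln(density of states)
    ln_f = 1.0                    # modification factor, halved each stage
    state = [0] * n_bits
    e = 0                         # current energy = popcount(state)
    while ln_f > ln_f_final:
        hist = [0] * n_levels
        while True:               # one stage: walk until the histogram is flat
            for _ in range(1000):
                i = rng.randrange(n_bits)
                e_new = e + (1 - 2 * state[i])   # a flip changes the count by +/-1
                # accept with probability min(1, g(e)/g(e_new))
                if rng.random() < math.exp(min(0.0, ln_g[e] - ln_g[e_new])):
                    state[i] ^= 1
                    e = e_new
                ln_g[e] += ln_f
                hist[e] += 1
            if min(hist) > flatness * sum(hist) / n_levels:
                break
        ln_f /= 2.0
    # normalize so that the total number of states equals 2**n_bits
    shift = max(ln_g)
    ln_total = shift + math.log(sum(math.exp(v - shift) for v in ln_g))
    offset = n_bits * math.log(2.0) - ln_total
    return [v + offset for v in ln_g]
```

The same flat-histogram random walk, with the bit flip replaced by a first-principles energy evaluation, is the Monte Carlo side of the WL-LSMS scheme the abstract describes.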
Diffusion of Innovations: Smartphones and Wireless Anatomy Learning Resources
ERIC Educational Resources Information Center
Trelease, Robert B.
2008-01-01
The author has previously reported on principles of diffusion of innovations, the processes by which new technologies become popularly adopted, specifically in relation to anatomy and education. In presentations on adopting handheld computers [personal digital assistants (PDAs)] and personal media players for health sciences education, particular…
ERIC Educational Resources Information Center
Leyden, Michael B.
1994-01-01
Discusses the properties of neodymium magnets and magnets in general and how magnets can be used to teach students important scientific principles, such as attraction, repulsion, and polarity; the role of magnetic forces in electronic communications and computers; the magnetic properties of the earth and compasses; and the relationship between…
First-principles data-driven discovery of transition metal oxides for artificial photosynthesis
NASA Astrophysics Data System (ADS)
Yan, Qimin
We develop a first-principles data-driven approach for rapid identification of transition metal oxide (TMO) light absorbers and photocatalysts for artificial photosynthesis using the Materials Project. Initially focusing on Cr, V, and Mn-based ternary TMOs in the database, we design a broadly-applicable multiple-layer screening workflow automating density functional theory (DFT) and hybrid functional calculations of bulk and surface electronic and magnetic structures. We further assess the electrochemical stability of TMOs in aqueous environments from computed Pourbaix diagrams. Several promising earth-abundant low band-gap TMO compounds with desirable band edge energies and electrochemical stability are identified by our computational efforts and then synergistically evaluated using high-throughput synthesis and photoelectrochemical screening techniques by our experimental collaborators at Caltech. Our joint theory-experiment effort has successfully identified new earth-abundant copper and manganese vanadate complex oxides that meet highly demanding requirements for photoanodes, substantially expanding the known space of such materials. By integrating theory and experiment, we validate our approach and develop important new insights into structure-property relationships for TMOs for oxygen evolution photocatalysts, paving the way for use of first-principles data-driven techniques in future applications. This work is supported by the Materials Project Predictive Modeling Center and the Joint Center for Artificial Photosynthesis through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231. Computational resources also provided by the Department of Energy through the National Energy Supercomputing Center.
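The multiple-layer screening workflow described in the abstract amounts to filtering candidates against successive property windows. The schematic sketch below uses made-up formulas and placeholder property values; the real workflow derives band gaps from DFT/hybrid-functional calculations and stability from computed Pourbaix diagrams:

```python
# Hypothetical candidate records; none of these values are Materials Project data.
candidates = [
    {"formula": "A2BO4", "band_gap_eV": 1.8, "e_above_hull_eV": 0.01},
    {"formula": "CB2O6", "band_gap_eV": 3.4, "e_above_hull_eV": 0.00},
    {"formula": "D3O8",  "band_gap_eV": 1.5, "e_above_hull_eV": 0.12},
]

def screen_photoanodes(cands, gap_window=(1.2, 2.2), max_hull=0.05):
    """Keep compounds whose band gap falls in a visible-light absorption
    window and whose energy above the convex hull suggests stability."""
    return [
        c["formula"]
        for c in cands
        if gap_window[0] <= c["band_gap_eV"] <= gap_window[1]
        and c["e_above_hull_eV"] <= max_hull
    ]
```

Each additional layer of the real workflow (band edge alignment, aqueous Pourbaix stability, surface magnetism) would add another predicate of the same shape, progressively narrowing the candidate pool before experimental validation.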
NASA Astrophysics Data System (ADS)
Chang, Li-Na; Luo, Shun-Long; Sun, Yuan
2017-11-01
The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002; the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing; and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182
The Application of Peer Teaching in Digital Forensics Education
ERIC Educational Resources Information Center
Govan, Michelle
2016-01-01
The field of digital forensics requires a multidisciplinary understanding of a range of diverse subjects, but is interdisciplinary (in using principles, techniques and theories from other disciplines) encompassing both computer and forensic science. This requires that practitioners have a deep technical knowledge and understanding, but that they…
Preparing to Teach in Cyberspace: User Education in Real and Virtual Libraries.
ERIC Educational Resources Information Center
Byron, Suzanne
1995-01-01
Discussion of librarians' training for teaching user education focuses on experiments at the University of North Texas in providing resources and empowering education for librarians and staff members who teach. The use of computer-based education principles and Ranganathan's laws of library science are explained. (Author/LRW)
The information science of microbial ecology.
Hahn, Aria S; Konwar, Kishori M; Louca, Stilianos; Hanson, Niels W; Hallam, Steven J
2016-06-01
A revolution is unfolding in microbial ecology where petabytes of 'multi-omics' data are produced using next generation sequencing and mass spectrometry platforms. This cornucopia of biological information has enormous potential to reveal the hidden metabolic powers of microbial communities in natural and engineered ecosystems. However, to realize this potential, the development of new technologies and interpretative frameworks grounded in ecological design principles are needed to overcome computational and analytical bottlenecks. Here we explore the relationship between microbial ecology and information science in the era of cloud-based computation. We consider microorganisms as individual information processing units implementing a distributed metabolic algorithm and describe developments in ecoinformatics and ubiquitous computing with the potential to eliminate bottlenecks and empower knowledge creation and translation. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics
Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis
2014-07-28
The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.
Discovering Psychological Principles by Mining Naturally Occurring Data Sets.
Goldstone, Robert L; Lupyan, Gary
2016-07-01
The very expertise with which psychologists wield their tools for achieving laboratory control may have had the unwelcome effect of blinding psychologists to the possibilities of discovering principles of behavior without conducting experiments. When creatively interrogated, a diverse range of large, real-world data sets provides powerful diagnostic tools for revealing principles of human judgment, perception, categorization, decision-making, language use, inference, problem solving, and representation. Examples of these data sets include patterns of website links, dictionaries, logs of group interactions, collections of images and image tags, text corpora, history of financial transactions, trends in twitter tag usage and propagation, patents, consumer product sales, performance in high-stakes sporting events, dialect maps, and scientific citations. The goal of this issue is to present some exemplary case studies of mining naturally existing data sets to reveal important principles and phenomena in cognitive science, and to discuss some of the underlying issues involved with conducting traditional experiments, analyses of naturally occurring data, computational modeling, and the synthesis of all three methods. Copyright © 2016 Cognitive Science Society, Inc.
Enhancing implementation science by applying best principles of systems science.
Northridge, Mary E; Metcalf, Sara S
2016-10-04
Implementation science holds promise for better ensuring that research is translated into evidence-based policy and practice, but interventions often fail or even worsen the problems they are intended to solve due to a lack of understanding of real world structures and dynamic complexity. While systems science alone cannot possibly solve the major challenges in public health, systems-based approaches may contribute to changing the language and methods for conceptualising and acting within complex systems. The overarching goal of this paper is to improve the modelling used in dissemination and implementation research by applying best principles of systems science. Best principles, as distinct from the more customary term 'best practices', are used to underscore the need to extract the core issues from the context in which they are embedded in order to better ensure that they are transferable across settings. Toward meaningfully grappling with the complex and challenging problems faced in adopting and integrating evidence-based health interventions and changing practice patterns within specific settings, we propose and illustrate four best principles derived from our systems science experience: (1) model the problem, not the system; (2) pay attention to what is important, not just what is quantifiable; (3) leverage the utility of models as boundary objects; and (4) adopt a portfolio approach to model building. To improve our mental models of the real world, system scientists have created methodologies such as system dynamics, agent-based modelling, geographic information science and social network simulation. To understand dynamic complexity, we need the ability to simulate. Otherwise, our understanding will be limited. The practice of dynamic systems modelling, as discussed herein, is the art and science of linking system structure to behaviour for the purpose of changing structure to improve behaviour. 
A useful computer model creates a knowledge repository and a virtual library for internally consistent exploration of alternative assumptions. Among the benefits of systems modelling are iterative practice, participatory potential and possibility thinking. We trust that the best principles proposed here will resonate with implementation scientists; applying them to the modelling process may abet the translation of research into effective policy and practice.
Statistical mechanical theory for steady state systems. VI. Variational principles
NASA Astrophysics Data System (ADS)
Attard, Phil
2006-12-01
Several variational principles that have been proposed for nonequilibrium systems are analyzed. These include the principle of minimum rate of entropy production due to Prigogine [Introduction to Thermodynamics of Irreversible Processes (Interscience, New York, 1967)], the principle of maximum rate of entropy production, which is common on the internet and in the natural sciences, two principles of minimum dissipation due to Onsager [Phys. Rev. 37, 405 (1931)] and to Onsager and Machlup [Phys. Rev. 91, 1505 (1953)], and the principle of maximum second entropy due to Attard [J. Chem. Phys. 122, 154101 (2005); Phys. Chem. Chem. Phys. 8, 3585 (2006)]. The approaches of Onsager and Attard are argued to be the only viable theories. These two are related, although their physical interpretation and mathematical approximations differ. A numerical comparison with computer simulation results indicates that Attard's expression is the only accurate theory. The implications for the Langevin and other stochastic differential equations are discussed.
Brunk, Elizabeth; Ashari, Negar; Athri, Prashanth; Campomanes, Pablo; de Carvalho, F Franco; Curchod, Basile F E; Diamantis, Polydefkis; Doemer, Manuel; Garrec, Julian; Laktionov, Andrey; Micciarelli, Marco; Neri, Marilisa; Palermo, Giulia; Penfold, Thomas J; Vanni, Stefano; Tavernelli, Ivano; Rothlisberger, Ursula
2011-01-01
The Laboratory of Computational Chemistry and Biochemistry is active in the development and application of first-principles based simulations of complex chemical and biochemical phenomena. Here, we review some of our recent efforts in extending these methods to larger systems, longer time scales and increased accuracies. Their versatility is illustrated with a diverse range of applications, ranging from the determination of the gas phase structure of the cyclic decapeptide gramicidin S, to the study of G protein coupled receptors, the interaction of transition metal based anti-cancer agents with protein targets, the mechanism of action of DNA repair enzymes, the role of metal ions in neurodegenerative diseases and the computational design of dye-sensitized solar cells. Many of these projects are done in collaboration with experimental groups from the Institute of Chemical Sciences and Engineering (ISIC) at the EPFL.
Strategies for a Creative Future with Computer Science, Quality Design and Communicability
NASA Astrophysics Data System (ADS)
Cipolla Ficarra, Francisco V.; Villarreal, Maria
The current work presents the importance of the two-way triad between computer science, design and communicability. It demonstrates that the quality principles of software engineering are not universal, since they are disappearing from university training. In addition, a short analysis of the term "creativity" makes apparent the existence of plagiarism as a human factor that damages the future of communicability applied to the on-line and off-line contents of open software. A set of measures and guidelines is presented so that the triad works correctly again in the coming years, to foster the qualitative design of interactive systems on-line and/or off-line.
Active Teaching of Diffusion through History of Science, Computer Animation and Role Playing
ERIC Educational Resources Information Center
Krajsek, Simona Strgulc; Vilhar, Barbara
2010-01-01
We developed and tested a lesson plan for active teaching of diffusion in secondary schools (grades 10-13), which stimulates understanding of the thermal (Brownian) motion of particles as the principle underlying diffusion. During the lesson, students actively explore the Brownian motion through microscope observations of irregularly moving small…
Operation ARA: A Computerized Learning Game that Teaches Critical Thinking and Scientific Reasoning
ERIC Educational Resources Information Center
Halpern, Diane F.; Millis, Keith; Graesser, Arthur C.; Butler, Heather; Forsyth, Carol; Cai, Zhiqiang
2012-01-01
Operation ARA (Acquiring Research Acumen) is a computerized learning game that teaches critical thinking and scientific reasoning. It is a valuable learning tool that utilizes principles from the science of learning and serious computer games. Students learn the skills of scientific reasoning by engaging in interactive dialogs with avatars. They…
Object-Oriented Programming in High Schools the Turing Way.
ERIC Educational Resources Information Center
Holt, Richard C.
This paper proposes an approach to introducing object-oriented concepts to high school computer science students using the Object-Oriented Turing (OOT) language. Students can learn about basic object-oriented (OO) principles such as classes and inheritance by using and expanding a collection of classes that draw pictures like circles and happy…
Contributions of Science Principles to Teaching: How Science Principles Can Be Used
ERIC Educational Resources Information Center
Henson, Kenneth T.
1974-01-01
Describes the steps involved in using the "principles" approach in teaching science, illustrates the process of using science principles with an example relating to rock formation, and discusses the relevance of this approach to contemporary trends in science teaching. (JR)
Fermilab computing at the Intensity Frontier
Group, Craig; Fuess, S.; Gutsche, O.; ...
2015-12-23
The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. The experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.
Experimental quantum computing to solve systems of linear equations.
Cai, X-D; Weedbrook, C; Su, Z-E; Chen, M-C; Gu, Mile; Zhu, M-J; Li, Li; Liu, Nai-Le; Lu, Chao-Yang; Pan, Jian-Wei
2013-06-07
Solving linear systems of equations is ubiquitous in all areas of science and engineering. With rapidly growing data sets, such a task can be intractable for classical computers, as the best known classical algorithms require a time proportional to the number of variables N. A recently proposed quantum algorithm shows that quantum computers could solve linear systems in a time scale of order log(N), giving an exponential speedup over classical computers. Here we realize the simplest instance of this algorithm, solving 2×2 linear equations for various input vectors on a quantum computer. We use four quantum bits and four controlled logic gates to implement every subroutine required, demonstrating the working principle of this algorithm.
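For comparison, a classical baseline for the smallest instance of this task, a 2×2 linear system A x = b, takes only a few lines; the matrix and vector below are arbitrary illustrations, not the inputs used in the experiment. A classical dense solve scales polynomially in the number of variables N, whereas the quantum algorithm referenced above scales as O(log N) under its stated assumptions:

```python
import numpy as np

# Classical solve of a 2x2 linear system A x = b (illustrative values only).
A = np.array([[1.5, 0.5],
              [0.5, 1.5]])
b = np.array([1.0, 0.0])

x = np.linalg.solve(A, b)          # direct dense solve, O(N^3) in general
residual = np.linalg.norm(A @ x - b)
print(x, residual)                  # residual should be ~0
```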
Efficient and Effective Change Principles in Active Videogames
Fenner, Ashley A.; Howie, Erin K.; Feltz, Deborah L.; Gray, Cindy M.; Lu, Amy Shirong; Mueller, Florian “Floyd”; Simons, Monique; Barnett, Lisa M.
2015-01-01
Active videogames have the potential to enhance population levels of physical activity but have not been successful in achieving this aim to date. This article considers a range of principles that may be important to the design of effective and efficient active videogames from diverse discipline areas, including behavioral sciences (health behavior change, motor learning, and serious games), business production (marketing and sales), and technology engineering and design (human–computer interaction/ergonomics and flow). Both direct and indirect pathways to impact on population levels of habitual physical activity are proposed, along with the concept of a game use lifecycle. Examples of current active and sedentary electronic games are used to understand how such principles may be applied. Furthermore, limitations of the current usage of theoretical principles are discussed. A suggested list of principles for best practice in active videogame design is proposed along with suggested research ideas to inform practice to enhance physical activity. PMID:26181680
Efficient and Effective Change Principles in Active Videogames.
Straker, Leon M; Fenner, Ashley A; Howie, Erin K; Feltz, Deborah L; Gray, Cindy M; Lu, Amy Shirong; Mueller, Florian Floyd; Simons, Monique; Barnett, Lisa M
2015-02-01
Active videogames have the potential to enhance population levels of physical activity but have not been successful in achieving this aim to date. This article considers a range of principles that may be important to the design of effective and efficient active videogames from diverse discipline areas, including behavioral sciences (health behavior change, motor learning, and serious games), business production (marketing and sales), and technology engineering and design (human-computer interaction/ergonomics and flow). Both direct and indirect pathways to impact on population levels of habitual physical activity are proposed, along with the concept of a game use lifecycle. Examples of current active and sedentary electronic games are used to understand how such principles may be applied. Furthermore, limitations of the current usage of theoretical principles are discussed. A suggested list of principles for best practice in active videogame design is proposed along with suggested research ideas to inform practice to enhance physical activity.
Cellular automaton supercomputing
NASA Technical Reports Server (NTRS)
Wolfram, Stephen
1987-01-01
Many of the models now used in science and engineering are over a century old. And most of them can be implemented on modern digital computers only with considerable difficulty. Some new basic models are discussed which are much more directly suitable for digital computer simulation. The fundamental principle is that the models considered herein are as suitable as possible for implementation on digital computers. It is then a matter of scientific analysis to determine whether such models can reproduce the behavior seen in physical and other systems. Such analysis was carried out in several cases, and the results are very encouraging.
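A minimal example of the kind of model the abstract advocates, one designed from the outset for digital computers, is an elementary one-dimensional cellular automaton. The rule number (30) and grid size below are illustrative choices:

```python
# Elementary cellular automaton: each cell's next state is read from the
# bits of the rule number, indexed by its three-cell neighborhood.
def step(cells, rule=30):
    """Advance one generation; cells is a list of 0/1 with periodic boundaries."""
    n = len(cells)
    out = [0] * n
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        out[i] = (rule >> (left * 4 + center * 2 + right)) & 1
    return out

# Single live cell in the middle of a 31-cell row; iterate a few generations.
row = [0] * 31
row[15] = 1
for _ in range(5):
    row = step(row)
print(sum(row))  # number of live cells after 5 steps
```

Despite the triviality of the update rule, rule 30 produces the kind of complex, seemingly random behavior that motivates analyzing such models as candidate descriptions of physical systems.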
Data stewardship - a fundamental part of the scientific method (Invited)
NASA Astrophysics Data System (ADS)
Foster, C.; Ross, J.; Wyborn, L. A.
2013-12-01
This paper emphasises the importance of data stewardship as a fundamental part of the scientific method, and the need to effect cultural change to ensure engagement by earth scientists. It is differentiated from the science of data stewardship per se. Earth System science generates vast quantities of data, and in the past, data analysis has been constrained by compute power, such that sub-sampling of data often provided the only way to reach an outcome. This is analogous to Kahneman's System 1 heuristic, with its simplistic and often erroneous outcomes. The development of HPC has liberated earth sciences such that the complexity and heterogeneity of natural systems can be utilised in modelling at any scale, global, or regional, or local; for example, movement of crustal fluids. Paradoxically, now that compute power is available, it is the stewardship of the data that is presenting the main challenges. There is a wide spectrum of issues: from effectively handling and accessing acquired data volumes [e.g. satellite feeds per day/hour]; through agreed taxonomy to effect machine to machine analyses; to idiosyncratic approaches by individual scientists. Except for the latter, most agree that data stewardship is essential. Indeed it is an essential part of the science workflow. As science struggles to engage and inform on issues of community importance, such as shale gas and fraccing, all parties must have equal access to data used for decision making; without that, there will be no social licence to operate or indeed access to additional science funding (Heidorn, 2008). The stewardship of scientific data is an essential part of the science process; but often it is regarded, wrongly, as entirely in the domain of data custodians or stewards. 
Geoscience Australia has developed a set of six principles that apply to all science activities within the agency: Relevance to Government; Collaborative science; Quality science; Transparent science; Communicated science; and Sustained science capability. Every principle includes data stewardship: this is to effect cultural change at both collective and individual levels to ensure that our science outcomes and technical advice are effective for the Government and community.
First-principles calculations of structure and elasticity of hydrous fayalite under high pressure
NASA Astrophysics Data System (ADS)
Zhang, Chuan-Yu; Wang, Xu-Ben; Zhao, Xiao-Feng; Chen, Xing-Run; Yu, You; Tian, Xiao-Feng
2017-12-01
Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 11404042 and 11604029), the Research Fund for the Doctoral Program of Higher Education of China (Grant No. 20135122120010), and the Open Research Fund of Computational Physics Key Laboratory of Sichuan Province, Yibin University (Grant No. JSWL2015KFZ02).
1988-06-27
Optical artificial intelligence; optical inference engines; optical logic; optical information processing … common. They arise in areas such as expert systems and other artificial intelligence systems. In recent years, the computer science language PROLOG has … optical processors should in principle be well suited for artificial intelligence applications. In recent years, symbolic logic processing …
Games, Simulations, and Visual Metaphors in Education: Antagonism between Enjoyment and Learning
ERIC Educational Resources Information Center
Rieber, Lloyd P.; Noah, David
2008-01-01
The purpose of this study was to investigate the influence of game-like activities on adult learning during a computer-based simulation. This research also studied the use of visual metaphors as graphic organizers to help make the underlying science principles explicit without interfering with the interactive nature of the simulation. A total of…
Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level
NASA Astrophysics Data System (ADS)
Christiansen, Henning
2004-09-01
Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without this. An effective methodology is described which has been developed for a target group of university students with different backgrounds such as natural science or humanities. It has been developed for a course that integrates theoretical material on computer languages and abstract machines with practical programming techniques. Prolog used as meta-language for describing language issues is the central instrument in the approach: formal descriptions become running prototypes that are easy and appealing to test and modify, and can be extended into analyzers, interpreters, and tools such as tracers and debuggers. Experience shows a high learning curve, especially when the principles are extended into a learning-by-doing approach having the students develop such descriptions themselves from an informal introduction.
Magneto Caloric Effect in Ni-Mn-Ga alloys: First Principles and Experimental studies
NASA Astrophysics Data System (ADS)
Odbadrakh, Khorgolkhuu; Nicholson, Don; Brown, Gregory; Rusanu, Aurelian; Rios, Orlando; Hodges, Jason; Safa-Sefat, Athena; Ludtka, Gerard; Eisenbach, Markus; Evans, Boyd
2012-02-01
Understanding the Magneto-Caloric Effect (MCE) in alloys with real technological potential is important to the development of viable MCE-based products. We report results of a computational and experimental investigation of candidate MCE materials, Ni-Mn-Ga alloys. The Wang-Landau statistical method is used in tandem with the Locally Self-consistent Multiple Scattering (LSMS) method to explore magnetic states of the system. A classical Heisenberg Hamiltonian is parametrized based on these states and used in obtaining the density of magnetic states. The Curie temperature, isothermal entropy change, and adiabatic temperature change are then calculated from the density of states. Experiments to observe the structural and magnetic phase transformations were performed at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL) on alloys of Ni-Mn-Ga and Fe-Ni-Mn-Ga-Cu. Data from the observations are discussed in comparison with the computational studies. This work was sponsored by the Laboratory Directed Research and Development Program (ORNL); by the Mathematical, Information, and Computational Sciences Division, Office of Advanced Scientific Computing Research (US DOE); and by the Materials Sciences and Engineering Division, Office of Basic Energy Sciences (US DOE).
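The classical Heisenberg Hamiltonian mentioned above, E = -Σ J_ij S_i · S_j, can be sketched for a short spin chain. The spin configuration and exchange constant J below are illustrative, not values fitted from LSMS calculations:

```python
import numpy as np

# Classical Heisenberg energy for a 1D chain of unit spins with
# nearest-neighbor exchange coupling J (illustrative parameters).
def heisenberg_energy(spins, J=1.0):
    """E = -J * sum_i S_i . S_{i+1} over nearest-neighbor pairs."""
    E = 0.0
    for i in range(len(spins) - 1):
        E -= J * np.dot(spins[i], spins[i + 1])
    return E

# For J > 0 (ferromagnetic coupling), the aligned chain is lower in energy
# than the antialigned one.
up = np.array([0.0, 0.0, 1.0])
down = -up
aligned = [up] * 4
alternating = [up, down, up, down]
print(heisenberg_energy(aligned), heisenberg_energy(alternating))
```

Sampling such a Hamiltonian over many spin configurations (e.g. with Wang-Landau) is what yields the density of magnetic states from which the thermodynamic quantities in the abstract are computed.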
DOE Office of Scientific and Technical Information (OSTI.GOV)
von Lilienfeld-Toal, Otto Anatole
2010-11-01
The design of new materials with specific physical, chemical, or biological properties is a central goal of much research in materials and medicinal sciences. Except for the simplest and most restricted cases, brute-force computational screening of all possible compounds for interesting properties is beyond any current capacity due to the combinatorial nature of chemical compound space (the set of stoichiometries and configurations). Consequently, when it comes to computationally optimizing more complex systems, reliable optimization algorithms must not only trade off sufficient accuracy and computational speed of the models involved, they must also aim for rapid convergence in terms of the number of compounds 'visited'. I will give an overview of recent progress on alchemical first-principles paths and gradients in compound space that appear to be promising ingredients for more efficient property optimizations. Specifically, based on molecular grand canonical density functional theory, an approach will be presented for the construction of high-dimensional yet analytical property gradients in chemical compound space. Thereafter, applications to molecular HOMO eigenvalues, catalyst design, and other problems and systems shall be discussed.
Computational materials design of crystalline solids.
Butler, Keith T; Frost, Jarvist M; Skelton, Jonathan M; Svane, Katrine L; Walsh, Aron
2016-11-07
The modelling of materials properties and processes from first principles is becoming sufficiently accurate as to facilitate the design and testing of new systems in silico. Computational materials science is both valuable and increasingly necessary for developing novel functional materials and composites that meet the requirements of next-generation technology. A range of simulation techniques are being developed and applied to problems related to materials for energy generation, storage and conversion including solar cells, nuclear reactors, batteries, fuel cells, and catalytic systems. Such techniques may combine crystal-structure prediction (global optimisation), data mining (materials informatics) and high-throughput screening with elements of machine learning. We explore the development process associated with computational materials design, from setting the requirements and descriptors to the development and testing of new materials. As a case study, we critically review progress in the fields of thermoelectrics and photovoltaics, including the simulation of lattice thermal conductivity and the search for Pb-free hybrid halide perovskites. Finally, a number of universal chemical-design principles are advanced.
NASA Astrophysics Data System (ADS)
Huppert, J.; Michal Lomask, S.; Lazarowitz, R.
2002-08-01
Computer-assisted learning, including simulated experiments, has great potential to address the problem solving process which is a complex activity. It requires a highly structured approach in order to understand the use of simulations as an instructional device. This study is based on a computer simulation program, 'The Growth Curve of Microorganisms', which required tenth grade biology students to use problem solving skills whilst simultaneously manipulating three independent variables in one simulated experiment. The aims were to investigate the computer simulation's impact on students' academic achievement and on their mastery of science process skills in relation to their cognitive stages. The results indicate that the concrete and transition operational students in the experimental group achieved significantly higher academic achievement than their counterparts in the control group. The higher the cognitive operational stage, the higher students' achievement was, except in the control group where students in the concrete and transition operational stages did not differ. Girls achieved equally with the boys in the experimental group. Students' academic achievement may indicate the potential impact a computer simulation program can have, enabling students with low reasoning abilities to cope successfully with learning concepts and principles in science which require high cognitive skills.
The FuturICT education accelerator
NASA Astrophysics Data System (ADS)
Johnson, J.; Buckingham Shum, S.; Willis, A.; Bishop, S.; Zamenopoulos, T.; Swithenby, S.; MacKay, R.; Merali, Y.; Lorincz, A.; Costea, C.; Bourgine, P.; Louçã, J.; Kapenieks, A.; Kelley, P.; Caird, S.; Bromley, J.; Deakin Crick, R.; Goldspink, C.; Collet, P.; Carbone, A.; Helbing, D.
2012-11-01
Education is a major force for economic and social wellbeing. Despite high aspirations, education at all levels can be expensive and ineffective. Three Grand Challenges are identified: (1) enable people to learn orders of magnitude more effectively, (2) enable people to learn at orders of magnitude less cost, and (3) demonstrate success by exemplary interdisciplinary education in complex systems science. A ten year 'man-on-the-moon' project is proposed in which FuturICT's unique combination of Complexity, Social and Computing Sciences could provide an urgently needed transdisciplinary language for making sense of educational systems. In close dialogue with educational theory and practice, and grounded in the emerging data science and learning analytics paradigms, this will translate into practical tools (both analytical and computational) for researchers, practitioners and leaders; generative principles for resilient educational ecosystems; and innovation for radically scalable, yet personalised, learner engagement and assessment. The proposed Education Accelerator will serve as a 'wind tunnel' for testing these ideas in the context of real educational programmes, with an international virtual campus delivering complex systems education exploiting the new understanding of complex, social, computationally enhanced organisational structure developed within FuturICT.
Foreign Military Sales Pricing Principles for Electronic Technical Manuals
2004-06-01
companies provide benefits such as flexible hours, flexible days, and telecommuting. This information is useful because facilities costs and overhead can … personnel are listed below: Occupation Title; Employment (1); Median Hourly; Mean Hourly; Mean Annual (2); Computer and Mathematical Science … be minimized or significantly reduced for companies providing this benefit. There was one disturbing statistic from this survey. Despite the
Formal logic rewrite system bachelor in teaching mathematical informatics
NASA Astrophysics Data System (ADS)
Habiballa, Hashim; Jendryscik, Radek
2017-07-01
The article presents the capabilities of the formal rewrite logic system Bachelor for teaching theoretical computer science (mathematical informatics). The Bachelor system enables a constructivist approach to teaching and can therefore enhance the learning process in essential hard-informatics disciplines. It not only provides a detailed description of the formal rewrite process but also demonstrates algorithmic principles for manipulating logic formulae.
Underlying Principles of Natural Selection in Network Evolution: Systems Biology Approach
Chen, Bor-Sen; Wu, Wei-Sheng
2007-01-01
Systems biology is a rapidly expanding field that integrates diverse areas of science such as physics, engineering, computer science, mathematics, and biology toward the goal of elucidating the underlying principles of hierarchical metabolic and regulatory systems in the cell, and ultimately leading to predictive understanding of cellular response to perturbations. Because post-genomics research is taking place throughout the tree of life, comparative approaches offer a way for combining data from many organisms to shed light on the evolution and function of biological networks from the gene to the organismal level. Therefore, systems biology can build on decades of theoretical work in evolutionary biology, and at the same time evolutionary biology can use the systems biology approach to go in new uncharted directions. In this study, we present a review of how the post-genomics era is adopting comparative approaches and dynamic system methods to understand the underlying design principles of network evolution and to shape the nascent field of evolutionary systems biology. Finally, the application of evolutionary systems biology to robust biological network designs is also discussed from the synthetic biology perspective. PMID:19468310
The potential impact of microgravity science and technology on education
NASA Technical Reports Server (NTRS)
Wargo, M. J.
1992-01-01
The development of educational support materials by NASA's Microgravity Science and Applications Division is discussed in the light of two programs. Descriptions of the inception and application possibilities are given for the Microgravity-Science Teacher's Guide and the program of Undergraduate Research Opportunities in Microgravity Science and Technology. The guide is intended to introduce students to the principles and research efforts related to microgravity, and the undergraduate program is intended to reinforce interest in the space program. The use of computers and electronic communications is shown to be an important catalyst for the educational efforts. It is suggested that student and teacher access to these programs be enhanced so that they can have a broader impact on the educational development of space-related knowledge.
Laboratory Directed Research and Development FY 1998 Progress Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Vigil; Kyle Wheeler
This is the FY 1998 Progress Report for the Laboratory Directed Research and Development (LDRD) Program at Los Alamos National Laboratory. It gives an overview of the LDRD Program, summarizes work done on individual research projects, relates the projects to major Laboratory program sponsors, and provides an index to the principal investigators. Project summaries are grouped by their LDRD component: Competency Development, Program Development, and Individual Projects. Within each component, they are further grouped into nine technical categories: (1) materials science, (2) chemistry, (3) mathematics and computational science, (4) atomic, molecular, optical, and plasma physics, fluids, and particle beams, (5) engineering science, (6) instrumentation and diagnostics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) bioscience.
Single Cell Genomics: Approaches and Utility in Immunology
Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A
2017-01-01
Single cell genomics offers powerful tools for studying lymphocytes, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population-level. Advances in computer science and single cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single cell RNA-seq data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. PMID:28094102
NASA Astrophysics Data System (ADS)
Wang, Su; Liu, Xiufeng; Zhao, Yandong
2012-09-01
As the breadth and depth of economic reforms increase in China, growing attention is being paid to equalities in opportunities to learn science by students of various backgrounds. In early 2009, the Chinese Ministry of Education and Ministry of Science and Technology jointly sponsored a national survey of urban eighth-grade students' science literacy along with their family and school backgrounds. The present study focused on students' understanding of basic science concepts and principles (BSCP), a subset of science literacy. The sample analyzed included 3,031 students from 109 randomly selected classes/schools. Correlation analysis, one-way analysis of variance, and two-level linear regression were conducted. The results showed that having a refrigerator, internet, more books, parents purchasing books and magazines related to school work, higher father's education level, and parents' higher expectation of the education level of their child significantly predicted higher BSCP scores; having siblings at home, owning an apartment, and frequently contacting teachers about the child significantly predicted lower BSCP scores. At the school level, the results showed that being in the first-tier or key schools, having school libraries, science popularization galleries, computer labs, adequate equipment for teaching, special budget for teacher training, special budget for science equipment, and mutual trust between teachers and students significantly predicted higher BSCP scores; and having science and technology rooms, offering science and technology interest clubs, special budget for science curriculum development, and special budget for science social practice activities significantly predicted lower BSCP scores. The implications of the above findings are discussed.
Effective approach to spectroscopy and spectral analysis techniques using Matlab
NASA Astrophysics Data System (ADS)
Li, Xiang; Lv, Yong
2017-08-01
With the development of electronic information, computers, and networks, modern educational technology has entered a new era that is having a great impact on the teaching process. Spectroscopy and Spectral Analysis is an elective course for Optoelectronic Information Science and Engineering. The teaching objective of this course is to master the basic concepts and principles of spectroscopy and the basic technical means of spectral analysis and testing, and then to let students use spectral principles and techniques to study the structure and state of matter and follow the development of the technology. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks; it supports matrix manipulation and the plotting of functions and data. Based on teaching practice, this paper summarizes the application of MATLAB to the teaching of spectroscopy, an approach suitable for most current multimedia-assisted instruction in schools.
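As a hedged illustration of the kind of spectral-analysis exercise such a course might use (written here in Python/NumPy rather than MATLAB; the sampling rate and tone frequencies are assumed toy values, not taken from the paper): compute the frequency spectrum of a synthetic two-tone signal and recover the tone frequencies.

```python
import numpy as np

fs = 1000.0                        # sampling rate in Hz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)  # one second of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# Magnitude spectrum via the real FFT; bins are 1 Hz apart for a 1 s record.
spectrum = np.abs(np.fft.rfft(signal)) / len(t)
freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)

# The two strongest spectral lines sit at the input tone frequencies.
peaks = sorted(float(f) for f in freqs[np.argsort(spectrum)[-2:]])
print(peaks)  # → [50.0, 120.0]
```

The same few lines translate almost one-for-one into MATLAB's `fft`/`abs`/`plot` idiom, which is presumably what makes this style of exercise attractive for classroom demonstration.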
Semiconductor-inspired design principles for superconducting quantum computing.
Shim, Yun-Pil; Tahan, Charles
2016-03-17
Superconducting circuits offer tremendous design flexibility in the quantum regime culminating most recently in the demonstration of few qubit systems supposedly approaching the threshold for fault-tolerant quantum information processing. Competition in the solid-state comes from semiconductor qubits, where nature has bestowed some very useful properties which can be utilized for spin qubit-based quantum computing. Here we begin to explore how selective design principles deduced from spin-based systems could be used to advance superconducting qubit science. We take an initial step along this path proposing an encoded qubit approach realizable with state-of-the-art tunable Josephson junction qubits. Our results show that this design philosophy holds promise, enables microwave-free control, and offers a pathway to future qubit designs with new capabilities such as with higher fidelity or, perhaps, operation at higher temperature. The approach is also especially suited to qubits on the basis of variable super-semi junctions.
Human Inspired Self-developmental Model of Neural Network (HIM): Introducing Content/Form Computing
NASA Astrophysics Data System (ADS)
Krajíček, Jiří
This paper presents cross-disciplinary research between medical/psychological evidence on human abilities and informatics needs, aiming to update current models in computer science to support alternative methods of computation and communication. In [10] we have already proposed a hypothesis introducing the concept of a human information model (HIM) as a cooperative system. Here we continue the HIM design in detail. In our design, we first introduce the Content/Form computing system, which is a new principle extending present methods in evolutionary computing (genetic algorithms, genetic programming). We then apply this system to the HIM (a type of artificial neural network) model as a basic network self-developmental paradigm. The main inspiration for our natural/human design comes from the well-known concept of artificial neural networks, medical/psychological evidence, and Sheldrake's theory of "Nature as Alive" [22].
Understanding nature of science as progressive transitions in heuristic principles
NASA Astrophysics Data System (ADS)
Niaz, Mansoor
2001-11-01
This study has the following objectives: (a) understand nature of science as progressive transitions in heuristic principles as conceptualized by Schwab (1962); (b) reformulate Smith and Scharmann's characterization of nature of science (Smith & Scharmann, 1999) in the light of evidence from history and philosophy of science; and (c) provide a rationale for the inclusion of three more characteristics of nature of science, to the original five suggested by Smith and Scharmann. It is concluded that nature of science manifests in the different topics of the science curriculum as heuristic principles. Science education, by emphasizing not only the empirical nature of science but also the underlying heuristic principles, can facilitate conceptual understanding.
Computer vision and augmented reality in gastrointestinal endoscopy
Mahmud, Nadim; Cohen, Jonah; Tsourides, Kleovoulos; Berzin, Tyler M.
2015-01-01
Augmented reality (AR) is an environment-enhancing technology, widely applied in the computer sciences, which has only recently begun to permeate the medical field. Gastrointestinal endoscopy—which relies on the integration of high-definition video data with pathologic correlates—requires endoscopists to assimilate and process a tremendous amount of data in real time. We believe that AR is well positioned to provide computer-guided assistance with a wide variety of endoscopic applications, beginning with polyp detection. In this article, we review the principles of AR, describe its potential integration into an endoscopy set-up, and envisage a series of novel uses. With close collaboration between physicians and computer scientists, AR promises to contribute significant improvements to the field of endoscopy. PMID:26133175
1993 Annual report on scientific programs: A broad research program on the sciences of complexity
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1993-12-31
This report provides a summary of many of the research projects completed by the Santa Fe Institute (SFI) during 1993. These research efforts continue to focus on two general areas: the study of, and search for, underlying scientific principles governing complex adaptive systems, and the exploration of new theories of computation that incorporate natural mechanisms of adaptation (mutation, genetics, evolution).
Triangle Computer Science Distinguished Lecture Series
2018-01-30
Lectures in the series addressed computational perspectives on the great objects of scientific inquiry - the cell, the brain, the market - as well as on the models developed by scientists over the centuries for studying them, and on how, in principle, secure system operation can be achieved. Talks included "Massive-Scale Streaming Analytics" by David Bader, Georgia Institute of Technology (telecast).
Signal Detection Analysis of Computer Enhanced Group Decision Making Strategies
2007-11-01
Imagining tomorrow's university in an era of open science.
Howe, Adina; Howe, Michael; Kaleita, Amy L; Raman, D Raj
2017-01-01
As part of a recent workshop entitled "Imagining Tomorrow's University", we were asked to visualize the future of universities as research becomes increasingly data- and computation-driven, and to identify a set of principles characterizing pertinent opportunities and obstacles presented by this shift. In order to establish a holistic view, we take a multilevel approach, examining the impact of open science on individual scholars as well as on the university as a whole. At the university level, open science presents a double-edged sword: when well executed, open science can accelerate the rate of scientific inquiry across the institution and beyond; however, haphazard or half-hearted efforts are likely to squander valuable resources, diminish university productivity and prestige, and potentially do more harm than good. We present our perspective on the role of open science at the university.
The Principles for Successful Scientific Data Management Revisited
NASA Astrophysics Data System (ADS)
Walker, R. J.; King, T. A.; Joy, S. P.
2005-12-01
It has been 23 years since the National Research Council's Committee on Data Management and Computation (CODMAC) published its famous list of principles for successful scientific data management that have provided the framework for modern space science data management. CODMAC outlined seven principles: 1. Scientific Involvement in all aspects of space science missions. 2. Scientific Oversight of all scientific data-management activities. 3. Data Availability - Validated data should be made available to the scientific community in a timely manner. They should include appropriate ancillary data, and complete documentation. 4. Facilities - A proper balance between cost and scientific productivity should be maintained. 5. Software - Transportable well documented software should be available to process and analyze the data. 6. Scientific Data Storage - The data should be preserved in retrievable form. 7. Data System Funding - Adequate data funding should be made available at the outset of missions and protected from overruns. In this paper we will review the lessons learned in trying to apply these principles to space derived data. The Planetary Data System created the concept of data curation to carry out the CODMAC principles. Data curators are scientists and technologists who work directly with the mission scientists to create data products. The efficient application of the CODMAC principles requires that data curators and the mission team start early in a mission to plan for data access and archiving. To build the data products the planetary discipline adopted data access and documentation standards and has adhered to them. The data curators and mission team work together to produce data products and make them available. However even with early planning and agreement on standards the needs of the science community frequently far exceed the available resources. This is especially true for smaller principal investigator run missions. 
We will argue that one way to make data systems for small missions more effective is for the data curators to provide software tools to help develop the mission data system.
Single-Cell Genomics: Approaches and Utility in Immunology.
Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A
2017-02-01
Single-cell genomics offers powerful tools for studying immune cells, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population level. Advances in computer science and single-cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single-cell RNA-sequencing data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. Copyright © 2016 Elsevier Ltd. All rights reserved.
Fundamentals and Recent Developments in Approximate Bayesian Computation
Lintusaari, Jarno; Gutmann, Michael U.; Dutta, Ritabrata; Kaski, Samuel; Corander, Jukka
2017-01-01
Bayesian inference plays an important role in phylogenetics, evolutionary biology, and in many other branches of science. It provides a principled framework for dealing with uncertainty and quantifying how it changes in the light of new evidence. For many complex models and inference problems, however, only approximate quantitative answers are obtainable. Approximate Bayesian computation (ABC) refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible. We explain here the fundamentals of ABC, review the classical algorithms, and highlight recent developments. [ABC; approximate Bayesian computation; Bayesian inference; likelihood-free inference; phylogenetics; simulator-based models; stochastic simulation models; tree-based models.] PMID:28175922
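The defining requirement of ABC noted in the abstract above - that one only needs to be able to sample from the model - can be sketched with a minimal rejection-ABC example. This is an illustrative toy (Gaussian model with unknown mean; the prior range, tolerance, and sample sizes are assumed values, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# "Observed" data: 100 draws from a Gaussian whose mean (true value 2.0)
# we pretend not to know.
observed = rng.normal(2.0, 1.0, size=100)

def simulate(theta):
    """Simulator: all ABC requires is the ability to sample from the model."""
    return rng.normal(theta, 1.0, size=100)

def summary(x):
    """Summary statistic: here, simply the sample mean."""
    return x.mean()

def rejection_abc(observed, n_draws=20000, epsilon=0.1):
    """Keep prior draws whose simulated summary is within epsilon of the data's."""
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-5.0, 5.0)  # candidate drawn from a flat prior
        if abs(summary(simulate(theta)) - s_obs) < epsilon:
            accepted.append(theta)
    return np.array(accepted)

posterior = rejection_abc(observed)
print(posterior.mean())  # approximate posterior mean, close to the true mean 2.0
```

The accepted draws approximate the posterior without ever evaluating a likelihood, which is what makes ABC usable for simulator-based models where the likelihood is intractable.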
Cellular intelligence: Microphenomenology and the realities of being.
Ford, Brian J
2017-12-01
Traditions of Eastern thought conceptualised life in a holistic sense, emphasising the processes of maintaining health and conquering sickness as manifestations of an essentially spiritual principle that was of overriding importance in the conduct of living. Western science, which drove the overriding and partial eclipse of Eastern traditions, became founded on a reductionist quest for ultimate realities which, in the modern scientific world, has embraced the notion that every living process can be successfully modelled by a digital computer system. It is argued here that the essential processes of cognition, response and decision-making inherent in living cells transcend conventional modelling, and microscopic studies of organisms like the shell-building amoebae and the rhodophyte alga Antithamnion reveal a level of cellular intelligence that is unrecognized by science and is not amenable to computer analysis. Copyright © 2017. Published by Elsevier Ltd.
P3: a practice focused learning environment
NASA Astrophysics Data System (ADS)
Irving, Paul W.; Obsniuk, Michael J.; Caballero, Marcos D.
2017-09-01
There has been an increased focus on the integration of practices into physics curricula, with a particular emphasis on integrating computation into the undergraduate curriculum of scientists and engineers. In this paper, we present a university-level, introductory physics course for science and engineering majors at Michigan State University called P3 (projects and practices in physics) that is centred around providing introductory physics students with the opportunity to appropriate various science and engineering practices. The P3 design integrates computation with analytical problem solving and is built upon a curriculum foundation of problem-based learning, the principles of constructive alignment and the theoretical framework of community of practice. The design includes an innovative approach to computational physics instruction, instructional scaffolds, and a unique approach to assessment that enables instructors to guide students in the development of the practices of a physicist. We present the very positive student related outcomes of the design gathered via attitudinal and conceptual inventories and research interviews of students’ reflecting on their experiences in the P3 classroom.
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
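As a concrete instance of the hidden Markov models mentioned in the abstract above, here is a minimal sketch of the forward algorithm, which computes the likelihood of an observation sequence. The two-state transition and emission probabilities are assumed toy values for illustration:

```python
import numpy as np

# Toy 2-state HMM with binary observations.
A = np.array([[0.7, 0.3],    # transition probabilities P(next state | current)
              [0.4, 0.6]])
B = np.array([[0.1, 0.9],    # emission probabilities P(observation | state)
              [0.6, 0.4]])
pi = np.array([0.5, 0.5])    # initial state distribution

def forward(obs):
    """Forward algorithm: P(observation sequence) under the HMM,
    computed by propagating the joint probabilities alpha over time."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

print(round(forward([0, 1, 0]), 6))  # → 0.066075
```

Summing `forward` over all possible sequences of a fixed length returns 1, a handy sanity check that the recursion is implemented correctly.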
1992-08-10
[Fragmented excerpt from a distributed-computing report: an adversary "plays" the role of the scheduler, which at the beginning of the execution delays all messages sent along a routing edge (Si, Tj) of a graph G; during an interval [t, t + x], for t > 2n, a specific route of length at most ... is considered, and the choice of constants in the definition is remarked upon. Published in an Annual Symposium on Foundations of Computer Science (IEEE).]
CDAC Student Report: Summary of LLNL Internship
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herriman, Jane E.
Multiple objectives motivated me to apply for an internship at LLNL: I wanted to experience the work environment at a national lab, to learn about research and job opportunities at LLNL in particular, and to gain greater experience with code development, particularly within the realm of high performance computing (HPC). This summer I was selected to participate in LLNL's Computational Chemistry and Material Science Summer Institute (CCMS). CCMS is a 10-week program hosted by the Quantum Simulations group leader, Dr. Eric Schwegler. CCMS connects graduate students to mentors at LLNL involved in similar research and provides weekly seminars on a broad array of topics from within chemistry and materials science. Dr. Xavier Andrade and Dr. Erik Draeger served as my co-mentors over the summer, and Dr. Andrade continues to mentor me now that CCMS has concluded. Dr. Andrade is a member of the Quantum Simulations group within the Physical and Life Sciences directorate at LLNL, and Dr. Draeger leads the HPC group within the Center for Applied Scientific Computing (CASC). The two have worked together to develop Qb@ll, an open-source first-principles molecular dynamics code that was the platform for my summer research project.
Principles of Food Science Class Sheds Light on Chemistry
ERIC Educational Resources Information Center
Ward, Janet
2004-01-01
Many students are curious about the steps in food preparation. As a result of such experiences, the author of this article began to incorporate science demonstrations into food preparation classes. She conducted research, developed resources, and piloted the "Principles of Food Science" class over the next 6 years. "Principles of Food Science"…
An Ada Object Oriented Missile Flight Simulation
1991-09-01
This thesis uses the Ada programming language in the design and development of an air-to-air missile flight simulation with object-oriented techniques and sound software engineering principles. The simulation is designed to be more understandable, modifiable, and efficient.
Procedural Quantum Programming
NASA Astrophysics Data System (ADS)
Ömer, Bernhard
2002-09-01
While classical computing science has developed a variety of methods and programming languages around the concept of the universal computer, the typical description of quantum algorithms still uses a purely mathematical, non-constructive formalism which makes no difference between a hydrogen atom and a quantum computer. This paper investigates, how the concept of procedural programming languages, the most widely used classical formalism for describing and implementing algorithms, can be adopted to the field of quantum computing, and how non-classical features like the reversibility of unitary transformations, the non-observability of quantum states or the lack of copy and erase operations can be reflected semantically. It introduces the key concepts of procedural quantum programming (hybrid target architecture, operator hierarchy, quantum data types, memory management, etc.) and presents the experimental language QCL, which implements these principles.
NASA Astrophysics Data System (ADS)
Hassan, Hesham Galal
This thesis explores the proper principles and rules for creating excellent infographics that communicate information successfully and effectively. Not only does this thesis examine the creation of infographics, it also tries to answer which format, static or animated, is the most effective when used as a teaching-aid framework for complex science subjects, and whether compelling infographics in the preferred format facilitate the learning experience. The methodology includes the creation of an infographic in two formats (static and animated) on a fairly complex science subject (Phases of the Moon); the infographics were then tested for their efficacy as a whole, and the two formats were compared in terms of information comprehension and retention. My hypothesis predicts that an infographic in the animated format would be more effective in communicating a complex science subject (Phases of the Moon), specifically when using 3D computer animation to visualize the topic. This would also help different types of learners to easily comprehend science subjects. Most of the animated infographics produced nowadays are created for marketing and business purposes and do not implement the analytical design principles required for creating excellent information design. I believe that science learners are still in need of more variety in their methods of learning information, and that infographics can be of great assistance. The results of this thesis study suggest that properly designed infographics would be of great help in teaching complex science subjects that involve spatial and temporal data. This could facilitate learning science subjects and consequently increase the interest of young learners in STEM.
Morris, Alan H
2018-02-01
Our education system seems to fail to enable clinicians to broadly understand core physiological principles. The emphasis on reductionist science, including "omics" branches of research, has likely contributed to this decrease in understanding. Consequently, clinicians cannot be expected to consistently make clinical decisions linked to best physiological evidence. This is a large-scale problem with multiple determinants, within an even larger clinical decision problem: the failure of clinicians to consistently link their decisions to best evidence. Clinicians, like all human decision-makers, suffer from significant cognitive limitations. Detailed context-sensitive computer protocols can generate personalized medicine instructions that are well matched to individual patient needs over time and can partially resolve this problem.
The neural circuits for arithmetic principles.
Liu, Jie; Zhang, Han; Chen, Chuansheng; Chen, Hui; Cui, Jiaxin; Zhou, Xinlin
2017-02-15
Arithmetic principles are the regularities underlying arithmetic computation. Little is known about how the brain supports the processing of arithmetic principles. The current fMRI study examined neural activation and functional connectivity during the processing of verbalized arithmetic principles, as compared to numerical computation and general language processing. As expected, arithmetic principles elicited stronger activation in bilateral horizontal intraparietal sulcus and right supramarginal gyrus than did language processing, and stronger activation in left middle temporal lobe and left orbital part of inferior frontal gyrus than did computation. In contrast, computation elicited greater activation in bilateral horizontal intraparietal sulcus (extending to posterior superior parietal lobule) than did either arithmetic principles or language processing. Functional connectivity analysis with the psychophysiological interaction approach (PPI) showed that left temporal-parietal (MTG-HIPS) connectivity was stronger during the processing of arithmetic principle and language than during computation, whereas parietal-occipital connectivities were stronger during computation than during the processing of arithmetic principles and language. Additionally, the left fronto-parietal (orbital IFG-HIPS) connectivity was stronger during the processing of arithmetic principles than during computation. The results suggest that verbalized arithmetic principles engage a neural network that overlaps but is distinct from the networks for computation and language processing. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Ugras, Mustafa; Asiltürk, Erol
2018-01-01
The present study aimed to determine the perceptions of science teachers on the implementation of the seven principles for good practice in education by Chickering and Gamson in their courses. Seven principles for good science education were used as a data collection tool in the survey. "The seven principles for good practice in science…
An Assessment of a Science Discipline Archive Against ISO 16363
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Downs, R. R.
2016-12-01
The Planetary Data System (PDS) is a federation of science discipline nodes formed in response to the findings of the Committee on Data Management and Computing (CODMAC 1986) that a "wealth of science data would ultimately cease to be useful and probably lost if a process was not developed to ensure that the science data were properly archived." Starting operations in 1990, the stated mission of the PDS is to "facilitate achievement of NASA's planetary science goals by efficiently collecting, archiving, and making accessible digital data and documentation produced by or relevant to NASA's planetary missions, research programs, and data analysis programs." In 2008 the PDS initiated a transition to a more modern system based on key principles found in the Open Archival Information System (OAIS) Reference Model (ISO 14721), a set of functional requirements provided by the designated community, and about twenty years of lessons learned. With science digital data now being archived under the new PDS4, the PDS is a good use case to be assessed as a trusted repository against ISO 16363, a recommended practice for assessing the trustworthiness of digital repositories. This presentation will summarize the OAIS principles adopted for PDS4 and the findings of a desk assessment of the PDS against ISO 16363. Also presented will be specific items of evidence, for example the PDS mission statement above, and how they impact the level of certainty that the ISO 16363 metrics are being met.
NASA Astrophysics Data System (ADS)
Thapa, Ranjit; Kawazoe, Yoshiyuki
2017-10-01
The main objective of this meeting was to provide a platform for theoreticians and experimentalists working in the area of materials to come together and carry out cutting-edge research in the field of energy by showcasing their ideas and innovations. The theme meeting was successful in attracting young researchers from both fields who share common research interests. The participation of more than 250 researchers in ACCMS-TM 2016 successfully paved the way towards the exchange of mutual research insights and the establishment of promising research collaborations. To encourage the young participants' research efforts, three best posters in the theoretical category, each named "KAWAZOE PRIZE", and two best posters for experimental contributions, named "ACCMS-TM 2016 POSTER AWARD", were selected. A new award, the "ACCMS MID-CAREER AWARD", for outstanding scientific contribution in the area of Computational Materials Science was constituted.
Designing an Advanced Instructional Design Advisor: Principles of Instructional Design. Volume 2
1991-05-01
...ones contained in this paper would comprise a substantial part of the knowledge base for the AIDA... the classroom (e.g., computer simulation models can be used to enhance CBI). The Advanced Instructional Design Advisor is a project aimed at providing... model shares with its variations. Tennyson then identifies research-based prescriptions from the cognitive sciences which should become part of ISD in...
What Constitutes Science and Scientific Evidence: Roles of Null Hypothesis Testing
ERIC Educational Resources Information Center
Chang, Mark
2017-01-01
We briefly discuss the philosophical basis of science, causality, and scientific evidence, by introducing the hidden but most fundamental principle of science: the similarity principle. The principle's use in scientific discovery is illustrated with Simpson's paradox and other examples. In discussing the value of null hypothesis statistical…
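The Simpson's-paradox illustration mentioned in this abstract can be sketched numerically. The subgroup counts below are the classic kidney-stone-treatment example, chosen here for illustration; they are not data from the article:

```python
# Simpson's paradox: a treatment can look better within every subgroup
# yet worse in the pooled data, because subgroup sizes differ.

def rate(success, total):
    return success / total

# Treatment A vs. B, split by case severity (mild/severe):
a_mild, b_mild = (81, 87), (234, 270)
a_sev,  b_sev  = (192, 263), (55, 80)

# Within each subgroup, A has the higher success rate...
assert rate(*a_mild) > rate(*b_mild)
assert rate(*a_sev) > rate(*b_sev)

# ...but pooled over both subgroups, B looks better: the paradox.
a_all = (a_mild[0] + a_sev[0], a_mild[1] + a_sev[1])
b_all = (b_mild[0] + b_sev[0], b_mild[1] + b_sev[1])
assert rate(*a_all) < rate(*b_all)
```

The reversal arises because A handles most of the severe (hard) cases, which drags down its pooled rate, which is the kind of causal subtlety the similarity principle discussion is concerned with.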
Science Teachers' Perceptions of Implementing Constructivist Principles into Instruction
ERIC Educational Resources Information Center
Saunders, Saundra M.
2009-01-01
The purpose of this research study was to examine the differences in beliefs and perceptions about the implementation of constructivist principles into instruction, in support of the National Science Education Standards, for science teachers who adopt constructivist principles and those who do not. The study also examined correlations between a…
A Behavioral Science Assessment of Selected Principles of Consumer Education.
ERIC Educational Resources Information Center
Friedman, Monroe; Rees, Jennifer
1988-01-01
This study examined the behavioral science support for a set of 20 food-buying principles. Three types of principles are found; they differ in the consumer behaviors they recommend and in the nature and strength of support they receive in the behavioral science literature. (Author/JOW)
Algorithms in nature: the convergence of systems biology and computational thinking
Navlakha, Saket; Bar-Joseph, Ziv
2011-01-01
Computer science and biology have enjoyed a long and fruitful relationship for decades. Biologists rely on computational methods to analyze and integrate large data sets, while several computational methods were inspired by the high-level design principles of biological systems. Recently, these two directions have been converging. In this review, we argue that thinking computationally about biological processes may lead to more accurate models, which in turn can be used to improve the design of algorithms. We discuss the similar mechanisms and requirements shared by computational and biological processes and then present several recent studies that apply this joint analysis strategy to problems related to coordination, network analysis, and tracking and vision. We also discuss additional biological processes that can be studied in a similar manner and link them to potential computational problems. With the rapid accumulation of data detailing the inner workings of biological systems, we expect this direction of coupling biological and computational studies to greatly expand in the future. PMID:22068329
Adams, Peter; Goos, Merrilyn
2010-01-01
Modern biological sciences require practitioners to have increasing levels of knowledge, competence, and skills in mathematics and programming. A recent review of the science curriculum at the University of Queensland, a large, research-intensive institution in Australia, resulted in the development of a more quantitatively rigorous undergraduate program. Inspired by the National Research Council's BIO2010 report, a new interdisciplinary first-year course (SCIE1000) was created, incorporating mathematics and computer programming in the context of modern science. In this study, the perceptions of biological science students enrolled in SCIE1000 in 2008 and 2009 are measured. Analysis indicates that, as a result of taking SCIE1000, biological science students gained a positive appreciation of the importance of mathematics in their discipline. However, the data revealed that SCIE1000 did not contribute positively to gains in appreciation for computing and only slightly influenced students' motivation to enroll in upper-level quantitative-based courses. Further comparisons between 2008 and 2009 demonstrated the positive effect of using genuine, real-world contexts to enhance student perceptions toward the relevance of mathematics. The results support the recommendation from BIO2010 that mathematics should be introduced to biology students in first-year courses using real-world examples, while challenging the benefits of introducing programming in first-year courses. PMID:20810961
Using computers to overcome math-phobia in an introductory course in musical acoustics
NASA Astrophysics Data System (ADS)
Piacsek, Andrew A.
2002-11-01
In recent years, the desktop computer has acquired the signal processing and visualization capabilities once obtained only with expensive specialized equipment. With the appropriate A/D card and software, a PC can behave like an oscilloscope, a real-time signal analyzer, a function generator, and a synthesizer, with both audio and visual outputs. In addition, the computer can be used to visualize specific wave behavior, such as superposition and standing waves, refraction, dispersion, etc. These capabilities make the computer an invaluable tool to teach basic acoustic principles to students with very poor math skills. In this paper I describe my approach to teaching the introductory-level Physics of Musical Sound at Central Washington University, in which very few science students enroll. Emphasis is placed on how visualization with computers can help students appreciate and apply quantitative methods for analyzing sound.
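A minimal, stdlib-only sketch of the kind of superposition demonstration described above; the frequencies and sample rate are illustrative choices, not values from the course:

```python
import math

def tone(freq, t, amp=1.0):
    """Sample a pure tone of frequency freq (Hz) at time t (s)."""
    return amp * math.sin(2 * math.pi * freq * t)

def superpose(freqs, t):
    """Superposition principle: the combined wave is the sum of components."""
    return sum(tone(f, t) for f in freqs)

# Two close frequencies (220 Hz and 224 Hz) produce beats: the envelope
# of the sum rises and falls 4 times per second.
sample_rate = 8000
samples = [superpose([220, 224], n / sample_rate) for n in range(sample_rate)]

# The instantaneous amplitude never exceeds the sum of the component amplitudes.
assert max(abs(s) for s in samples) <= 2.0
```

Feeding `samples` to a plotting library or a sound card is exactly the oscilloscope/synthesizer role the abstract describes for the PC.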
Students' explanations in complex learning of disciplinary programming
NASA Astrophysics Data System (ADS)
Vieira, Camilo
Computational Science and Engineering (CSE) has been called the third pillar of science and a set of important skills for solving the problems of a global society. Along with the theoretical and the experimental approaches, computation offers a third alternative for solving complex problems that require processing large amounts of data, or representing complex phenomena that are not easy to experiment with. Despite the relevance of CSE, current professionals and scientists are not well prepared to take advantage of this set of tools and methods. Computation is usually taught in isolation from the engineering disciplines, and therefore engineers do not know how to exploit CSE affordances. This dissertation introduces computational tools and methods contextualized within the Materials Science and Engineering curriculum. Considering that learning how to program is a complex task, the dissertation explores effective pedagogical practices that can support student disciplinary and computational learning. Two case studies are evaluated to identify the characteristics of effective worked examples in the context of CSE. Specifically, this dissertation explores students' explanations of these worked examples in two engineering courses with different levels of transparency: a programming course in materials science and engineering (glass box) and a thermodynamics course involving computational representations (black box). Results from this study suggest that students benefit in different ways from writing in-code comments. These benefits include but are not limited to: connecting individual lines of code to the overall problem, getting familiar with the syntax, learning effective algorithm design strategies, and connecting computation with their discipline. Students in the glass box context generate higher-quality explanations than students in the black box context. These explanations are related to students' prior experiences.
Specifically, students with low programming ability engage in a more thorough explanation process than students with high ability. This dissertation concludes by proposing an adaptation of the instructional principles of worked examples for the context of CSE education.
Information revolution in nursing and health care: educating for tomorrow's challenge.
Kooker, B M; Richardson, S S
1994-06-01
Current emphasis on the national electronic highway and a national health database for comparative health care reporting demonstrates society's increasing reliance on information technology. The efficient electronic processing and managing of data, information, and knowledge are critical for survival in tomorrow's health care organization. To take a leadership role in this information revolution, informatics nurse specialists must possess competencies that incorporate information science, computer science, and nursing science for successful information system development. In selecting an appropriate informatics educational program or to hire an individual capable of meeting this challenge, nurse administrators must look for the following technical knowledge and skill set: information management principles, system development life cycle, programming languages, file design and access, hardware and network architecture, project management skills, and leadership abilities.
Applications of large-scale density functional theory in biology
NASA Astrophysics Data System (ADS)
Cole, Daniel J.; Hine, Nicholas D. M.
2016-10-01
Density functional theory (DFT) has become a routine tool for the computation of electronic structure in the physics, materials, and chemistry fields. Yet the application of traditional DFT to problems in the biological sciences is hindered, to a large extent, by the unfavourable scaling of the computational effort with system size. Here, we review some of the major software and functionality advances that enable insightful electronic structure calculations to be performed on systems comprising many thousands of atoms. We describe some of the early applications of large-scale DFT to the computation of the electronic properties and structure of biomolecules, as well as to paradigmatic problems in enzymology, metalloproteins, photosynthesis and computer-aided drug design. With this review, we hope to demonstrate that first-principles modelling of biological structure-function relationships is approaching reality.
NASA Astrophysics Data System (ADS)
Gobithaasan, R. U.; Miura, Kenjiro T.; Hassan, Mohamad Nor
2014-07-01
Computer Aided Geometric Design (CAGD), which surpasses the underlying theories of Computer Aided Design (CAD) and Computer Graphics (CG), has been taught in a number of Malaysian universities under the umbrella of the Mathematical Sciences faculty/department. On the other hand, CAD/CG is taught under either the Engineering or the Computer Science faculty. Even though CAGD researchers/educators/students (denoted as contributors) have been enriching this field of study by means of article/journal publication, many fail to convert their ideas into constructive innovation due to the gap that occurs between CAGD contributors and practitioners (engineers/product designers/architects/artists). This paper addresses this issue by advocating a number of technologies that can be used to transform CAGD contributors into innovators, where immediate impact in terms of practical application can be experienced by CAD/CG practitioners. The underlying principle of solving this issue is twofold. The first is to expose CAGD contributors to ways of turning mathematical ideas into plug-ins, and the second is to impart relevant CAGD theories to CAD/CG practitioners. Both cases are discussed in detail, and the final section shows examples to illustrate the importance of turning mathematical knowledge into innovations.
NASA Astrophysics Data System (ADS)
Marzari, Nicola
The last 30 years have seen the steady and exhilarating development of powerful quantum-simulation engines for extended systems, dedicated to the solution of the Kohn-Sham equations of density-functional theory, often augmented by density-functional perturbation theory, many-body perturbation theory, time-dependent density-functional theory, dynamical mean-field theory, and quantum Monte Carlo. Their implementation on massively parallel architectures, now leveraging also GPUs and accelerators, has started a massive effort in the first-principles prediction of many, or of complex, materials properties, leading the way to the exascale through the combination of HPC (high-performance computing) and HTC (high-throughput computing). Challenges and opportunities abound: complementing hardware and software investments and design; developing the materials-informatics infrastructure needed to encode knowledge into complex protocols and workflows of calculations; managing and curating data; resisting the complacency of believing that we have already reached the predictive accuracy needed for materials design, or a robust level of verification of the different quantum engines. In this talk I will provide an overview of these challenges, with the ultimate prize being the computational understanding, prediction, and design of properties and performance for novel or complex materials and devices.
The Specificity Principle in Acculturation Science.
Bornstein, Marc H
2017-01-01
The specificity principle in acculturation science asserts that specific setting conditions of specific people at specific times moderate specific domains in acculturation by specific processes. Our understanding of acculturation depends critically on what is studied where, in whom, how, and when. This article defines, explains, and illustrates the specificity principle in acculturation science. Research hypotheses about acculturation can be more adequately tested, inconsistencies and discrepancies in the acculturation literature can be satisfactorily resolved, acculturation interventions can be tailored to be more successful, and acculturation policies can be brought to new levels of effectiveness if the specificity principle that governs acculturation science is more widely recognized.
Markowitz, Dina G; DuPré, Michael J
2007-01-01
The University of Rochester's Graduate Experience in Science Education (GESE) course familiarizes biomedical science graduate students interested in pursuing academic career tracks with a fundamental understanding of some of the theory, principles, and concepts of science education. This one-semester elective course provides graduate students with practical teaching and communication skills to help them better relate science content to, and increase their confidence in, their own teaching abilities. The 2-h weekly sessions include an introduction to cognitive hierarchies, learning styles, and multiple intelligences; modeling and coaching some practical aspects of science education pedagogy; lesson-planning skills; an introduction to instructional methods such as case studies and problem-based learning; and use of computer-based instructional technologies. It is hoped that the early development of knowledge and skills about teaching and learning will encourage graduate students to continue their growth as educators throughout their careers. This article summarizes the GESE course and presents evidence on the effectiveness of this course in providing graduate students with information about teaching and learning that they will use throughout their careers.
Basing Science Ethics on Respect for Human Dignity.
Aközer, Mehmet; Aközer, Emel
2016-12-01
A "no ethics" principle has long been prevalent in science and has demotivated deliberation on scientific ethics. This paper argues the following: (1) An understanding of a scientific "ethos" based on actual "value preferences" and "value repugnances" prevalent in the scientific community permits and demands critical accounts of the "no ethics" principle in science. (2) The roots of this principle may be traced to a repugnance of human dignity, which was instilled at a historical breaking point in the interrelation between science and ethics. This breaking point involved granting science the exclusive mandate to pass judgment on the life worth living. (3) By contrast, respect for human dignity, in its Kantian definition as "the absolute inner worth of being human," should be adopted as the basis to ground science ethics. (4) The pathway from this foundation to the articulation of an ethical duty specific to scientific practice, i.e., respect for objective truth, is charted by Karl Popper's discussion of the ethical principles that form the basis of science. This also permits an integrated account of the "external" and "internal" ethical problems in science. (5) Principles of the respect for human dignity and the respect for objective truth are also safeguards of epistemic integrity. Plain defiance of human dignity by genetic determinism has compromised integrity of claims to knowledge in behavioral genetics and other behavioral sciences. Disregard of the ethical principles that form the basis of science threatens epistemic integrity.
Inda, Márcia A; van Batenburg, Marinus F; Roos, Marco; Belloum, Adam S Z; Vasunin, Dmitry; Wibisono, Adianto; van Kampen, Antoine H C; Breit, Timo M
2008-08-08
Chromosome location is often used as a scaffold to organize genomic information in both the living cell and molecular biological research. Thus, ever-increasing amounts of data about genomic features are stored in public databases and can be readily visualized by genome browsers. To perform in silico experimentation conveniently with this genomics data, biologists need tools to process and compare datasets routinely and explore the obtained results interactively. The complexity of such experimentation requires these tools to be based on an e-Science approach, hence generic, modular, and reusable. A virtual laboratory environment with workflows, workflow management systems, and Grid computation are therefore essential. Here we apply an e-Science approach to develop SigWin-detector, a workflow-based tool that can detect significantly enriched windows of (genomic) features in a (DNA) sequence in a fast and reproducible way. For proof-of-principle, we utilize a biological use case to detect regions of increased and decreased gene expression (RIDGEs and anti-RIDGEs) in human transcriptome maps. We improved the original method for RIDGE detection by replacing the costly step of estimation by random sampling with a faster analytical formula for computing the distribution of the null hypothesis being tested and by developing a new algorithm for computing moving medians. SigWin-detector was developed using the WS-VLAM workflow management system and consists of several reusable modules that are linked together in a basic workflow. The configuration of this basic workflow can be adapted to satisfy the requirements of the specific in silico experiment. As we show with the results from analyses in the biological use case on RIDGEs, SigWin-detector is an efficient and reusable Grid-based tool for discovering windows enriched for features of a particular type in any sequence of values. 
Thus, SigWin-detector provides the proof-of-principle for the modular e-Science based concept of integrative bioinformatics experimentation.
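The moving-median step mentioned above can be illustrated with a simple baseline. This is a generic sorted-window implementation for exposition only, not the authors' new algorithm:

```python
import bisect

def moving_median(values, window):
    """Median over each sliding window of `values`.

    An O(n * window) baseline that keeps the current window sorted;
    it shows the quantity being computed, not an optimized method.
    """
    if window < 1 or window > len(values):
        raise ValueError("window must be in 1..len(values)")

    def median(w):
        m = len(w) // 2
        return w[m] if len(w) % 2 else (w[m - 1] + w[m]) / 2

    sorted_win = sorted(values[:window])
    out = [median(sorted_win)]
    for i in range(window, len(values)):
        # Slide: remove the oldest value, insert the newest, keep sorted.
        sorted_win.pop(bisect.bisect_left(sorted_win, values[i - window]))
        bisect.insort(sorted_win, values[i])
        out.append(median(sorted_win))
    return out

# e.g. moving_median([1, 5, 2, 8, 7], 3) -> [2, 5, 7]
```

Applied to a per-gene expression track, each output value summarizes one window, which is the smoothing underlying RIDGE detection.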
A Novel Multiscale QM-MD-SPH Computational Method for Heterogeneous Multicomponent Reactive Systems
2017-11-30
The first-principles study on the equation of state of HMX under high pressure. Science China Physics, Mechanics and Astronomy, 54(5):831-835, 2011. ... J. J. Monaghan, J. C. Lattanzio. A refined particle method for astrophysical problems. Astronomy and Astrophysics 149 (1985): 135-143. [148] J. J. Monaghan. Smoothed particle hydrodynamics. Annual Review of Astronomy and Astrophysics 30.1 (1992): 543-574. [149] J. P. Morris. A study of the...
Computer presentation of data in science.
NASA Astrophysics Data System (ADS)
Simmonds, D.; Reynolds, L.
Contents: How this book was created. Foreword. 1. Introduction. 2. Choosing your system and software. 3. Working methods. 4. Preparing manuscripts and camera-ready copy. 5. Principles of typography and layout. 6. Using type and space to show the structure of text. 7. Artwork creation and drawing tips. 8. Posters, slides and OHP transparencies. 9. Designing with colour. Glossaries 1 and 2. Appendix 1: Copyfitting. Appendix 2: Signatures and imposition. Appendix 3: Publishing and the law. Appendix 4: Working comfort.
The Beneficial Role of Random Strategies in Social and Financial Systems
NASA Astrophysics Data System (ADS)
Biondo, Alessio Emanuele; Pluchino, Alessandro; Rapisarda, Andrea
2013-05-01
In this paper we focus on the beneficial role of random strategies in social sciences by means of simple mathematical and computational models. We briefly review recent results obtained by two of us in previous contributions for the case of the Peter principle and the efficiency of a Parliament. Then, we develop a new application of random strategies to the case of financial trading and discuss in detail our findings about forecasts of markets dynamics.
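A toy version of such a trading comparison, a hypothetical model and not the authors' simulation, might pit a deterministic momentum rule against a coin-flip trader on a simulated price series:

```python
import random

def simulate(strategy, prices):
    """Final profit from trading a price series with `strategy`.

    strategy(history) -> +1 (long) or -1 (short) for the next step.
    Illustrative toy model only.
    """
    profit = 0.0
    for t in range(1, len(prices)):
        position = strategy(prices[:t])
        profit += position * (prices[t] - prices[t - 1])
    return profit

def momentum(history):
    # Deterministic rule: bet that the last move continues.
    if len(history) < 2:
        return 1
    return 1 if history[-1] >= history[-2] else -1

def coin_flip(history):
    # Random strategy: go long or short with equal probability.
    return random.choice([1, -1])

random.seed(0)
# A random-walk market: each step moves +1 or -1 with equal probability.
prices = [100.0]
for _ in range(1000):
    prices.append(prices[-1] + random.choice([1.0, -1.0]))
```

On a pure random walk neither rule has an edge in expectation, which is the kind of benchmark against which the beneficial role of random strategies is usually argued.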
First-Principles Equation of State and Shock Compression of Warm Dense Aluminum and Hydrocarbons
NASA Astrophysics Data System (ADS)
Driver, Kevin; Soubiran, Francois; Zhang, Shuai; Militzer, Burkhard
2017-10-01
Theoretical studies of warm dense plasmas are a key component of progress in fusion science, defense science, and astrophysics programs. Path integral Monte Carlo (PIMC) and density functional theory molecular dynamics (DFT-MD), two state-of-the-art, first-principles, electronic-structure simulation methods, provide a consistent description of plasmas over a wide range of density and temperature conditions. Here, we combine high-temperature PIMC data with lower-temperature DFT-MD data to compute coherent equations of state (EOS) for aluminum and hydrocarbon plasmas. Subsequently, we derive shock Hugoniot curves from these EOSs and extract the temperature-density evolution of plasma structure and ionization behavior from pair-correlation function analyses. Since PIMC and DFT-MD accurately treat effects of atomic shell structure, we find compression maxima along Hugoniot curves attributed to K-shell and L-shell ionization, which provide a benchmark for widely-used EOS tables, such as SESAME and LEOS, and more efficient models. LLNL-ABS-734424. Funding provided by the DOE (DE-SC0010517) and in part under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Computational resources provided by Blue Waters (NSF ACI1640776) and NERSC. K. Driver's and S. Zhang's current address is Lawrence Livermore Natl. Lab, Livermore, CA, 94550, USA.
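For reference, the Hugoniot curves mentioned here are obtained by combining an equation of state with the standard Rankine-Hugoniot jump condition (textbook form, not a formula quoted from the abstract):

```latex
% Rankine-Hugoniot energy condition: shocked states (E, P, V) on the
% Hugoniot satisfy, relative to the initial state (E_0, P_0, V_0),
\[
  E - E_0 = \tfrac{1}{2}\,(P + P_0)\,(V_0 - V),
\]
% so a first-principles EOS E(P, V) fixes the locus of shocked states,
% and shell-ionization features in E produce the compression maxima.
```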
NASA Astrophysics Data System (ADS)
Annetta, Leonard A.; Frazier, Wendy M.; Folta, Elizabeth; Holmes, Shawn; Lamb, Richard; Cheng, Meng-Tzu
2013-02-01
Design-based research principles guided the study of 51 secondary-science teachers in the second year of a 3-year professional development project. The project entailed the creation of student-centered, inquiry-based science video games. A professional development model appropriate for infusing innovative technologies into standards-based curricula was employed to determine how science teachers' attitudes and efficacy were impacted while designing science-based video games. The study's mixed-method design ascertained teacher efficacy on five factors (general computer use, science learning, inquiry teaching and learning, synchronous chat/text, and playing video games) related to technology and gaming, using a web-based survey. Qualitative data in the form of online blog posts were gathered during the project to assist in the triangulation and assessment of teacher efficacy. Data analyses consisted of an analysis of variance and serial coding of teacher reflective responses. Results indicated that participants who used computers daily had higher efficacy while using inquiry-based teaching methods and science teaching and learning. Additional emergent findings revealed possible motivating factors for efficacy. This professional development project focused on inquiry as a pedagogical strategy, standards-based science learning as a means to develop content knowledge, and creating video games as technological knowledge. The project was consistent with the Technological Pedagogical Content Knowledge (TPCK) framework, where the overlapping circles of the three components indicate the development of an integrated understanding of the suggested relationships. Findings provide suggestions for the development of standards-based science education software, its integration into the curriculum, and strategies for implementing technology into teaching practices.
Martínez-Pernía, David; González-Castán, Óscar; Huepe, David
2017-02-01
The development of rehabilitation has traditionally focused on measurements of motor disorders and measurements of the improvements produced during the therapeutic process; however, physical rehabilitation sciences have not focused on understanding the philosophical and scientific principles in clinical intervention and how they are interrelated. The main aim of this paper is to explain the foundation stones of the disciplines of physical therapy, occupational therapy, and speech/language therapy in recovery from motor disorder. To reach our goals, the mechanistic view and how it is integrated into physical rehabilitation will first be explained. Next, a classification into mechanistic therapy based on an old version (automaton model) and a technological version (cyborg model) will be shown. Then, it will be shown how physical rehabilitation sciences found a new perspective in motor recovery, which is based on functionalism, during the cognitive revolution in the 1960s. Through this cognitive theory, physical rehabilitation incorporated into motor recovery of those therapeutic strategies that solicit the activation of the brain and/or symbolic processing; aspects that were not taken into account in mechanistic therapy. In addition, a classification into functionalist rehabilitation based on a computational therapy and a brain therapy will be shown. At the end of the article, the methodological principles in physical rehabilitation sciences will be explained. It will allow us to go deeper into the differences and similarities between therapeutic mechanism and therapeutic functionalism.
Personality Theories Facilitate Integrating the Five Principles and Deducing Hypotheses for Testing
ERIC Educational Resources Information Center
Maddi, Salvatore R.
2007-01-01
Comments on the original article "A New Big Five: Fundamental Principles for an Integrative Science of Personality," by Dan P. McAdams and Jennifer L. Pals (see record 2006-03947-002). In presenting their view of personality science, McAdams and Pals (April 2006) elaborated the importance of five principles for building an integrated science of…
Tracing organizing principles: learning from the history of systems biology.
Green, Sara; Wolkenhauer, Olaf
2013-01-01
With the emergence of systems biology, the identification of organizing principles is being highlighted as a key research aim. Researchers attempt to "reverse engineer" the functional organization of biological systems using methodologies from mathematics, engineering, and computer science, while taking advantage of data produced by new experimental techniques. While systems biology is a relatively new approach, the quest for general principles of biological organization dates back to the systems-theoretic approaches of the early and mid-twentieth century. The aim of this paper is to draw on this historical background in order to increase the understanding of the motivation behind the search for general principles and to clarify different epistemic aims within systems biology. We pinpoint key aspects of earlier approaches that also underlie the current practice. These are (i) the focus on relational and system-level properties, (ii) the inherent critique of reductionism and the fragmentation of knowledge resulting from overspecialization, and (iii) the insight that the ideal of formulating abstract organizing principles is complementary to, rather than conflicting with, the aim of formulating detailed explanations of biological mechanisms. We argue that looking back not only helps us understand the current practice but also points to possible future directions for systems biology.
Progress towards an effective model for FeSe from high-accuracy first-principles quantum Monte Carlo
NASA Astrophysics Data System (ADS)
Busemeyer, Brian; Wagner, Lucas K.
While the origin of superconductivity in the iron-based materials is still controversial, the proximity of the superconductivity to magnetic order is suggestive that magnetism may be important. Our previous work has suggested that first-principles Diffusion Monte Carlo (FN-DMC) can capture magnetic properties of iron-based superconductors that density functional theory (DFT) misses, but which are consistent with experiment. We report on the progress of efforts to find simple effective models consistent with the FN-DMC description of the low-lying Hilbert space of the iron-based superconductor, FeSe. We utilize a procedure outlined by Changlani et al.[1], which both produces parameter values and indications of whether the model is a good description of the first-principles Hamiltonian. Using this procedure, we evaluate several models of the magnetic part of the Hilbert space found in the literature, as well as the Hubbard model, and a spin-fermion model. We discuss which interaction parameters are important for this material, and how the material-specific properties give rise to these interactions. U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, Scientific Discovery through Advanced Computing (SciDAC) program under Award No. FG02-12ER46875, as well as the NSF Graduate Research Fellowship Program.
NASA Astrophysics Data System (ADS)
Chang, S. S. L.
State of the art technology in circuits, fields, and electronics is discussed. The principles and applications of these technologies to industry, digital processing, microwave semiconductors, and computer-aided design are explained. Important concepts and methodologies in mathematics and physics are reviewed, and basic engineering sciences and associated design methods are dealt with, including: circuit theory and the design of magnetic circuits and active filter synthesis; digital signal processing, including FIR and IIR digital filter design; transmission lines, electromagnetic wave propagation and surface acoustic wave devices. Also considered are: electronics technologies, including power electronics, microwave semiconductors, GaAs devices, and magnetic bubble memories; digital circuits and logic design.
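The FIR filtering surveyed above lends itself to a compact illustration. Below is a minimal sketch (not from the text; the coefficients and signal are invented) of a finite impulse response filter, where each output sample is a weighted sum of the current and past input samples:

```python
# Minimal FIR filter sketch (illustrative of the DSP topics surveyed above).
# y[n] = sum_k coeffs[k] * signal[n - k], with samples before n = 0 taken as zero.
def fir_filter(coeffs, signal):
    out = []
    for n in range(len(signal)):
        acc = 0.0
        for k, c in enumerate(coeffs):
            if n - k >= 0:
                acc += c * signal[n - k]
        out.append(acc)
    return out

# A 3-tap moving average smooths a step input into a gradual ramp:
y = fir_filter([1/3, 1/3, 1/3], [0, 0, 3, 3, 3])
```

The moving average is the simplest FIR design; real filter design (windowing, equiripple methods) chooses the taps to shape the frequency response.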
1991-09-01
level are, by necessity, designed to be accomplished by one or a few students in the course of a single academic term. Moreover, the software is seldom...that are covered in Computer Science curricula today, but with more of an engineering structure added. A stronger engineering design component is...ing, and sound software design principles found throughout Ada, and they are unambiguously specified. These are not features which were grafted onto a
1994-01-01
0 The Mission of AGARD 0 According to its Charter, the mission of AGARD is to bring together the leading personalities of the NATO nations in the...advances in the aerospace sciences relevant to strengthening the common defence posture; • - Improving the co-operation among member nations in aerospace...for the physical principles. To construct the relevant equations for fluid gas consisting of pseudo particles, 10 is the internal energy due motion it
Biomolecular computers with multiple restriction enzymes.
Sakowski, Sebastian; Krasinski, Tadeusz; Waldmajer, Jacek; Sarnik, Joanna; Blasiak, Janusz; Poplawski, Tomasz
2017-01-01
The development of conventional, silicon-based computers has several limitations, including some related to the Heisenberg uncertainty principle and the von Neumann "bottleneck". Biomolecular computers based on DNA and proteins are largely free of these disadvantages and, along with quantum computers, are reasonable alternatives to their conventional counterparts in some applications. The idea of a DNA computer proposed by Ehud Shapiro's group at the Weizmann Institute of Science was developed using one restriction enzyme as hardware and DNA fragments (the transition molecules) as software and input/output signals. This computer represented a two-state two-symbol finite automaton that was subsequently extended by using two restriction enzymes. In this paper, we propose the idea of a multistate biomolecular computer with multiple commercially available restriction enzymes as hardware. Additionally, an algorithmic method for the construction of transition molecules in the DNA computer based on the use of multiple restriction enzymes is presented. We use this method to construct multistate, biomolecular, nondeterministic finite automata with four commercially available restriction enzymes as hardware. We also describe an experimental application of this theoretical model to a biomolecular finite automaton made of four endonucleases.
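As a software analogue of the biomolecular hardware described above, the abstract machine itself is easy to sketch. A minimal nondeterministic finite automaton simulator follows (the state names, alphabet, and transition table are illustrative, not taken from the paper):

```python
# Minimal nondeterministic finite automaton (NFA) sketch, the abstract model
# realized biochemically in the paper. States and symbols here are invented.
def nfa_accepts(transitions, start, accepting, word):
    """True if any nondeterministic path over `word` ends in an accepting
    state. `transitions` maps (state, symbol) -> set of successor states."""
    current = {start}
    for symbol in word:
        nxt = set()
        for state in current:
            nxt |= transitions.get((state, symbol), set())
        current = nxt
    return bool(current & accepting)

# Two-state, two-symbol example: accept exactly the words ending in 'b'.
transitions = {
    ('S0', 'a'): {'S0'},
    ('S0', 'b'): {'S0', 'S1'},
}
nfa_accepts(transitions, 'S0', {'S1'}, 'aab')  # True
nfa_accepts(transitions, 'S0', {'S1'}, 'aba')  # False
```

In the biomolecular realization, each `(state, symbol) -> states` entry corresponds to a transition molecule, and the restriction enzymes enact the state changes on the DNA input.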
Ayachit, Utkarsh; Bauer, Andrew; Duque, Earl P. N.; ...
2016-11-01
A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. Our paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.
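The in situ pattern can be caricatured in a few lines: the solver hands its in-memory state to analysis callbacks at each step, so only small reduced results, rather than full snapshots, ever reach storage. A toy sketch (the trivial "solver" and callback names are invented for illustration):

```python
# Sketch of the in situ processing pattern: analysis runs inside the
# simulation loop on in-memory data, instead of writing every step to disk.
def run_simulation(steps, analyzers):
    state = [0.0] * 8                      # stand-in for a large field array
    for t in range(steps):
        state = [x + 1.0 for x in state]   # stand-in for a real solver update
        for analyze in analyzers:          # in situ: data never leaves memory
            analyze(t, state)

# Register a reduction that keeps one tiny record per step:
summary = []
run_simulation(3, [lambda t, s: summary.append((t, max(s)))])
# summary now holds (step, max value) pairs, a fraction of the raw data size.
```

A post hoc workflow would instead write `state` to disk every step and analyze it later; the callback design is what closes the compute/I/O gap the abstract describes.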
A symbiotic approach to fluid equations and non-linear flux-driven simulations of plasma dynamics
NASA Astrophysics Data System (ADS)
Halpern, Federico
2017-10-01
The fluid framework is ubiquitous in studies of plasma transport and stability. Typical forms of the fluid equations are motivated by analytical work dating back several decades, before computer simulations were indispensable, and can therefore be suboptimal for numerical computation. We demonstrate a new first-principles approach to obtaining manifestly consistent, skew-symmetric fluid models, ensuring internal consistency and conservation properties even in discrete form. Mass, kinetic, and internal energy become quadratic (and always positive) invariants of the system. The model lends itself to a robust, straightforward discretization scheme with inherent non-linear stability. A simpler, drift-ordered form of the equations is obtained, and first results of their numerical implementation as a binary framework for bulk-fluid global plasma simulations are demonstrated. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences, Theory Program, under Award No. DE-FG02-95ER54309.
Toward a first-principles integrated simulation of tokamak edge plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, C S; Klasky, Scott A; Cummings, Julian
2008-01-01
Performance of ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD, for study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles.
Lord, Louis-David; Stevner, Angus B.; Kringelbach, Morten L.
2017-01-01
To survive in an ever-changing environment, the brain must seamlessly integrate a rich stream of incoming information into coherent internal representations that can then be used to efficiently plan for action. The brain must, however, balance its ability to integrate information from various sources with a complementary capacity to segregate information into modules which perform specialized computations in local circuits. Importantly, evidence suggests that imbalances in the brain's ability to bind together and/or segregate information over both space and time are a common feature of several neuropsychiatric disorders. Until recently, however, most studies have attempted to characterize the principles of integration and segregation only in static (i.e. time-invariant) representations of human brain networks, hence disregarding the complex spatio-temporal nature of these processes. In the present Review, we describe how the emerging discipline of whole-brain computational connectomics may be used to study the causal mechanisms of the integration and segregation of information on behaviourally relevant timescales. We emphasize how novel methods from network science and whole-brain computational modelling can expand beyond traditional neuroimaging paradigms and help to uncover the neurobiological determinants of the abnormal integration and segregation of information in neuropsychiatric disorders. This article is part of the themed issue ‘Mathematical methods in medicine: neuroscience, cardiology and pathology’. PMID:28507228
The principles, definition and dimensions of the new nutrition science.
Beauman, Christopher; Cannon, Geoffrey; Elmadfa, Ibrahim; Glasauer, Peter; Hoffmann, Ingrid; Keller, Markus; Krawinkel, Michael; Lang, Tim; Leitzmann, Claus; Lötsch, Bernd; Margetts, Barrie M; McMichael, Anthony J; Meyer-Abich, Klaus; Oltersdorf, Ulrich; Pettoello-Mantovani, Massimo; Sabaté, Joan; Shetty, Prakash; Sória, Marco; Spiekermann, Uwe; Tudge, Colin; Vorster, Hester H; Wahlqvist, Mark; Zerilli-Marimò, Mariuccia
2005-09-01
To specify the principles, definition and dimensions of the new nutrition science. To identify nutrition, with its application in food and nutrition policy, as a science with great width and breadth of vision and scope, in order that it can fully contribute to the preservation, maintenance, development and sustenance of life on Earth. A brief overview shows that current conventional nutrition is defined as a biological science, although its governing and guiding principles are implicit only, and no generally agreed definition is evident. Following are agreements on the principles, definition and dimensions of the new nutrition science, made by the authors as participants at a workshop on this theme held on 5-8 April 2005 at the Schloss Rauischholzhausen, Justus-Liebig University, Giessen, Germany. Nutrition science as here specified will retain its current 'classical' identity as a biological science, within a broader and integrated conceptual framework, and will also be confirmed as a social and environmental science. As such it will be concerned with personal and population health, and with planetary health--the welfare and future of the whole physical and living world of which humans are a part.
First-principles simulations of shock front propagation in liquid deuterium
NASA Astrophysics Data System (ADS)
Gygi, Francois; Galli, Giulia
2001-03-01
We present large-scale first-principles molecular dynamics simulations of the formation and propagation of a shock front in liquid deuterium. Molecular deuterium was subjected to supersonic impacts at velocities ranging from 10 to 30 km/s. We used Density Functional Theory in the local density approximation, and simulation cells containing 1320 deuterium atoms. The formation of a shock front was observed and its velocity was measured and compared with the results of laser-driven shock experiments [1]. The pressure and density in the compressed fluid were also computed directly from statistical averages in appropriate regions of the simulation cell, and compared with previous first-principles calculations performed at equilibrium [2]. Details of the electronic structure at the shock front, and their influence on the properties of the compressed fluid, will be discussed. [1] J. W. Collins et al., Science 281, 1178 (1998). [2] G. Galli, R. Q. Hood, A. U. Hazi and F. Gygi, Phys. Rev. B 61, 909 (2000).
Sensitivity to the Sampling Process Emerges From the Principle of Efficiency.
Jara-Ettinger, Julian; Sun, Felix; Schulz, Laura; Tenenbaum, Joshua B
2018-05-01
Humans can seamlessly infer other people's preferences, based on what they do. Broadly, two types of accounts have been proposed to explain different aspects of this ability. The first account focuses on spatial information: Agents' efficient navigation in space reveals what they like. The second account focuses on statistical information: Uncommon choices reveal stronger preferences. Together, these two lines of research suggest that we have two distinct capacities for inferring preferences. Here we propose that this is not the case, and that spatial-based and statistical-based preference inferences can be explained by the assumption that agents are efficient alone. We show that people's sensitivity to spatial and statistical information when they infer preferences is best predicted by a computational model of the principle of efficiency, and that this model outperforms dual-system models, even when the latter are fit to participant judgments. Our results suggest that, as adults, a unified understanding of agency under the principle of efficiency underlies our ability to infer preferences. Copyright © 2018 Cognitive Science Society, Inc.
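The efficiency-based inference described above can be sketched as a toy inverse-planning computation (an illustrative caricature, not the authors' model; the `beta` parameter, the costs, and the reward hypotheses below are all invented):

```python
import math

# Toy inverse-planning sketch. Under an efficiency assumption, an agent
# chooses option i with probability proportional to exp(beta*(reward - cost)).
# Observing a choice, Bayes' rule tells us which preference hypothesis
# (reward vector) best explains it.
def choice_likelihood(rewards, costs, chosen, beta=2.0):
    utils = [beta * (r - c) for r, c in zip(rewards, costs)]
    z = sum(math.exp(u) for u in utils)
    return math.exp(utils[chosen]) / z

# Two options; option 1 is farther away (costlier to reach) but was chosen.
costs = [1.0, 3.0]
hypotheses = {"prefers_0": [2.0, 0.0], "prefers_1": [0.0, 2.0]}
likelihood = {h: choice_likelihood(r, costs, chosen=1) for h, r in hypotheses.items()}
total = sum(likelihood.values())
posterior = {h: p / total for h, p in likelihood.items()}  # uniform prior
```

In this single model, both cues fall out of one principle: spatial cost (walking farther for an option) and statistical rarity both make a choice harder to explain without a strong preference for it.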
NASA Astrophysics Data System (ADS)
Griffiths, Robert B.
2001-11-01
Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. A comprehensive account written by one of the main figures in the field, this is the paperback edition of a successful work on the philosophy of quantum mechanics.
A precautionary principle for dual use research in the life sciences.
Kuhlau, Frida; Höglund, Anna T; Evers, Kathinka; Eriksson, Stefan
2011-01-01
Most life science research entails dual-use complexity and may be misused for harmful purposes, e.g. biological weapons. The Precautionary Principle applies to special problems characterized by complexity in the relationship between human activities and their consequences. This article examines whether the principle, so far mainly used in environmental and public health issues, is applicable and suitable to the field of dual-use life science research. Four central elements of the principle are examined: threat, uncertainty, prescription and action. Although charges against the principle exist - for example that it stifles scientific development, lacks practical applicability and is poorly defined and vague - the analysis concludes that a Precautionary Principle is applicable to the field. Certain factors such as credibility of the threat, availability of information, clear prescriptive demands on responsibility and directives on how to act, determine the suitability and success of a Precautionary Principle. Moreover, policy-makers and researchers share a responsibility for providing and seeking information about potential sources of harm. A central conclusion is that the principle is meaningful and useful if applied as a context-dependent moral principle and allowed flexibility in its practical use. The principle may then inspire awareness-raising and the establishment of practical routines which appropriately reflect the fact that life science research may be misused for harmful purposes. © 2009 Blackwell Publishing Ltd.
Kisała, Joanna; Heclik, Kinga I; Pogocki, Krzysztof; Pogocki, Dariusz
2018-05-16
The blood-brain barrier (BBB) is a complex system controlling two-way traffic of substances between the circulatory (cardiovascular) system and the central nervous system (CNS). It is almost perfectly crafted to regulate brain homeostasis and to permit selective transport of molecules that are essential for brain function. For potential drug candidates, both CNS-oriented neuropharmaceuticals and those with primary targets in the periphery, the extent to which a substance in the circulation gains access to the CNS is crucial. With the advent of nanopharmacology, the problem of BBB permeability for drug nano-carriers gains new significance. Compared to some other fields of medicinal chemistry, the computational science of nanodelivery is still too immature to offer black-box solutions, especially for the BBB case. However, even a system of this enormous complexity obeys physical principles that can be spelled out and, as such, subjected to computation. A basic understanding of the various physico-chemical parameters describing brain uptake is required to take advantage of them for BBB nanodelivery. This mini-review provides a brief introduction to the essential concepts allowing the application of computational simulation to BBB nanodelivery design. Copyright © Bentham Science Publishers.
Experimental and Computational Interrogation of Fast SCR Mechanism and Active Sites on H-Form SSZ-13
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Sichi; Zheng, Yang; Gao, Feng
Experiment and density functional theory (DFT) models are combined to develop a unified, quantitative model of the mechanism and kinetics of fast selective catalytic reduction (SCR) of NO/NO2 mixtures over H-SSZ-13 zeolite. Rates, rate orders, and apparent activation energies collected under differential conditions reveal two distinct kinetic regimes. First-principles thermodynamics simulations are used to determine the relative coverages of free Brønsted sites, chemisorbed NH4+ and physisorbed NH3 as a function of reaction conditions. First-principles metadynamics calculations show that all three sites can contribute to the rate-limiting N-N bond forming step in fast SCR. The results are used to parameterize a kinetic model that encompasses the full range of reaction conditions and recovers observed rate orders and apparent activation energies. Observed kinetic regimes are related to changes in most-abundant surface intermediates. Financial support was provided by the National Science Foundation GOALI program under award number 1258690-CBET. We thank the Center for Research Computing at Notre
First-principles Studies of Ferroelectricity in BiMnO3 Thin Films
NASA Astrophysics Data System (ADS)
Wang, Yun-Peng; Cheng, Hai-Ping
The ferroelectricity in BiMnO3 thin films is a long-standing problem. We employed first-principles density functional theory with inclusion of the local Hubbard Coulomb (U) and exchange (J) terms. The parameters U and J are optimized to reproduce the atomic structure and the energy gap of bulk C2/c BiMnO3. With these optimal U and J parameters, the calculated ferromagnetic Curie temperature and lattice dynamics properties agree with experiments. We then studied the ferroelectricity in few-layer BiMnO3 thin films on SrTiO3(001) substrates. Our calculations identified ferroelectricity in monolayer, bilayer and trilayer BiMnO3 thin films. We find that the energy barrier for 90° rotation of the electric polarization is about 3-4 times larger than that of conventional ferroelectric materials. This work was supported by the US Department of Energy (DOE), Office of Basic Energy Sciences (BES), under Contract No. DE-FG02-02ER45995. Computations were done using the utilities of the National Energy Research Scientific Computing Center (NERSC).
NASA Astrophysics Data System (ADS)
Ceder, Gerbrand
2007-03-01
The prediction of structure is a key problem in computational materials science that forms the platform on which rational materials design can be performed. Finding structure by traditional optimization methods on quantum mechanical energy models is not possible due to the complexity and high dimensionality of the coordinate space. An unusual, but efficient solution to this problem can be obtained by merging ideas from heuristic and ab initio methods: In the same way that scientists build empirical rules by observation of experimental trends, we have developed machine learning approaches that extract knowledge from a large set of experimental information and a database of over 15,000 first-principles computations, and used these to rapidly direct accurate quantum mechanical techniques to the lowest energy crystal structure of a material. Knowledge is captured in a Bayesian probability network that relates the probability to find a particular crystal structure at a given composition to structure and energy information at other compositions. We show that this approach is highly efficient in finding the ground states of binary metallic alloys and can be easily generalized to more complex systems.
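The data-mining idea, predicting a likely structure from what co-occurs with it elsewhere in a database, can be caricatured with simple co-occurrence counts (purely illustrative; the paper's Bayesian network is far richer, and the database entries below are generic made-up examples):

```python
from collections import Counter, defaultdict

# Toy structure-prediction-by-correlation sketch. Each entry records the
# ground-state structures observed at two compositions of a known alloy
# system; we estimate P(structure at B | structure at A) by counting.
database = [
    ("fcc", "L1_2"), ("fcc", "L1_2"), ("fcc", "L1_0"),
    ("bcc", "B2"), ("bcc", "B2"), ("bcc", "B32"),
]

cooccur = defaultdict(Counter)
for struct_a, struct_b in database:
    cooccur[struct_a][struct_b] += 1

def predict(struct_a):
    """Conditional probability of each partner structure, given struct_a."""
    counts = cooccur[struct_a]
    total = sum(counts.values())
    return {s: n / total for s, n in counts.items()}

# Given fcc at one composition, L1_2 is the most probable partner structure,
# so expensive DFT evaluation would be directed to L1_2 candidates first.
```

The point of the real method is the same as in this caricature: statistics over known materials prune the candidate list so that accurate quantum mechanical calculations are spent only on the most probable structures.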
UC Merced Center for Computational Biology Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colvin, Michael; Watanabe, Masakatsu
Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and considered biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create a new biological sciences major and graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences.
This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that the CCB will continue to support quantitative and computational biology programs at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have maintained multi-institutional collaborations with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as with individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research, including molecular modeling, cell biology, applied mathematics, evolutionary biology, and bioinformatics. The CCB sponsored the first distinguished speaker series at UC Merced, which played an important role in spreading the word about the computational biology emphasis at this new campus. One of the CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, by summer 2006 a summer undergraduate internship program had been established under the CCB to train biological science researchers in highly mathematical and computationally intensive methods. By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences.
The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.
Petascale supercomputing to accelerate the design of high-temperature alloys
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; ...
2017-10-25
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
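The final step, a surrogate model replacing expensive DFT runs, can be shown in miniature (illustrative only; the descriptor values and energies below are invented, and the paper's machine-learning models are more sophisticated than a one-variable least-squares fit):

```python
# Toy surrogate-model sketch: fit a linear model mapping an elemental
# descriptor (a made-up "size mismatch" here) to a DFT-computed segregation
# energy, then predict for a new element without running DFT at all.
def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Hypothetical (descriptor, energy) training pairs from DFT calculations:
descriptor = [0.1, 0.2, 0.3, 0.4]
energy_eV = [-0.05, -0.12, -0.18, -0.25]
m, b = fit_line(descriptor, energy_eV)
predicted = m * 0.25 + b  # surrogate prediction for an unseen element, in eV
```

The trade captured here is the one the abstract describes: each training point costs a supercell DFT calculation, while each surrogate prediction is essentially free.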
Petascale supercomputing to accelerate the design of high-temperature alloys
NASA Astrophysics Data System (ADS)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; Haynes, J. Allen
2017-12-01
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. The approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
What Physicists Should Know About High Performance Computing - Circa 2002
NASA Astrophysics Data System (ADS)
Frederick, Donald
2002-08-01
High Performance Computing (HPC) is a dynamic, cross-disciplinary field that traditionally has involved applied mathematicians, computer scientists, and others primarily from the various disciplines that have been major users of HPC resources - physics, chemistry, engineering, with increasing use by those in the life sciences. There is a technological dynamic that is powered by economic as well as by technical innovations and developments. This talk will discuss practical ideas to be considered when developing numerical applications for research purposes. Even with the rapid pace of development in the field, the author believes that these concepts will not become obsolete for a while, and will be of use to scientists who either are considering, or who have already started down the HPC path. These principles will be applied in particular to current parallel HPC systems, but there will also be references of value to desktop users. The talk will cover such topics as: computing hardware basics, single-CPU optimization, compilers, timing, numerical libraries, debugging and profiling tools and the emergence of Computational Grids.
Research Ethics in the Era of Personalized Medicine: Updating Science's Contract with Society
Meslin, Eric M.; Cho, Mildred K.
2010-01-01
With the completed sequence of the human genome has come the prospect of substantially improving the quality of life for millions through personalized medicine approaches. Still, any advances in this direction require research involving human subjects. For decades science and ethics have enjoyed an allegiance reflected in a common set of ethical principles and procedures guiding the conduct of research with human subjects. Some of these principles emphasize avoiding harm over maximizing benefit. In this paper we revisit the priority given to these ethical principles – particularly the principles that support a cautious approach to science – and propose a reframing of the ‘social contract’ between science and society that emphasizes reciprocity and meeting public needs. PMID:20805701
QMC Goes BOINC: Using Public Resource Computing to Perform Quantum Monte Carlo Calculations
NASA Astrophysics Data System (ADS)
Rainey, Cameron; Engelhardt, Larry; Schröder, Christian; Hilbig, Thomas
2008-10-01
Theoretical modeling of magnetic molecules traditionally involves the diagonalization of quantum Hamiltonian matrices. However, as the complexity of these molecules increases, the matrices become so large that this process becomes unusable. An additional challenge to this modeling is that many repetitive calculations must be performed, further increasing the need for computing power. Both of these obstacles can be overcome by using a quantum Monte Carlo (QMC) method and a distributed computing project. We have recently implemented a QMC method within the Spinhenge@home project, which is a Public Resource Computing (PRC) project where private citizens allow part-time usage of their PCs for scientific computing. The use of PRC for scientific computing will be described in detail, as well as how you can contribute to the project. See, e.g., L. Engelhardt et al., Angew. Chem. Int. Ed. 47, 924 (2008); C. Schröder, in Distributed & Grid Computing - Science Made Transparent for Everyone: Principles, Applications and Supporting Communities (Weber, M.H.W., ed., 2008). Project URL: http://spin.fh-bielefeld.de
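The statistical idea behind any Monte Carlo treatment, trading an intractable exact sum for an average over random samples, fits in a few lines. A toy sketch with independent spins follows (not the project's actual QMC algorithm, which handles interacting quantum spins):

```python
import math
import random

# Illustrative sketch of the Monte Carlo idea: replace the exact sum over
# all 2^N spin states by an average over random samples. Toy model only:
# N independent classical spins in a field h at inverse temperature beta.
def mc_energy(n_spins=20, h=1.0, beta=0.5, n_samples=50_000, seed=42):
    rng = random.Random(seed)
    # Each spin is aligned (energy -h) with Boltzmann probability p_up,
    # anti-aligned (energy +h) otherwise; sample configurations directly.
    p_up = math.exp(beta * h) / (math.exp(beta * h) + math.exp(-beta * h))
    total = 0.0
    for _ in range(n_samples):
        total += sum(-h if rng.random() < p_up else h for _ in range(n_spins))
    return total / n_samples

estimate = mc_energy()
exact = -20 * math.tanh(0.5)  # closed form for this toy model
# The sampled estimate converges to the exact value as n_samples grows.
```

The payoff is the same one the abstract describes: the cost of sampling grows gently with system size, where exact diagonalization or enumeration grows exponentially, and independent samples parallelize naturally across volunteered PCs.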
Science 101: What, Exactly, Is the Heisenberg Uncertainty Principle?
ERIC Educational Resources Information Center
Robertson, Bill
2016-01-01
Bill Robertson is the author of the NSTA Press book series, "Stop Faking It! Finally Understanding Science So You Can Teach It." In this month's issue, Robertson describes and explains the Heisenberg Uncertainty Principle. The Heisenberg Uncertainty Principle was discussed on "The Big Bang Theory," the lead character in…
On the Origin of Charge Order in RuCl3
NASA Astrophysics Data System (ADS)
Berlijn, Tom
RuCl3 has been proposed to be a spin-orbit assisted Mott insulator close to the Kitaev-spin-liquid ground state, an exotic state of matter that could protect information in quantum computers. Recent STM experiments [M. Ziatdinov et al., Nature Communications (in press)], however, show the presence of a puzzling short-range charge order in this quasi two dimensional material. Understanding the nature of this charge order may provide a pathway towards tuning RuCl3 into the Kitaev-spin-liquid ground state. Based on first principles calculations I investigate the possibility that the observed charge order is caused by a combination of short-range magnetic correlations and strong spin-orbit coupling. From a general perspective such a mechanism could offer the exciting possibility of probing local magnetic correlations with standard STM. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division.
AMP: a science-driven web-based application for the TeraGrid
NASA Astrophysics Data System (ADS)
Woitaszek, M.; Metcalfe, T.; Shorrock, I.
The Asteroseismic Modeling Portal (AMP) provides a web-based interface for astronomers to run and view simulations that derive the properties of Sun-like stars from observations of their pulsation frequencies. In this paper, we describe the architecture and implementation of AMP, highlighting the lightweight design principles and tools used to produce a functional fully-custom web-based science application in less than a year. Targeted as a TeraGrid science gateway, AMP's architecture and implementation are intended to simplify its orchestration of TeraGrid computational resources. AMP's web-based interface was developed as a traditional standalone database-backed web application using the Python-based Django web development framework, allowing us to leverage the Django framework's capabilities while cleanly separating the user interface development from the grid interface development. We have found this combination of tools flexible and effective for rapid gateway development and deployment.
The Role of Metaphysical Naturalism in Science
ERIC Educational Resources Information Center
Mahner, Martin
2012-01-01
This paper defends the view that metaphysical naturalism is a constitutive ontological principle of science in that the general empirical methods of science, such as observation, measurement and experiment, and thus the very production of empirical evidence, presuppose a no-supernature principle. It examines the consequences of metaphysical…
The 2009 Earth Science Literacy Principles
NASA Astrophysics Data System (ADS)
Wysession, M. E.; Budd, D. A.; Campbell, K. M.; Conklin, M. H.; Kappel, E. S.; Ladue, N.; Lewis, G.; Raynolds, R.; Ridky, R. W.; Ross, R. M.; Taber, J.; Tewksbury, B. J.; Tuddenham, P.
2009-12-01
In 2009, the NSF-funded Earth Science Literacy Initiative (ESLI) completed and published a document representing a community consensus about what all Americans should understand about Earth sciences. These Earth Science Literacy Principles, presented as a printed brochure and on the Internet at www.earthscienceliteracy.org, were created through the work of nearly 1000 geoscientists and geoeducators who helped identify nine “big ideas” and seventy-five “supporting concepts” fundamental to terrestrial geosciences. The content scope involved the geosphere and land-based hydrosphere as addressed by the NSF-EAR program, including the fields of geobiology and low-temperature geochemistry, geomorphology and land-use dynamics, geophysics, hydrologic sciences, petrology and geochemistry, sedimentary geology and paleobiology, and tectonics. The ESLI Principles were designed to complement similar documents from the ocean, atmosphere, and climate research communities, with the long-term goal of combining these separate literacy documents into a single Earth System Science literacy framework. The aim of these principles is to educate the public, shape the future of geoscience education, and help guide the development of government policy related to Earth science. For example, K-12 textbooks are currently being written and museum exhibits constructed with these Principles in hand. NPR-funded educational videos are in the process of being made in alignment with the ESLI Principles. US House and Senate representatives on science and education committees have been made aware that the major geoscience organizations have endorsed such a document generated and supported by the community.
Given the importance of Earth science in so many societally relevant topics such as climate change, energy and mineral resources, water availability, natural hazards, agriculture, and human impacts on the biosphere, efforts should be taken to ensure that this document is in a position to assist in areas such as the creation of educational products and standards and the setting of relevant government policy. In order to increase the reach of the ESLI Principles, the document has been translated into Spanish, and other languages are also being considered. The document will undergo annual updating in response to growth and change in the scientific understandings of Earth science.
Dynamical minimalism: why less is more in psychology.
Nowak, Andrzej
2004-01-01
The principle of parsimony, embraced in all areas of science, states that simple explanations are preferable to complex explanations in theory construction. Parsimony, however, can necessitate a trade-off with depth and richness in understanding. The approach of dynamical minimalism avoids this trade-off. The goal of this approach is to identify the simplest mechanisms and fewest variables capable of producing the phenomenon in question. A dynamical model in which change is produced by simple rules repetitively interacting with each other can exhibit unexpected and complex properties. It is thus possible to explain complex psychological and social phenomena with very simple models if these models are dynamic. In dynamical minimalist theories, then, the principle of parsimony can be followed without sacrificing depth in understanding. Computer simulations have proven especially useful for investigating the emergent properties of simple models.
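The dynamical-minimalist point, that a simple rule repeatedly applied can produce unexpectedly complex behavior, has a classic toy illustration (not taken from the paper): the logistic map, a one-line update rule whose trajectories range from fixed points to chaos depending on a single parameter.

```python
def logistic_map(x0, r=4.0, n=5):
    """Iterate the single rule x -> r * x * (1 - x).
    One variable, one rule; at r = 4 the dynamics are chaotic."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs
```

In the same spirit, dynamical-minimalist models of social phenomena iterate simple interaction rules among many agents and study the emergent pattern rather than adding explanatory variables.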
On quantum models of the human mind.
Wang, Hongbin; Sun, Yanlong
2014-01-01
Recent years have witnessed rapidly increasing interest in developing quantum theoretical models of human cognition. Quantum mechanisms have been taken seriously to describe how the mind reasons and decides. Papers in this special issue report the newest results in the field. Here we discuss why the two levels of commitment, treating the human brain as a quantum computer and merely adopting abstract quantum probability principles to model human cognition, should be integrated. We speculate that quantum cognition models gain greater modeling power due to a richer representation scheme. Copyright © 2013 Cognitive Science Society, Inc.
Using Science to Take a Stand: Action-Oriented Learning in an Afterschool Science Club
NASA Astrophysics Data System (ADS)
Hagenah, Sara
This dissertation study investigates what happens when students participate in an afterschool science club designed around action-oriented science instruction, a set of curriculum design principles based on social justice pedagogy. Comprised of three manuscripts written for journal publication, the dissertation includes 1) Negotiating community-based action-oriented science teaching and learning: Articulating curriculum design principles, 2) Middle school girls' socio-scientific participation pathways in an afterschool science club, and 3) Laughing and learning together: Productive science learning spaces for middle school girls. By investigating how action-oriented science design principles get negotiated, female identity development in and with science, and the role of everyday social interactions as students do productive science, this research fills gaps in the understanding of how social justice pedagogy gets enacted and negotiated among multiple stakeholders including students, teachers, and community members, along with what identity development looks like across social and scientific activity. This study will be of interest to educators thinking about how to enact social justice pedagogy in science learning spaces and those interested in identity development in science.
Chou, Ting-Chao
2006-09-01
The median-effect equation derived from the mass-action law principle at equilibrium-steady state via mathematical induction and deduction for different reaction sequences and mechanisms and different types of inhibition has been shown to be the unified theory for the Michaelis-Menten equation, Hill equation, Henderson-Hasselbalch equation, and Scatchard equation. It is shown that dose and effect are interchangeable via defined parameters. This general equation for the single drug effect has been extended to the multiple drug effect equation for n drugs. These equations provide the theoretical basis for the combination index (CI)-isobologram equation that allows quantitative determination of drug interactions, where CI < 1, = 1, and > 1 indicate synergism, additive effect, and antagonism, respectively. Based on these algorithms, computer software has been developed to allow automated simulation of synergism and antagonism at all dose or effect levels. It displays the dose-effect curve, median-effect plot, combination index plot, isobologram, dose-reduction index plot, and polygonogram for in vitro or in vivo studies. This theoretical development, experimental design, and computerized data analysis have facilitated dose-effect analysis for single drug evaluation or carcinogen and radiation risk assessment, as well as for drug or other entity combinations in a vast field of disciplines of biomedical sciences. In this review, selected examples of applications are given, and step-by-step examples of experimental designs and real data analysis are also illustrated. The merging of the mass-action law principle with mathematical induction-deduction has been proven to be a unique and effective scientific method for general theory development. 
The median-effect principle and its mass-action law based computer software are gaining increased applications in biomedical sciences, from how to effectively evaluate a single compound or entity to how to beneficially use multiple drugs or modalities in combination therapies.
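The dose-effect algebra summarized above can be sketched directly: inverting the median-effect equation, fa/fu = (D/Dm)^m, gives the dose Dx producing a chosen effect level, and the two-drug combination index follows. This is a minimal sketch of the published equations; the parameter values in the test are invented for illustration, and real analyses would use fitted Dm and m from dose-effect data (e.g. via the CompuSyn-style software the review describes).

```python
def dose_for_effect(fa, dm, m):
    """Invert the median-effect equation fa/fu = (D/Dm)^m to get the
    dose D_x that produces fraction affected fa (fu = 1 - fa)."""
    return dm * (fa / (1.0 - fa)) ** (1.0 / m)

def combination_index(d1, d2, fa, dm1, m1, dm2, m2):
    """Chou-Talalay combination index at effect level fa:
    CI = d1/Dx1 + d2/Dx2, where d1, d2 are the doses used in combination.
    CI < 1 indicates synergism, CI = 1 an additive effect, CI > 1 antagonism."""
    return (d1 / dose_for_effect(fa, dm1, m1)
            + d2 / dose_for_effect(fa, dm2, m2))
```

For example, with both drugs at half their median-effect dose against a 50% effect target, the sketch returns CI = 1, the additive case.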
NASA Astrophysics Data System (ADS)
Nakatsuji, Hiroshi
Chemistry is a science of complex subjects that occupy this universe and the biological world and that are composed of atoms and molecules. Its essence is diversity. However, surprisingly, the whole of this science is governed by simple quantum principles like the Schrödinger and the Dirac equations. Therefore, if we can find a useful general method of solving these quantum principles under the fermionic and/or bosonic constraints accurately and at a reasonable speed, we can replace the somewhat empirical methodologies of this science with purely quantum theoretical and computational logic. This is the purpose of our series of studies, called ``exact theory'' in our laboratory. Some of our documents are cited below. The key idea was expressed as the free complement (FC) theory (originally called ICI theory) that was introduced to solve the Schrödinger and Dirac equations analytically. For extending this methodology to larger systems, order-N methodologies are essential, but the antisymmetry constraints for electronic wave functions become a major obstacle. Recently, we have shown that the antisymmetry rule or 'dogma' can be very much relaxed when our subjects are large molecular systems. In this talk, I want to present recent progress in our FC methodology. The purpose is to construct ``predictive quantum chemistry'' that is useful in chemical and physical research and development in institutes and industries.
LFRic: Building a new Unified Model
NASA Astrophysics Data System (ADS)
Melvin, Thomas; Mullerworth, Steve; Ford, Rupert; Maynard, Chris; Hobson, Mike
2017-04-01
The LFRic project, named for Lewis Fry Richardson, aims to develop a replacement for the Met Office Unified Model in order to meet the challenges which will be presented by the next generation of exascale supercomputers. This project, a collaboration between the Met Office, STFC Daresbury and the University of Manchester, builds on the earlier GungHo project to redesign the dynamical core, in partnership with NERC. The new atmospheric model aims to retain the performance of the current ENDGame dynamical core and associated subgrid physics, while also enabling far greater scalability and flexibility to accommodate future supercomputer architectures. Design of the model revolves around the principle of a 'separation of concerns', whereby the natural science aspects of the code can be developed without worrying about the underlying architecture, while machine dependent optimisations can be carried out at a high level. These principles are put into practice through the development of an autogenerated Parallel Systems software layer (known as the PSy layer) using a domain-specific compiler called PSyclone. The prototype model includes a re-write of the dynamical core using a mixed finite element method, in which different function spaces are used to represent the various fields. It is able to run in parallel with MPI and OpenMP and has been tested on over 200,000 cores. In this talk an overview of both the natural science and computational science implementations of the model will be presented.
2017-08-01
principles for effective Computer-Based Training (CBT) that can be applied broadly to Army courses to build and evaluate exemplar CBT for Army advanced...individual training courses. To assist cadre who do not have a dedicated instructional design team, the Computer-Based Training Principles Guide was...document is the resulting contents, organization, and presentation style of the Computer-Based Training Principles Guide and its companion User's Guide
Jade: using on-demand cloud analysis to give scientists back their flow
NASA Astrophysics Data System (ADS)
Robinson, N.; Tomlinson, J.; Hilson, A. J.; Arribas, A.; Powell, T.
2017-12-01
The UK's Met Office generates 400 TB of weather and climate data every day by running physical models on its Top 20 supercomputer. As data volumes explode, there is a danger that analysis workflows become dominated by watching progress bars, not thinking about science. We have been researching how we can use distributed computing to allow analysts to process these large volumes of high velocity data in a way that's easy, effective and cheap. Our prototype analysis stack, Jade, tries to encapsulate this. Functionality includes:
- An under-the-hood Dask engine which parallelises and distributes computations, without the need to retrain analysts
- Hybrid compute clusters (AWS, Alibaba, and local compute) comprising many thousands of cores
- Clusters which autoscale up/down in response to calculation load using Kubernetes, and which balance the cluster across providers based on the current price of compute
- Lazy data access from cloud storage via containerised OpenDAP
This technology stack allows us to perform calculations many orders of magnitude faster than is possible on local workstations. It is also possible to outperform dedicated local compute clusters, as cloud compute can, in principle, scale to much larger sizes. The use of ephemeral compute resources also makes this implementation cost efficient.
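Jade's engine is Dask; as a library-free sketch of the underlying pattern (partition a dataset into chunks, reduce each chunk in parallel, then combine the partial results), assuming nothing about Jade's actual API, one can write:

```python
from concurrent.futures import ThreadPoolExecutor

def chunked_mean(data, n_chunks=4):
    """Split a sequence into chunks, reduce each chunk in parallel to a
    (sum, count) partial, then combine the partials into a global mean.
    Dask applies the same split/map/combine idea to arrays far too large
    for one machine's memory."""
    size = max(1, len(data) // n_chunks)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(lambda c: (sum(c), len(c)), chunks))
    total = sum(s for s, _ in partials)
    count = sum(n for _, n in partials)
    return total / count
```

The point of the architecture is that the analyst writes only the reduction; chunking, scheduling, and (in Jade's case) autoscaling of workers happen under the hood.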
Biomolecular computers with multiple restriction enzymes
Sakowski, Sebastian; Krasinski, Tadeusz; Waldmajer, Jacek; Sarnik, Joanna; Blasiak, Janusz; Poplawski, Tomasz
2017-01-01
The development of conventional, silicon-based computers has several limitations, including some related to the Heisenberg uncertainty principle and the von Neumann “bottleneck”. Biomolecular computers based on DNA and proteins are largely free of these disadvantages and, along with quantum computers, are reasonable alternatives to their conventional counterparts in some applications. The idea of a DNA computer proposed by Ehud Shapiro’s group at the Weizmann Institute of Science was developed using one restriction enzyme as hardware and DNA fragments (the transition molecules) as software and input/output signals. This computer represented a two-state two-symbol finite automaton that was subsequently extended by using two restriction enzymes. In this paper, we propose the idea of a multistate biomolecular computer with multiple commercially available restriction enzymes as hardware. Additionally, an algorithmic method for the construction of transition molecules in the DNA computer based on the use of multiple restriction enzymes is presented. We use this method to construct multistate, biomolecular, nondeterministic finite automata with four commercially available restriction enzymes as hardware. We also describe an experimental application of this theoretical model to a biomolecular finite automaton made of four endonucleases. PMID:29064510
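Setting the biochemistry aside, the computational core of such devices is ordinary nondeterministic finite-automaton (NFA) simulation: track the set of states reachable after each input symbol. The sketch below uses a hypothetical two-state, two-symbol automaton (accepting words with an even number of 'b's), not one of the paper's enzyme-encoded machines.

```python
def run_nfa(transitions, start, accept, word):
    """Simulate an NFA by propagating the set of currently reachable states
    through the transition relation, one input symbol at a time."""
    states = {start}
    for symbol in word:
        states = {t for s in states for t in transitions.get((s, symbol), ())}
    return bool(states & accept)

# Hypothetical example: a two-state two-symbol machine in the spirit of
# Shapiro's automaton, accepting words with an even number of 'b's.
EVEN_B = {
    ("q0", "a"): {"q0"}, ("q0", "b"): {"q1"},
    ("q1", "a"): {"q1"}, ("q1", "b"): {"q0"},
}
```

In the biomolecular realization, the transition table is encoded by restriction-enzyme recognition sites and transition molecules rather than a Python dictionary, but the accepted language is defined the same way.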
How WebQuests Can Enhance Science Learning Principles in the Classroom
ERIC Educational Resources Information Center
Subramaniam, Karthigeyan
2012-01-01
This article examines the merits of WebQuests in facilitating students' in-depth understanding of science concepts using the four principles of learning gathered from the National Research Council reports "How People Learn: Brain, Mind, Experience, and School" (1999) and the "How Students Learn: Science in the Classroom" (2005) as an analytic…
The Principles of Science Education in Today's Schools. A Roundtable
ERIC Educational Resources Information Center
Russian Education and Society, 2006
2006-01-01
This article presents the dialogue from a roundtable discussion on the principles of science education in today's school held by "Pedagogika" in March 2004. Participants were as follows: from the Russian Academy of Education: V.P. Borisenkov, doctor of pedagogical sciences, professor, vice president of the Russian Academy of Education,…
Towards Reproducibility in Computational Hydrology
NASA Astrophysics Data System (ADS)
Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit
2017-04-01
Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, evolve (or reject) hypotheses and models of how environmental systems function, and move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even when they are, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al 2016 [1], we argue that a cultural change is required in the computational hydrological community, in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which is relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]).
While we use computational hydrology as the example application area, we believe that our conclusions are of value to the wider environmental and geoscience community as far as the use of code and models for scientific advancement is concerned. References: [1] Hutton, C., T. Wagener, J. Freer, D. Han, C. Duffy, and B. Arheimer (2016), Most computational hydrology is not reproducible, so is it really science?, Water Resour. Res., 52, 7548-7555, doi:10.1002/2016WR019285. [2] Ceola, S., et al. (2015), Virtual laboratories: New opportunities for collaborative water science, Hydrol. Earth Syst. Sci. Discuss., 11(12), 13443-13478, doi:10.5194/hessd-11-13443-2014.
The image recognition based on neural network and Bayesian decision
NASA Astrophysics Data System (ADS)
Wang, Chugege
2018-04-01
Artificial neural network research began in the 1940s and is an important part of artificial intelligence. At present, it is a hot topic in the fields of neuroscience, computer science, brain science, mathematics, and psychology. Thomas Bayes' theory was first reported in 1763, and after development through the twentieth century it spread throughout statistics. In recent years, thanks to solutions to the problem of high-dimensional integral calculation, Bayesian statistics has advanced theoretically, solving many problems that classical statistics cannot, and has been applied in interdisciplinary fields. In this paper, the related concepts and principles of artificial neural networks are introduced. It also summarizes the basic content and principles of Bayesian statistics, and combines artificial neural network technology with Bayesian decision theory, applying them to several aspects of image recognition, such as an enhanced face detection method based on a neural network and Bayesian decision, as well as image classification based on the Bayesian decision rule. It can be seen that the combination of artificial intelligence and statistical algorithms remains a hot research topic.
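The Bayesian decision rule underlying such classifiers can be sketched in a few lines: pick the class maximizing prior times likelihood (the minimum-error-rate rule). The two-class "face"/"nonface" setup and all numbers below are invented for illustration; in the paper's setting the likelihoods would come from neural-network feature scores rather than fixed Gaussians.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a 1D Gaussian, standing in for a class-conditional likelihood."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_classify(x, classes):
    """Minimum-error-rate Bayes decision: choose argmax_c P(c) * p(x | c)."""
    return max(classes, key=lambda c: classes[c]["prior"]
               * gaussian_pdf(x, classes[c]["mu"], classes[c]["sigma"]))

# Hypothetical two-class problem on a scalar feature score
CLASSES = {
    "face":    {"prior": 0.5, "mu": 0.0, "sigma": 1.0},
    "nonface": {"prior": 0.5, "mu": 4.0, "sigma": 1.0},
}
```

With equal priors the rule reduces to picking the class whose likelihood is larger at the observed feature value; unequal priors shift the decision boundary accordingly.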
Mars Science Laboratory CHIMRA: A Device for Processing Powdered Martian Samples
NASA Technical Reports Server (NTRS)
Sunshine, Daniel
2010-01-01
The CHIMRA is an extraterrestrial sample acquisition and processing device for the Mars Science Laboratory that emphasizes robustness and adaptability through design configuration. This work reviews the guidelines utilized to invent the initial CHIMRA and the strategy employed in advancing the design; these principles will be discussed in relation to both the final CHIMRA design and similar future devices. The computational synthesis necessary to mature a boxed-in impact-generating mechanism will be presented alongside a detailed mechanism description. Results from the development testing required to advance the design for a highly-loaded, long-life and high-speed bearing application will be presented. Lessons learned during the assembly and testing of this subsystem as well as results and lessons from the sample-handling development test program will be reviewed.
NASA Astrophysics Data System (ADS)
Duggan-Haas, D.
2013-12-01
The Next Generation Science Standards (NGSS), and the Frameworks upon which they are built, themselves built upon and synthesized a wide range of educational research and development that came before them. For the Earth sciences, this importantly includes a series of initiatives to define literacy within oceanography, atmospheric and climate sciences, and geology. Since the publication of the Frameworks, a similarly structured set of principles for energy literacy was also published. Each set of principles includes seven to nine Essential Principles or Big Ideas, all written at the commencement level, and each Principle is undergirded by several Fundamental Concepts, for a total of 38 Essential Principles and 247 Fundamental Concepts. How do these relate to the content of NGSS? How can teachers, professional development providers, and curriculum specialists make sense of this array of ideas and place it into a coherent conceptual framework? This presentation will answer these questions and more. Of course, there is substantial overlap amongst the sets of principles and with the ideas, practices, and principles in NGSS. This presentation will provide and describe a framework that identifies these areas of overlap and contextualizes them in a way that makes them more manageable for educators and learners. A set of five bigger ideas and a pair of overarching questions, assembled with the Essential Principles and Earth & Space Science Disciplinary Core Ideas in the form of a 'Rainbow Chart', shows a consistency of thought across Earth science's sub-disciplines and helps educators navigate this somewhat overwhelming landscape of ideas. These questions and ideas are listed below.
Overarching Questions:
- How do we know what we know?
- How does what we know inform our decision making?
Bigger Ideas:
- Earth is a system of systems.
- The flow of energy drives the cycling of matter.
- Life, including human life, influences and is influenced by the environment.
- Physical and chemical principles are unchanging and drive both gradual and rapid changes in the Earth system.
- To understand (deep) space and time, models and maps are necessary.
What do the colors mean? Each bigger idea has a unique color, and the overarching questions tie this rainbow of colors together, appearing white when ideas or principles from the other idea sets reflect the nature of science inherent in the overarching questions. The highlighting indicates that each set of literacy principles addresses all Bigger Ideas and the overarching questions. The presentation will also address the way teachers within our professional development programming have used the framework in their instruction.
[Figure: The Rainbow Chart of Earth Science Bigger Ideas]
Noel, Jean-Paul; Blanke, Olaf; Serino, Andrea
2018-06-06
Integrating information across sensory systems is a critical step toward building a cohesive representation of the environment and one's body, and, as illustrated by numerous illusions, scaffolds subjective experience of the world and self. In recent years, classic principles of multisensory integration elucidated in the subcortex have been translated into the language of statistical inference understood by the neocortical mantle. Most importantly, a mechanistic systems-level description of multisensory computations via probabilistic population coding and divisive normalization is actively being put forward. In parallel, by describing and understanding bodily illusions, researchers have suggested multisensory integration of bodily inputs within the peripersonal space as a key mechanism in bodily self-consciousness. Importantly, certain aspects of bodily self-consciousness, although still very much a minority, have recently been cast in the light of modern computational understandings of multisensory integration. In doing so, we argue, the field of bodily self-consciousness may borrow mechanistic descriptions regarding the neural implementation of inference computations outlined by the multisensory field. This computational approach, building on the understanding of multisensory processes generally, promises to advance scientific comprehension regarding one of the most mysterious questions puzzling humankind, that is, how our brain creates the experience of a self in interaction with the environment. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
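A standard formalization of the statistical-inference view of multisensory integration is maximum-likelihood cue combination, in which each cue's estimate is weighted by its inverse variance (its reliability) and the combined estimate is more precise than either cue alone. This is a textbook model offered for illustration, not an implementation from the paper.

```python
def integrate_cues(estimates, variances):
    """Maximum-likelihood (inverse-variance-weighted) cue combination.
    Returns the combined estimate and its variance; the combined variance
    is always at most the smallest single-cue variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, estimates)) / total
    var = 1.0 / total
    return mean, var
```

For two equally reliable cues the model predicts simple averaging with the variance halved; when one cue degrades, the weighting shifts toward the other, which is the signature behavior reported in many multisensory psychophysics studies.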
Do Racial and Gender Disparities Exist in Newer Glaucoma Treatments?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Office of the Director
As a national laboratory, Argonne concentrates on scientific and technological challenges that can only be addressed through a sustained, interdisciplinary focus at a national scale. Argonne's eight major initiatives, as enumerated in its strategic plan, are Hard X-ray Sciences, Leadership Computing, Materials and Molecular Design and Discovery, Energy Storage, Alternative Energy and Efficiency, Nuclear Energy, Biological and Environmental Systems, and National Security. The purposes of Argonne's Laboratory Directed Research and Development (LDRD) Program are to encourage the development of novel technical concepts, enhance the Laboratory's research and development (R and D) capabilities, and pursue its strategic goals. Projects are selected from proposals for creative and innovative R and D studies that require advance exploration before they are considered to be sufficiently developed to obtain support through normal programmatic channels. Among the aims of the projects supported by the LDRD Program are the following: establishment of engineering proof of principle, assessment of design feasibility for prospective facilities, development of instrumentation or computational methods or systems, and discoveries in fundamental science and exploratory development.
How to build better memory training games
Deveau, Jenni; Jaeggi, Susanne M.; Zordan, Victor; Phung, Calvin; Seitz, Aaron R.
2015-01-01
Can we create engaging training programs that improve working memory (WM) skills? While there are numerous procedures that attempt to do so, there is a great deal of controversy regarding their efficacy. Nonetheless, recent meta-analytic evidence shows consistent improvements across studies on lab-based tasks generalizing beyond the specific training effects (Au et al., 2014; Karbach and Verhaeghen, 2014); however, there is little research into how WM training aids participants in their daily life. Here we propose that incorporating design principles from the fields of Perceptual Learning (PL) and Computer Science might augment the efficacy of WM training, and ultimately lead to greater learning and transfer. In particular, the field of PL has identified numerous mechanisms (including attention, reinforcement, multisensory facilitation and multi-stimulus training) that promote brain plasticity. Also, computer science has made great progress in the scientific approach to game design that can be used to create engaging environments for learning. We suggest that approaches integrating knowledge across these fields may lead to more effective WM interventions and better reflect real world conditions. PMID:25620916
Information security: where computer science, economics and psychology meet.
Anderson, Ross; Moore, Tyler
2009-07-13
Until ca. 2000, information security was seen as a technological discipline, based on computer science but with mathematics helping in the design of ciphers and protocols. That perspective started to change as researchers and practitioners realized the importance of economics. As distributed systems are increasingly composed of machines that belong to principals with divergent interests, incentives are becoming as important to dependability as technical design. A thriving new field of information security economics provides valuable insights not just into 'security' topics such as privacy, bugs, spam and phishing, but into more general areas of system dependability and policy. This research programme has recently started to interact with psychology. One thread is in response to phishing, the most rapidly growing form of online crime, in which fraudsters trick people into giving their credentials to bogus websites; a second is through the increasing importance of security usability; and a third comes through the psychology-and-economics tradition. The promise of this multidisciplinary research programme is a novel framework for analysing information security problems-one that is both principled and effective.
ERIC Educational Resources Information Center
Gunckel, Kristin L.; Wood, Marcy B.
2016-01-01
A major challenge in preparing elementary teachers to teach inquiry-based science is finding qualified mentor teachers who use research-based approaches to teach science in their classrooms. This situation means preservice teachers often see few connections between the research-based principles for teaching science they learn in university-based…
How to Teach for Social Justice: Lessons from "Uncle Tom's Cabin" and Cognitive Science
ERIC Educational Resources Information Center
Bracher, Mark
2009-01-01
The author explains how principles of cognitive science can help teachers of literature use texts as a means of increasing students' commitment to social justice. Applying these principles to a particular work, Uncle Tom's Cabin, he calls particular attention to the relationship between cognitive science and literary schemes for building reader…
Common computational properties found in natural sensory systems
NASA Astrophysics Data System (ADS)
Brooks, Geoffrey
2009-05-01
Throughout the animal kingdom there are many existing sensory systems with capabilities desired by the human designers of new sensory and computational systems. There are a few basic design principles constantly observed among these natural mechano-, chemo-, and photo-sensory systems, principles that have been proven by the test of time. Such principles include non-uniform sampling and processing, topological computing, contrast enhancement by localized signal inhibition, graded localized signal processing, spiked signal transmission, and coarse coding, which is the computational transformation of raw data using broadly overlapping filters. These principles are outlined here with references to natural biological sensory systems as well as successful biomimetic sensory systems exploiting these natural design concepts.
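The coarse-coding principle named above — transforming raw data with broadly overlapping filters — can be sketched in a few lines. This is a hypothetical illustration of the general idea, not code from any of the biomimetic systems cited:

```python
import math

def gaussian_filter_bank(centers, width):
    """One broad Gaussian filter per center; neighboring filters overlap heavily."""
    return [lambda x, c=c: math.exp(-((x - c) ** 2) / (2.0 * width ** 2))
            for c in centers]

def coarse_code(stimulus, filters):
    """Encode a scalar stimulus as the graded response of every filter."""
    return [f(stimulus) for f in filters]

# Four broad filters tile the stimulus range 0..3; any single stimulus
# excites several of them at once, and the population pattern (not any
# one filter) localizes the stimulus.
filters = gaussian_filter_bank(centers=[0.0, 1.0, 2.0, 3.0], width=1.0)
code = coarse_code(1.5, filters)
```

Reading the stimulus back requires combining responses across the whole population, which is what gives coarse codes fine resolution despite the breadth of each individual filter.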
Unidata: Community, Science, and Technology; in that order.
NASA Astrophysics Data System (ADS)
Young, J. W.; Ramamurthy, M. K.; Davis, E.
2015-12-01
Unidata's mission is to provide the data services, tools, and cyberinfrastructure leadership that advance Earth system science, enhance educational opportunities, and broaden participation. The Unidata community has grown from around 250 individual participants in the early years to tens of thousands of users in over 150 countries. Today, Unidata's products and services are used on every continent and by every sector of the geoscience enterprise: universities, government agencies, the private sector, and other non-governmental organizations. Certain traits and an ethos are shared by and common to most successful organizations. They include a healthy organizational culture grounded by core values and guiding principles. In that environment, there is an implicit awareness of the connection between the mission of an organization, its values, its day-to-day activities, and the behaviours of a passionate staff. Distinguishing characteristics include: vigorous engagement of the community served by those organizations backed by strong and active governance, unwavering commitment to seek input and feedback from users, and the trust of those users, earned over many years through consistent, dependable, and high-quality service. Meanwhile, changing data volumes and standards, new computing power, and expanding scientific questions continue to shape the geoscience community. These issues were the drivers for founding Unidata, a cornerstone data facility, in 1984. Advances in geoscience occur at the junction of community, science, and technology, and this submission will feature lessons from Unidata's thirty-year history operating at this nexus. Specifically, this presentation will feature guiding principles for the program, governance mechanisms, and approaches for balancing science and technology in a community-driven program.
Control of Breathing During Mechanical Ventilation: Who Is the Boss?
Williams, Kathleen; Hinojosa-Kurtzberg, Marina; Parthasarathy, Sairam
2011-01-01
Over the past decade, concepts of control of breathing have increasingly moved from being theoretical concepts to “real world” applied science. The purpose of this review is to examine the basics of control of breathing, discuss the bidirectional relationship between control of breathing and mechanical ventilation, and critically assess the application of this knowledge at the patient’s bedside. The principles of control of breathing remain under-represented in the training curriculum of respiratory therapists and pulmonologists, whereas the day-to-day bedside application of the principles of control of breathing continues to suffer from a lack of outcomes-based research in the intensive care unit. In contrast, the bedside application of the principles of control of breathing to ambulatory subjects with sleep-disordered breathing has out-stripped that in critically ill patients. The evolution of newer technologies, faster real-time computing abilities, and miniaturization of ventilator technology can bring the concepts of control of breathing to the bedside and benefit the critically ill patient. However, market forces, lack of scientific data, lack of research funding, and regulatory obstacles need to be surmounted. PMID:21333174
NASA Astrophysics Data System (ADS)
Tandon, K.; Egbert, G.; Siripunvaraporn, W.
2003-12-01
We are developing a modular system for three-dimensional inversion of electromagnetic (EM) induction data, using an object-oriented programming approach. This approach allows us to modify the individual components of the proposed inversion scheme, and also to reuse the components for a variety of problems in earth science computing, however diverse they might be. In particular, the modularity allows us to (a) change modeling codes independently of inversion algorithm details; (b) experiment with new inversion algorithms; and (c) modify the way prior information is imposed in the inversion to test competing hypotheses and techniques required to solve an earth science problem. Our initial code development is for EM induction equations on a staggered grid, using iterative solution techniques in 3D. An example illustrated here is an experiment with the sensitivity of 3D magnetotelluric inversion to uncertainties in the boundary conditions required for regional induction problems. These boundary conditions should reflect the large-scale geoelectric structure of the study area, which is usually poorly constrained. In general for inversion of MT data, one fixes boundary conditions at the edge of the model domain, and adjusts the earth's conductivity structure within the modeling domain. Allowing for errors in specification of the open boundary values is simple in principle, but no existing inversion codes that we are aware of have this feature. Adding a feature such as this is straightforward within the context of the modular approach. More generally, a modular approach provides an efficient methodology for setting up earth science computing problems to test various ideas. As a concrete illustration relevant to EM induction problems, we investigate the sensitivity of MT data near the San Andreas Fault at Parkfield (California) to uncertainties in the regional geoelectric structure.
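The modular separation described above — modeling codes interchangeable behind a fixed interface, independently of the inversion algorithm — can be sketched abstractly. The class and method names below are hypothetical illustrations of the design principle, not taken from the authors' code:

```python
from abc import ABC, abstractmethod

class ForwardModel(ABC):
    """Forward solver: predicts data from a candidate earth model
    (e.g. EM induction on a staggered grid)."""
    @abstractmethod
    def predict(self, model):
        ...

class Inversion(ABC):
    """Inversion scheme; it depends only on the ForwardModel interface,
    so solvers and inversion algorithms can be swapped independently."""
    def __init__(self, forward):
        self.forward = forward

    @abstractmethod
    def invert(self, data, candidates):
        ...

# A toy concrete pair: a linear forward model and a grid-search inversion.
class LinearForward(ForwardModel):
    def __init__(self, k):
        self.k = k
    def predict(self, model):
        return self.k * model

class GridSearchInversion(Inversion):
    def invert(self, data, candidates):
        # Pick the candidate model whose prediction best fits the data.
        return min(candidates, key=lambda m: abs(self.forward.predict(m) - data))

inv = GridSearchInversion(LinearForward(k=2.0))
best = inv.invert(data=6.1, candidates=[1.0, 2.0, 3.0, 4.0])
```

In this pattern, testing sensitivity to boundary conditions amounts to constructing ForwardModel instances with perturbed boundary values and re-running the same Inversion object unchanged.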
Calculation of the Curie temperature of Ni using first principles based Wang-Landau Monte-Carlo
NASA Astrophysics Data System (ADS)
Eisenbach, Markus; Yin, Junqi; Li, Ying Wai; Nicholson, Don
2015-03-01
We combine constrained first principles density functional theory with a Wang-Landau Monte Carlo algorithm to calculate the Curie temperature of Ni. Mapping the magnetic interactions in Ni onto a Heisenberg-like model underestimates the Curie temperature. Using a model, we show that the addition of the magnitude of the local magnetic moments can account for the difference in the calculated Curie temperature. For ab initio calculations, we have extended our Locally Selfconsistent Multiple Scattering (LSMS) code to constrain the magnitude of the local moments in addition to their direction, and we apply the Replica Exchange Wang-Landau method to sample the larger phase space efficiently to investigate Ni, where the fluctuation in the magnitude of the local magnetic moments is of equal importance to their directional fluctuations. We will present our results for Ni, comparing calculations that consider only the moment directions with those that include fluctuations of the magnetic moment magnitude in the Curie temperature. This research was sponsored by the Department of Energy, Offices of Basic Energy Science and Advanced Computing. We used Oak Ridge Leadership Computing Facility resources at Oak Ridge National Laboratory, supported by US DOE under contract DE-AC05-00OR22725.
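The Wang-Landau idea invoked above — a flat-histogram random walk that iteratively refines an estimate of the density of states g(E) — can be illustrated on a toy model of N non-interacting spins, where the exact answer is g(E) = C(N, E). This sketch is illustrative only; it is not the constrained LSMS or Replica Exchange implementation the authors describe:

```python
import math
import random

def wang_landau(n_spins=10, flatness=0.8, ln_f_final=1e-4, seed=1):
    """Estimate ln g(E) for N independent up/down spins, with E defined
    as the number of up spins, so exactly g(E) = C(N, E)."""
    random.seed(seed)
    ln_g = [0.0] * (n_spins + 1)   # running estimate of ln g(E)
    spins = [0] * n_spins          # 0 = down, 1 = up
    energy = 0
    ln_f = 1.0                     # modification factor, halved each stage
    while ln_f > ln_f_final:
        hist = [0] * (n_spins + 1)
        for _ in range(20000):
            i = random.randrange(n_spins)
            new_e = energy + (1 if spins[i] == 0 else -1)
            # Accept with min(1, g(E_old)/g(E_new)) to flatten the walk.
            if random.random() < math.exp(min(0.0, ln_g[energy] - ln_g[new_e])):
                spins[i] ^= 1
                energy = new_e
            ln_g[energy] += ln_f
            hist[energy] += 1
        if min(hist) > flatness * (sum(hist) / len(hist)):
            ln_f /= 2.0            # histogram flat: refine f and restart it
    return [x - ln_g[0] for x in ln_g]  # normalize so ln g(0) = 0

ln_g = wang_landau()  # should approximate ln C(10, E)
```

In the paper's setting, E is instead the energy of a constrained DFT spin configuration and replica exchange distributes energy windows across parallel walkers, but the flat-histogram update itself is the same.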
SENSE IT: Student Enabled Network of Sensors for the Environment using Innovative Technology
NASA Astrophysics Data System (ADS)
Hotaling, L. A.; Stolkin, R.; Kirkey, W.; Bonner, J. S.; Lowes, S.; Lin, P.; Ojo, T.
2010-12-01
SENSE IT is a project funded by the National Science Foundation (NSF) which strives to enrich science, technology, engineering and mathematics (STEM) education by providing teacher professional development and classroom projects in which high school students build from first principles, program, test and deploy sensors for water quality monitoring. Sensor development is a broad and interdisciplinary area, providing motivating scenarios in which to teach a multitude of STEM subjects, from mathematics and physics to biology and environmental science, while engaging students with hands on problems that reinforce conventional classroom learning by re-presenting theory as practical tools for building real-life working devices. The SENSE IT program is currently developing and implementing a set of high school educational modules which teach environmental science and basic engineering through the lens of fundamental STEM principles, at the same time introducing students to a new set of technologies that are increasingly important in the world of environmental research. Specifically, the project provides students with the opportunity to learn the engineering design process through the design, construction, programming and testing of a student-implemented water monitoring network in the Hudson and St. Lawrence Rivers in New York. These educational modules are aligned to state and national technology and science content standards and are designed to be compatible with standard classroom curricula to support a variety of core science, technology and mathematics classroom material. For example, while designing, programming and calibrating the sensors, the students are led through a series of tasks in which they must use core mathematics and physics theory to solve the real problems of making their sensors work. In later modules, students can explore environmental science and environmental engineering curricula while deploying and monitoring their sensors in local rivers. 
This presentation will provide an overview of the educational modules. A variety of sensors will be described, which are suitably simple for design and construction from first principles by high school students while being accurate enough for students to make meaningful environmental measurements. The presentation will also describe how the sensor building activities can be tied to core curricula classroom theory, enabling the modules to be utilized in regular classes by mathematics, science and computing teachers without disrupting their semester's teaching goals. Furthermore, the presentation will address the first two years of the SENSE IT project, during which 39 teachers have been equipped, trained on these materials, and have implemented the modules with approximately 2,000 high school students.
Exploiting NASA's Cumulus Earth Science Cloud Archive with Services and Computation
NASA Astrophysics Data System (ADS)
Pilone, D.; Quinn, P.; Jazayeri, A.; Schuler, I.; Plofchan, P.; Baynes, K.; Ramachandran, R.
2017-12-01
NASA's Earth Observing System Data and Information System (EOSDIS) houses nearly 30PBs of critical Earth Science data and with upcoming missions is expected to balloon to between 200PBs-300PBs over the next seven years. In addition to the massive increase in data collected, researchers and application developers want more and faster access - enabling complex visualizations, long time-series analysis, and cross dataset research without needing to copy and manage massive amounts of data locally. NASA has started prototyping with commercial cloud providers to make this data available in elastic cloud compute environments, allowing application developers direct access to the massive EOSDIS holdings. In this talk we'll explain the principles behind the archive architecture and share our experience of dealing with large amounts of data with serverless architectures including AWS Lambda, the Elastic Container Service (ECS) for long running jobs, and why we dropped thousands of lines of code for AWS Step Functions. We'll discuss best practices and patterns for accessing and using data available in a shared object store (S3) and leveraging events and message passing for sophisticated and highly scalable processing and analysis workflows. Finally we'll share capabilities NASA and cloud services are making available on the archives to enable massively scalable analysis and computation in a variety of formats and tools.
Materials by Design—A Perspective From Atoms to Structures
Buehler, Markus J.
2013-01-01
Biological materials are effectively synthesized, controlled, and used for a variety of purposes—in spite of limitations in energy, quality, and quantity of their building blocks. Whereas the chemical composition of materials in the living world plays some role in achieving functional properties, the way components are connected at different length scales defines what material properties can be achieved, how they can be altered to meet functional requirements, and how they fail in disease states and other extreme conditions. Recent work has demonstrated this by using large-scale computer simulations to predict materials properties from fundamental molecular principles, combined with experimental work and new mathematical techniques to categorize complex structure-property relationships into a systematic framework. Enabled by such categorization, we discuss opportunities based on the exploitation of concepts from distinct hierarchical systems that share common principles in how function is created, linking music to materials science. PMID:24163499
ERIC Educational Resources Information Center
Sun, Daner; Looi, Chee-Kit
2018-01-01
This paper explores the crossover between formal learning and learning in informal spaces supported by mobile technology, and proposes design principles for educators to carry out a science curriculum, namely Boundary Activity-based Science Curriculum (BAbSC). The conceptualization of the boundary object, and the principles of boundary activity as…
Evolutionary Study of Interethnic Cooperation
NASA Astrophysics Data System (ADS)
Kvasnicka, Vladimir; Pospichal, Jiri
The purpose of this communication is to present an evolutionary study of cooperation between two ethnic groups. The model used is inspired by the seminal paper of J. D. Fearon and D. D. Laitin (Explaining Interethnic Cooperation, American Political Science Review, 90 (1996), pp. 715-735), where the iterated prisoner's dilemma was used to model intra- and interethnic interactions. We reformulated their approach in the form of an evolutionary prisoner's dilemma method, where a population of strategies is evolved by applying a simple reproduction process with a Darwinian metaphor of natural selection (the probability of selection for reproduction is proportional to fitness). Our computer simulations show that an application of a principle of collective guilt does not lead to an emergence of interethnic cooperation. When an administrator is introduced, then an emergence of interethnic cooperation may be observed. Furthermore, if the ethnic groups are of very different sizes, then the principle of collective guilt may be very devastating for the smaller group, so that intraethnic cooperation is destroyed. The second strategy of cooperation is called personal responsibility, where agents that defected within interethnic interactions are punished inside their own ethnic groups. It means, unlike the principle of collective guilt, that there exists only one type of punishment; loosely speaking, agents are punished "personally." All the substantial computational results were checked and interpreted analytically within the theory of evolutionarily stable strategies. Moreover, this theoretical approach offers mechanisms of simple scenarios explaining why some particular strategies are stable or not.
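The evolutionary setup described above — a population of strategies playing the iterated prisoner's dilemma, with reproduction probability proportional to tournament fitness — can be sketched in a few lines. The payoffs and the two strategies are conventional textbook choices, not the authors' ethnic-group model:

```python
import random

# Standard prisoner's dilemma payoffs: (my move, their move) -> my payoff.
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def play(strat_a, strat_b, rounds=20):
    """Iterated game between two strategies; returns (score_a, score_b)."""
    hist_a, hist_b, sa, sb = [], [], 0, 0
    for _ in range(rounds):
        ma, mb = strat_a(hist_b), strat_b(hist_a)
        sa += PAYOFF[(ma, mb)]
        sb += PAYOFF[(mb, ma)]
        hist_a.append(ma)
        hist_b.append(mb)
    return sa, sb

always_defect = lambda opponent: 'D'
tit_for_tat = lambda opponent: 'C' if not opponent else opponent[-1]

def evolve(pop, generations=30, seed=0):
    """Round-robin tournament each generation, then fitness-proportional
    reproduction (the Darwinian selection metaphor of the abstract)."""
    random.seed(seed)
    for _ in range(generations):
        fitness = [sum(play(s, t)[0] for j, t in enumerate(pop) if i != j)
                   for i, s in enumerate(pop)]
        pop = random.choices(pop, weights=fitness, k=len(pop))
    return pop

final = evolve([tit_for_tat] * 10 + [always_defect] * 10)
```

With these payoffs, mutual cooperation among tit-for-tat players outscores exploitation by defectors over repeated rounds, so reciprocators typically take over the population.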
Waggle: A Framework for Intelligent Attentive Sensing and Actuation
NASA Astrophysics Data System (ADS)
Sankaran, R.; Jacob, R. L.; Beckman, P. H.; Catlett, C. E.; Keahey, K.
2014-12-01
Advances in sensor-driven computation and computationally steered sensing will greatly enable future research in fields including the environmental and atmospheric sciences. We will present "Waggle," an open-source hardware and software infrastructure developed with two goals: (1) reducing the separation and latency between sensing and computing and (2) improving the reliability and longevity of sensing-actuation platforms in challenging and costly deployments. Inspired by "deep-space probe" systems, the Waggle platform design includes features that can support longitudinal studies, deployments with varying communication links, and remote management capabilities. Waggle lowers the barrier for scientists to incorporate real-time data from their sensors into their computations and to manipulate the sensors or provide feedback through actuators. A standardized software and hardware design allows quick addition of new sensors/actuators and associated software in the nodes and enables them to be coupled with computational codes both in situ and on external compute infrastructure. The Waggle framework currently drives the deployment of two observational systems - a portable and self-sufficient weather platform for study of small-scale effects in Chicago's urban core and an open-ended distributed instrument in Chicago that aims to support several research pursuits across a broad range of disciplines including urban planning, microbiology and computer science. Built around open-source software, hardware, and the Linux OS, the Waggle system comprises two components - the Waggle field-node and the Waggle cloud-computing infrastructure. The Waggle field-node affords a modular, scalable, fault-tolerant, secure, and extensible platform for hosting sensors and actuators in the field. It supports in situ computation and data storage, and integration with cloud-computing infrastructure.
The Waggle cloud infrastructure is designed with the goal of scaling to several hundreds of thousands of Waggle nodes. It supports aggregating data from sensors hosted by the nodes, staging computation, relaying feedback to the nodes and serving data to end-users. We will discuss the Waggle design principles and their applicability to various observational research pursuits, and demonstrate its capabilities.
The ethics of smart cities and urban science.
Kitchin, Rob
2016-12-28
Software-enabled technologies and urban big data have become essential to the functioning of cities. Consequently, urban operational governance and city services are becoming highly responsive to a form of data-driven urbanism that is the key mode of production for smart cities. At the heart of data-driven urbanism is a computational understanding of city systems that reduces urban life to logic and calculative rules and procedures, which is underpinned by an instrumental rationality and realist epistemology. This rationality and epistemology are informed by and sustain urban science and urban informatics, which seek to make cities more knowable and controllable. This paper examines the forms, practices and ethics of smart cities and urban science, paying particular attention to: instrumental rationality and realist epistemology; privacy, datafication, dataveillance and geosurveillance; and data uses, such as social sorting and anticipatory governance. It argues that smart city initiatives and urban science need to be re-cast in three ways: a re-orientation in how cities are conceived; a reconfiguring of the underlying epistemology to openly recognize the contingent and relational nature of urban systems, processes and science; and the adoption of ethical principles designed to realize benefits of smart cities and urban science while reducing pernicious effects. This article is part of the themed issue 'The ethical impact of data science'. © 2016 The Author(s).
The Future of Pharmaceutical Manufacturing Sciences
2015-01-01
The entire pharmaceutical sector is in an urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial‐scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. With that regard, state‐of‐art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular‐based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed covering process measurement and process control solutions. Last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot‐melt processing and printing‐based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association J Pharm Sci 104:3612–3638, 2015 PMID:26280993
The Future of Pharmaceutical Manufacturing Sciences.
Rantanen, Jukka; Khinast, Johannes
2015-11-01
The entire pharmaceutical sector is in an urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial-scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. With that regard, state-of-art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular-based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed covering process measurement and process control solutions. Last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot-melt processing and printing-based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association.
Weighing the Balance of Science Literacy in Education and Public Policy
NASA Astrophysics Data System (ADS)
Buxner, S.; Impey, C.; Johnson, B.
2015-11-01
Science literacy is a concern of educators and policy makers in the United States and all over the world. Science literacy is defined by society and includes important knowledge for individuals that varies with culture and local knowledge systems. The technological societies of the western world have delegated the knowledge that underpins their everyday world to mechanics who know how their cars work, technicians who know how their computers work, and policy wonks who know how their individual choices and actions will affect the environment and their health. The scientific principles that frame and sculpt the technological world are invisible and mysterious to most people. A question for debate is whether this is a healthy situation, and if not, what to do about it. The panelists shared their prospects and challenges of building science literacy with individuals in the United States and with Tibetan monks. As they discussed their efforts working with these different populations, they shared lessons based on common issues and unique solutions based on local knowledge systems and communities of learners.
Integrating Computer Concepts into Principles of Accounting.
ERIC Educational Resources Information Center
Beck, Henry J.; Parrish, Roy James, Jr.
A package of instructional materials for an undergraduate principles of accounting course at Danville Community College was developed based upon the following assumptions: (1) the principles of accounting student does not need to be able to write computer programs; (2) computerized accounting concepts should be presented in this course; (3)…
ERIC Educational Resources Information Center
Musaitif, Linda M.
2013-01-01
Purpose: The purpose of this study was to determine the degree to which undergraduate full-time and adjunct faculty members in the health and science programs at community colleges in Southern California utilize the seven principles of good practice as measured by the Faculty Inventory of the Seven Principles for Good Practice in Undergraduate…
A Hydrological Perspective to Advance Understanding of the Water Cycle
NASA Astrophysics Data System (ADS)
Berghuijs, W.
2014-12-01
In principle hydrologists are scientists that study relationships within the water cycle. Yet, current technology makes it tempting for hydrology students to lose their "hydrological perspective" and become instead full-time computer programmers or statisticians. I assert that students should ensure their hydrological perspective thrives, notwithstanding the importance and possibilities of current technology. This perspective is necessary to advance the science of hydrology. As other hydrologists have pondered similar views before, I make no claims of originality here. I just hope that in presenting my perspective on this issue I may spark the interest of other early career hydrologists.
NSSDC activities with 12-inch optical disk drives
NASA Technical Reports Server (NTRS)
Lowrey, Barbara E.; Lopez-Swafford, Brian
1986-01-01
The development status of optical-disk data transfer and storage technology at the National Space Science Data Center (NSSDC) is surveyed. The aim of the R&D program is to facilitate the exchange of large volumes of data. Current efforts focus on a 12-inch 1-Gbyte write-once/read-many disk and a disk drive which interfaces with VAX/VMS computer systems. The history of disk development at NSSDC is traced; the results of integration and performance tests are summarized; the operating principles of the 12-inch system are explained and illustrated with diagrams; and the need for greater standardization is indicated.
10 Tips to Reduce Your Chance of Losing Vision from the Most Common Cause of Blindness
2018-2019 Basic and Clinical Science Course, Section 02: Fundamentals and Principles of Ophthalmology
Know Your Discipline: Teaching the Philosophy of Computer Science
ERIC Educational Resources Information Center
Tedre, Matti
2007-01-01
The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…
Effect of Graphene with Nanopores on Metal Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Hu; Chen, Xianlang; Wang, Lei
Porous graphene, which is a novel type of defective graphene, shows excellent potential as a support material for metal clusters. In this work, the stability and electronic structures of metal clusters (Pd, Ir, Rh) supported on pristine graphene and graphene with different sizes of nanopore were investigated by first-principles density functional theory (DFT) calculations. Thereafter, CO adsorption and oxidation reaction on the Pd-graphene system were chosen to evaluate its catalytic performance. Graphene with a nanopore can strongly stabilize the metal clusters and cause a substantial downshift of the d-band center of the metal clusters, thus decreasing CO adsorption. All binding energies, d-band centers, and adsorption energies show a linear change with the size of the nanopore: a bigger nanopore corresponds to stronger metal cluster bonding to the graphene, a lower downshift of the d-band center, and weaker CO adsorption. By using a suitably sized nanopore, supported Pd clusters on the graphene will have similar CO and O2 adsorption ability, thus leading to superior CO tolerance. The DFT-calculated reaction energy barriers show that graphene with a nanopore is a superior catalyst for the CO oxidation reaction. These properties can play an important role in instructing graphene-supported metal catalyst preparation to prevent the diffusion or agglomeration of metal clusters and enhance catalytic performance. This work was supported by the National Basic Research Program of China (973 Program) (2013CB733501), the National Natural Science Foundation of China (NSFC-21176221, 21136001, 21101137, 21306169, and 91334013). D. Mei acknowledges the support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle.
Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC).
Opting in and Creating Demand: Why Young People Choose to Teach Mathematics to Each Other
NASA Astrophysics Data System (ADS)
Tucker-Raymond, Eli; Lewis, Naama; Moses, Maisha; Milner, Chad
2016-12-01
Access to science, technology, engineering, and mathematics fields serves as a key entry point to economic mobility and civic enfranchisement. Such access must take seriously the intellectual power of the knowledge and practices of non-dominant youth. In our case, this has meant to shift epistemic authority in mathematics from academic institutions to young people themselves. This article is about why high school-aged students, from underrepresented groups, choose to participate in an out-of-school time program in which they teach younger children in the domains of mathematics and computer science. It argues for programmatic principles based on access, identity engagement, relationship building, and connections to community to support underrepresented youth as learners, teachers, leaders, and organizers in mathematics-related activities using game design as the focus of activity.
Engineering models for catastrophe risk and their application to insurance
NASA Astrophysics Data System (ADS)
Dong, Weimin
2002-06-01
Internationally, earthquake insurance, like other lines of insurance (fire, auto), has historically used an actuarial approach: premium rates are determined from historical loss experience. Because earthquakes are rare events with severe consequences, irrationally determined premium rates and a poor understanding of the scale of potential losses left many insurance companies insolvent after the 1994 Northridge earthquake. With recent advances in earth science, computer science, and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.
Complexity science and leadership in healthcare.
Burns, J P
2001-10-01
The emerging field of complexity science offers an alternative leadership strategy for the chaotic, complex healthcare environment. A survey revealed that healthcare leaders intuitively support principles of complexity science. Leadership that uses complexity principles offers opportunities in the chaotic healthcare environment to focus less on prediction and control and more on fostering relationships and creating conditions in which complex adaptive systems can evolve to produce creative outcomes.
Engineering and physical sciences in oncology: challenges and opportunities.
Mitchell, Michael J; Jain, Rakesh K; Langer, Robert
2017-11-01
The principles of engineering and physics have been applied to oncology for nearly 50 years. Engineers and physical scientists have made contributions to all aspects of cancer biology, from quantitative understanding of tumour growth and progression to improved detection and treatment of cancer. Many early efforts focused on experimental and computational modelling of drug distribution, cell cycle kinetics and tumour growth dynamics. In the past decade, we have witnessed exponential growth at the interface of engineering, physics and oncology, fuelled by advances in fields including materials science, microfabrication, nanomedicine, microfluidics and imaging, and catalysed by new programmes at the National Institutes of Health (NIH), including the National Institute of Biomedical Imaging and Bioengineering (NIBIB), Physical Sciences in Oncology, and the National Cancer Institute (NCI) Alliance for Nanotechnology. Here, we review the advances made at the interface of engineering and physical sciences and oncology in four important areas: the physical microenvironment of the tumour; technological advances in drug delivery; cellular and molecular imaging; and microfluidics and microfabrication. We discuss the research advances, opportunities and challenges for integrating engineering and physical sciences with oncology to develop new methods to study, detect and treat cancer, and we describe the future outlook for these emerging areas.
Sex differences in science museum exhibit attraction
NASA Astrophysics Data System (ADS)
Arámbula Greenfield, Teresa
This study examines the relative attraction of hands-on, interactive science museum exhibits for females and males. Studies have demonstrated that such exhibits can be effective learning experiences for children, with both academic and affective benefits. Other studies have shown that girls and boys do not always experience the same science-related educational opportunities and that, even when they do, they do not necessarily receive the same benefits from them. These early differences can lead to more serious educational and professional disparities later in life. As interactive museum exhibits represent a science experience that is readily available to both girls and boys, the question arose as to whether they were being used similarly by the two groups, as well as by adult women and men. It was found that both girls and boys used all types of exhibits, but that girls were more likely than boys to use puzzles and exhibits focusing on the human body; boys were more likely than girls to use computers and exhibits illustrating physical science principles. However, this was less true of children accompanied by adults (parents) than it was of unaccompanied children on school field trips who roamed the museum more freely.
Received: 16 February 1994; Revised: 3 February 1995
NASA Astrophysics Data System (ADS)
Musaitif, Linda M.
Purpose. The purpose of this study was to determine the degree to which undergraduate full-time and adjunct faculty members in the health and science programs at community colleges in Southern California utilize the seven principles of good practice, as measured by the Faculty Inventory of the Seven Principles for Good Practice in Undergraduate Education. A second purpose was to compare degree of utilization by gender and class size. Methodology. This is a quantitative study in which data gathered through a Likert-scale survey were systematically and mathematically assessed to model the use of the principles by the target population of full-time and adjunct faculty in health/science programs at community colleges in Southern California. Findings. Examination of the data revealed that both full-time and adjunct faculty members of Southern California community colleges perceive themselves as utilizing the seven principles of good practice to a high degree. There were no statistically significant data to suggest a discrepancy between full-time and adjunct professors' perceptions of their utilization of the seven principles. Overall, male faculty members perceived themselves as utilizing the principles to a greater degree than female faculty. The data also suggest that faculty with class sizes of 60 or larger utilized the seven principles more frequently than professors with smaller classes. Conclusions. Full-time and adjunct professors of the health and sciences in Southern California community colleges perceive themselves as utilizing the seven principles of good practice to a high degree. Recommendations. This study suggests many avenues for future research, including the degree to which negative economic factors such as budget cuts and demands affect the utilization of the seven principles.
Also recommended is a study comparing students' perceptions of faculty's utilization of the seven principles of good practice in the classroom with faculty's self-perception.
1986-01-01
To articulate a view of chemical carcinogenesis that scientists generally hold in common today, and to draw upon this understanding to compose guiding principles that the regulatory agencies can use as a basis for establishing guidelines for assessing carcinogenic risk under the legislative acts they are charged to implement, the Office of Science and Technology Policy, Executive Office of the White House, drew on the expertise of a number of regulatory agencies to elucidate present scientific views in critical areas of the major disciplines important to the process of risk assessment. The document is composed of two major sections, Principles and State-of-the-Science. The latter consists of subsections on the mechanisms of carcinogenesis, short-term and long-term testing, and epidemiology, which are important components in the risk-assessment step of hazard identification. These subsections are followed by one on exposure assessment, and a final section that includes analyses of dose-response (hazard) assessment and risk characterization. The principles are derived from considerations in each of the subsections. Because of present gaps in understanding, the principles contain judgmental (science policy) decisions on major unresolved issues as well as statements of what is generally accepted as fact. These judgments are basically assumptions, and they are responsible for much of the uncertainty in the process of risk assessment. There was an attempt to distinguish policy from fact clearly. The subsections of the State-of-the-Science portion provide the underlying support for the principles articulated; to read the "Principles" section without a full appreciation of the State-of-the-Science section is to invite oversimplification and misinterpretation. Finally, suggestions are made for future research efforts that will improve the process of risk assessment. PMID:3530737
Nontrivial Quantum Effects in Biology: A Skeptical Physicists' View
NASA Astrophysics Data System (ADS)
Wiseman, Howard; Eisert, Jens
The following sections are included: * Introduction * A Quantum Life Principle * A quantum chemistry principle? * The anthropic principle * Quantum Computing in the Brain * Nature did everything first? * Decoherence as the make or break issue * Quantum error correction * Uselessness of quantum algorithms for organisms * Quantum Computing in Genetics * Quantum search * Teleological aspects and the fast-track to life * Quantum Consciousness * Computability and free will * Time scales * Quantum Free Will * Predictability and free will * Determinism and free will * Acknowledgements * References
Physical Principle for Generation of Randomness
NASA Technical Reports Server (NTRS)
Zak, Michail
2009-01-01
A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)
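The role that generated randomness plays in the method above can be seen in a minimal sketch. This illustration deliberately uses Python's standard pseudorandom generator, which is exactly the component the proposed physical principle would replace; the function name and the 10-dimensional test integrand are hypothetical choices for illustration, not part of the original work. Note that the sampling cost depends only on the number of samples, not on the dimension, which is the dimensionality-independence the abstract highlights.

```python
import random

def mc_estimate(f, dim, n_samples, seed=0):
    """Monte Carlo estimate of the mean of f over the unit hypercube [0,1]^dim.
    Cost scales with n_samples, not with dim."""
    rng = random.Random(seed)  # stand-in for the physics-based randomness source
    total = 0.0
    for _ in range(n_samples):
        x = [rng.random() for _ in range(dim)]  # one random point per sample
        total += f(x)
    return total / n_samples

# Example: the mean of sum(x_i) over [0,1]^10 is exactly 5.0.
est = mc_estimate(lambda x: sum(x), dim=10, n_samples=20000)
print(est)
```

Swapping `rng.random` for a physics-based randomness source would leave the rest of the estimator untouched, which is why the choice of generator is cleanly separable from the simulation itself.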
PubMed on Tap: discovering design principles for online information delivery to handheld computers.
Hauser, Susan E; Demner-Fushman, Dina; Ford, Glenn; Thoma, George R
2004-01-01
Online access to biomedical information from handheld computers will be a valuable adjunct to other popular medical applications if information delivery systems are designed with handheld computers in mind. The goal of this project is to discover design principles that facilitate practitioners' access to online medical information at the point of care. A prototype system was developed to serve as a testbed for this research. Using the testbed, an initial evaluation has yielded several user interface design principles. Continued research is expected to discover additional user interface design principles as well as guidelines for results organization and system performance.
Designing User-Computer Dialogues: Basic Principles and Guidelines.
ERIC Educational Resources Information Center
Harrell, Thomas H.
This discussion of the design of computerized psychological assessment or testing instruments stresses the importance of the well-designed computer-user interface. The principles underlying the three main functional elements of computer-user dialogue--data entry, data display, and sequential control--are discussed, and basic guidelines derived…
LeChâtelier's Principle in the Sciences
NASA Astrophysics Data System (ADS)
Thomsen, Volker B. E.
2000-02-01
LeChâtelier's principle of chemical equilibrium is actually a very general statement about systems in equilibrium and their behavior when subjected to external force or stress. Although one almost never finds mention of his name or law in other sciences, analogous principles and concepts do exist. In this note we examine some of the similar forms taken by this chemical principle in the fields of physics, geology, biology, and economics. Lenz's law in physics is an example of electromagnetic equilibrium and the geological principle of isostatic uplift concerns mechanical equilibrium. Both are strictly consequences of conservation of energy. LeChâtelier's principle deals with thermodynamic equilibrium and involves both the first and second laws of thermodynamics. The concept of homeostasis in biology and the economic law of supply and demand are both equilibrium-like principles, but involve systems in the steady state. However, all these principles involve the stability of the system under consideration and the analogies presented may be useful in the teaching of LeChâtelier's principle.
Conventional Principles in Science: On the foundations and development of the relativized a priori
NASA Astrophysics Data System (ADS)
Ivanova, Milena; Farr, Matt
2015-11-01
The present volume consists of a collection of papers originally presented at the conference Conventional Principles in Science, held at the University of Bristol, August 2011, which featured contributions on the history and contemporary development of the notion of 'relativized a priori' principles in science, from Henri Poincaré's conventionalism to Michael Friedman's contemporary defence of the relativized a priori. In Science and Hypothesis, Poincaré assessed the problematic epistemic status of Euclidean geometry and Newton's laws of motion, famously arguing that each has the status of 'convention' in that their justification is neither analytic nor empirical in nature. In The Theory of Relativity and A Priori Knowledge, Hans Reichenbach, in light of the general theory of relativity, proposed an updated notion of the Kantian synthetic a priori to account for the dynamic inter-theoretic status of geometry and other non-empirical physical principles. Reichenbach noted that one may reject the 'necessarily true' aspect of the synthetic a priori whilst preserving the feature of being constitutive of the object of knowledge. Such constitutive principles are theory-relative, as illustrated by the privileged role of non-Euclidean geometry in general relativity theory. This idea of relativized a priori principles in spacetime physics has been analysed and developed at great length in the modern literature in the work of Michael Friedman, in particular the roles played by the light postulate and the equivalence principle - in special and general relativity respectively - in defining the central terms of their respective theories and connecting the abstract mathematical formalism of the theories with their empirical content. 
The papers in this volume guide the reader through the historical development of conventional and constitutive principles in science, from the foundational work of Poincaré, Reichenbach and others, to contemporary issues and applications of the relativized a priori concerning the notion of measurement, physical possibility, and the interpretation of scientific theories.
Microgravity science & applications. Program tasks and bibliography for FY 1995
NASA Technical Reports Server (NTRS)
1996-01-01
This annual report includes research projects funded by the Office of Life and Microgravity Sciences and Applications, Microgravity Science and Applications Division, during FY 1994. It is a compilation of program tasks (objective, description, significance, progress, students funded under research, and bibliographic citations) for flight research and ground-based research in five major scientific disciplines: benchmark science, biotechnology, combustion science, fluid physics, and materials science. Advanced technology development (ATD) program task descriptions are also included. The bibliography cites the related principal investigator (PI) publications and presentations for these program tasks in FY 1994. Three appendices include a Table of Acronyms, a Guest Investigator index, and a Principal Investigator index.
C3 Domain Analysis, Lessons Learned
1993-09-30
organize the domain. This approach is heavily based on the principles of library science and is geared toward a reuse effort with a large library-like...method adapts many principles from library science to the organization and implementation of a reuse library.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fredrickson, Daniel C
2015-06-23
Final technical report for "Chemical Frustration: A Design Principle for the Discovery of New Complex Alloy and Intermetallic Phases" funded by the Office of Science through the Materials Chemistry Program of the Office of Basic Energy Sciences.
Projective simulation for artificial intelligence
NASA Astrophysics Data System (ADS)
Briegel, Hans J.; de Las Cuevas, Gemma
2012-05-01
We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation.
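As a rough illustration of the clip-network idea described above, here is a minimal two-layer sketch: percept clips connect to action clips by weighted edges (h-values), a one-step random walk selects an action with probability proportional to the edge weight, and rewarded edges are reinforced. The class name, the traffic-light percepts and actions, and the simple additive update are all illustrative assumptions; the actual model also includes clip composition, damping, and multi-step walks that this sketch omits.

```python
import random

class PSAgent:
    """Greatly simplified two-layer projective-simulation-style agent."""
    def __init__(self, percepts, actions, seed=0):
        # Every percept-action edge starts with the same h-value.
        self.h = {(p, a): 1.0 for p in percepts for a in actions}
        self.actions = actions
        self.rng = random.Random(seed)

    def act(self, percept):
        # Random walk: hop from the percept clip to an action clip
        # with probability proportional to the edge's h-value.
        weights = [self.h[(percept, a)] for a in self.actions]
        return self.rng.choices(self.actions, weights=weights)[0]

    def learn(self, percept, action, reward):
        # Reinforce the traversed edge when the action was rewarded.
        self.h[(percept, action)] += reward

agent = PSAgent(percepts=["red", "green"], actions=["stop", "go"])
for _ in range(200):  # reward the "correct" pairings
    for percept, good in [("red", "stop"), ("green", "go")]:
        a = agent.act(percept)
        agent.learn(percept, a, 1.0 if a == good else 0.0)
print(agent.act("red"))  # after training, almost always "stop"
```

Because unrewarded edges never grow, the walk's probabilities drift toward the rewarded actions, which is the basic reinforcement-learning behavior the paper connects to quantum walks.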
Ammann, Alexander
2016-01-01
"Digitality" (as opposed to "digitalization"--the conversion from the analog domain to the digital domain) will open up a whole new world that does not originate from the analog world. Contemporary research in the field of neural concepts and neuromorphic computing systems will lead to convergences between the world of digitality and the world of neuronality, giving the theme "Knowledge and Culture" a new meaning. The simulation of virtual multidimensional and contextual spaces will transform the transfer of knowledge from a uni- and bidirectional process into an interactive experience. We will learn to learn in a ubiquitous computing environment and will abandon conventional curriculum organization principles. The adaptation of individualized ontologies will result in the emergence of a new world of knowledge in which knowledge evolves from a cultural heritage into a commodity.
Matthews, M E; Norback, J P
1984-06-01
An organizational framework for integrating foodservice data into an information system for management decision making is presented. The framework involves the application to foodservice of principles developed by the disciplines of managerial economics and accounting, mathematics, computer science, and information systems. The first step is to conceptualize a foodservice system from an input-output perspective, in which inputs are units of resources available to managers and outputs are servings of menu items. Next, methods of full cost accounting, from the management accounting literature, are suggested as a mechanism for developing and assigning costs of using resources within a foodservice operation. Then matrix multiplication is used to illustrate types of information that matrix data structures could make available for management planning and control when combined with a conversational mode of computer programming.
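The matrix-multiplication step described above can be sketched with hypothetical numbers: rows are menu items, columns are resource inputs per serving, and multiplying by a unit-cost vector yields a full cost per serving. The menu items, resource categories, and prices below are invented for illustration; a real full-cost model would include many more resource columns (overhead, energy, and so on).

```python
# Rows = menu items; columns = resources used per serving.
recipe = [              # [flour_kg, cheese_kg, labor_hr] per serving
    [0.10, 0.05, 0.20],     # pizza slice
    [0.00, 0.10, 0.10],     # cheese omelet
]
unit_cost = [0.80, 6.00, 15.00]   # $ per kg flour, kg cheese, labor hour

def matvec(matrix, vector):
    """Matrix-vector product: one full-cost figure per menu item."""
    return [sum(a * b for a, b in zip(row, vector)) for row in matrix]

costs = matvec(recipe, unit_cost)
print([round(c, 2) for c in costs])  # prints [3.38, 2.1]
```

Scaling the same product by forecast serving counts would give the planning and control totals the framework aims to expose to managers.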
Projective simulation for artificial intelligence
Briegel, Hans J.; De las Cuevas, Gemma
2012-01-01
We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation. PMID:22590690
NASA Astrophysics Data System (ADS)
Busch, K. C.
2012-12-01
Even though there exists a high degree of consensus among scientists about climate change, doubt has actually increased over the last five years within the general U.S. public. In 2006, 79% of those polled agreed that there is evidence for global warming, while only 59% agreed in 2010 (Pew Research Center, 2010). The source for this doubt can be partially attributed to lack of knowledge. Formal education is one mechanism that potentially can address inadequate public understanding as school is the primary place where students - and future citizens - learn about the climate. In a joint effort, several governmental agencies, non-governmental organizations, scientists and educators have created a framework called The Essential Principles of Climate Science Literacy, detailing seven concepts that are deemed vital for individuals and communities to understand Earth's climate system (USGCRP, 2009). Can students reach climate literacy - as defined by these 7 concepts - if they are taught using a curriculum based on the current state standards? To answer this question, the K-12 state science teaching and learning standards for Texas and California - two states that heavily influence nation-wide textbook creation - were compared against the Essential Principles. The data analysis consisted of two stages, looking for: 1) direct reference to "climate" and "climate change" and 2) indirect reference to the 7 Essential Principles through axial coding. The word "climate" appears in the California K-12 science standards 4 times and in the Texas standards 7 times. The word "climate change" appears in the California and Texas standards only 3 times each. Indirect references to the 7 Essential Principles of climate science literacy were more numerous. Broadly, California covered 6 of the principles while Texas covered all 7. 
In looking at the 7 principles, the second one, "Climate is regulated by complex interactions among components of the Earth system," was the most substantively addressed. Least covered were number 6, "Human activities are impacting the climate system," and number 7, "Climate change will have consequences for the Earth system and human lives." Most references, either direct or indirect, occurred in the high school standards for earth science, a class not required for graduation in either state. This research points to the gaps between what the 7 Essential Principles of Climate Literacy define as essential knowledge and what students may learn in their K-12 science classes. Thus, the formal system does not seem to offer an experience that can develop a more knowledgeable citizenry able to make wise personal and policy decisions about climate change, falling short of the ultimate goal of achieving widespread climate literacy. Especially troubling was the sparse attention to the principles addressing the human connection to the climate, principles 6 and 7. If climate-literate citizens are to make "wise personal and policy decisions" (USGCRP, 2009), these two principles especially are vital. This research, therefore, has been valuable for identifying current shortcomings in state standards.
The ethics of smart cities and urban science
2016-01-01
Software-enabled technologies and urban big data have become essential to the functioning of cities. Consequently, urban operational governance and city services are becoming highly responsive to a form of data-driven urbanism that is the key mode of production for smart cities. At the heart of data-driven urbanism is a computational understanding of city systems that reduces urban life to logic and calculative rules and procedures, which is underpinned by an instrumental rationality and realist epistemology. This rationality and epistemology are informed by and sustains urban science and urban informatics, which seek to make cities more knowable and controllable. This paper examines the forms, practices and ethics of smart cities and urban science, paying particular attention to: instrumental rationality and realist epistemology; privacy, datafication, dataveillance and geosurveillance; and data uses, such as social sorting and anticipatory governance. It argues that smart city initiatives and urban science need to be re-cast in three ways: a re-orientation in how cities are conceived; a reconfiguring of the underlying epistemology to openly recognize the contingent and relational nature of urban systems, processes and science; and the adoption of ethical principles designed to realize benefits of smart cities and urban science while reducing pernicious effects. This article is part of the themed issue ‘The ethical impact of data science’. PMID:28336794
Efficient Predictions of Excited State for Nanomaterials Using Aces 3 and 4
2017-12-20
Excited states of nanomaterials are predicted by first-principle methods in the software package ACES, using large parallel computers growing to the exascale. Subject terms: computer modeling, excited states, optical properties, structure, stability, activation barriers, first-principle methods, parallel computing. Progress with new density functional methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Jianwei; Remsing, Richard C.; Zhang, Yubo
2016-06-13
One atom or molecule binds to another through various types of bond, the strengths of which range from several meV to several eV. Although some computational methods can provide accurate descriptions of all bond types, those methods are not efficient enough for many studies (for example, large systems, ab initio molecular dynamics and high-throughput searches for functional materials). Here, we show that the recently developed non-empirical strongly constrained and appropriately normed (SCAN) meta-generalized gradient approximation (meta-GGA) within the density functional theory framework predicts accurate geometries and energies of diversely bonded molecules and materials (including covalent, metallic, ionic, hydrogen and van der Waals bonds). This represents a significant improvement at comparable efficiency over its predecessors, the GGAs that currently dominate materials computation. Often, SCAN matches or improves on the accuracy of a computationally expensive hybrid functional, at almost-GGA cost. SCAN is therefore expected to have a broad impact on chemistry and materials science.
Enhanced Molecular Dynamics Methods Applied to Drug Design Projects.
Ziada, Sonia; Braka, Abdennour; Diharce, Julien; Aci-Sèche, Samia; Bonnet, Pascal
2018-01-01
Nobel Laureate Richard P. Feynman stated: "[…] everything that living things do can be understood in terms of jiggling and wiggling of atoms […]." The importance of computer simulations of macromolecules, which use classical mechanics principles to describe atom behavior, is widely acknowledged, and nowadays they are applied in many fields such as materials science and drug discovery. With the increase of computing power, molecular dynamics simulations can be applied to understand biological mechanisms at realistic timescales. In this chapter, we share our computational experience, providing a global view of two widely used enhanced molecular dynamics methods for studying protein structure and dynamics by describing their characteristics and limits, and we provide some examples of their applications in drug design. We also discuss the appropriate choice of software and hardware. In a detailed practical procedure, we describe how to set up, run, and analyze two main molecular dynamics methods, the umbrella sampling (US) and the accelerated molecular dynamics (aMD) methods.
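The enhanced methods the chapter surveys (US and aMD) are built on top of plain classical molecular dynamics. As a minimal illustration of that underlying mechanics, and not of the enhanced methods themselves, here is a velocity-Verlet integrator applied to a 1-D harmonic "bond"; all parameter values are illustrative.

```python
import math

def velocity_verlet(x, v, force, mass, dt, steps):
    """Velocity Verlet: the basic time integrator behind classical MD engines.
    Returns the trajectory of (position, velocity) pairs."""
    traj = [(x, v)]
    f = force(x)
    for _ in range(steps):
        x = x + v * dt + 0.5 * (f / mass) * dt * dt  # position update
        f_new = force(x)
        v = v + 0.5 * (f + f_new) / mass * dt        # velocity update (avg force)
        f = f_new
        traj.append((x, v))
    return traj

# Harmonic bond with spring constant k = 1: analytic solution is x(t) = cos(t).
traj = velocity_verlet(x=1.0, v=0.0, force=lambda x: -x, mass=1.0,
                       dt=0.01, steps=628)  # ~one period (2*pi)
x_end, v_end = traj[-1]
print(x_end)
```

Umbrella sampling would add a biasing term to `force` and aMD would modify the potential when it falls below a threshold; both leave this integration loop unchanged, which is why they slot into existing MD codes.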
Sun, Jianwei; Remsing, Richard C; Zhang, Yubo; Sun, Zhaoru; Ruzsinszky, Adrienn; Peng, Haowei; Yang, Zenghui; Paul, Arpita; Waghmare, Umesh; Wu, Xifan; Klein, Michael L; Perdew, John P
2016-09-01
One atom or molecule binds to another through various types of bond, the strengths of which range from several meV to several eV. Although some computational methods can provide accurate descriptions of all bond types, those methods are not efficient enough for many studies (for example, large systems, ab initio molecular dynamics and high-throughput searches for functional materials). Here, we show that the recently developed non-empirical strongly constrained and appropriately normed (SCAN) meta-generalized gradient approximation (meta-GGA) within the density functional theory framework predicts accurate geometries and energies of diversely bonded molecules and materials (including covalent, metallic, ionic, hydrogen and van der Waals bonds). This represents a significant improvement at comparable efficiency over its predecessors, the GGAs that currently dominate materials computation. Often, SCAN matches or improves on the accuracy of a computationally expensive hybrid functional, at almost-GGA cost. SCAN is therefore expected to have a broad impact on chemistry and materials science.
Designing for students' science learning using argumentation and classroom debate
NASA Astrophysics Data System (ADS)
Bell, Philip Laverne
1998-12-01
This research investigates how to design and introduce an educational innovation into a classroom setting to support learning. The research yields cognitive design principles for instruction involving scientific argumentation and debate. Specifically, eighth-grade students used a computer learning environment to construct scientific arguments and to participate in a classroom debate. The instruction was designed to help students integrate their science understanding by debating: How far does light go, does light die out over distance or go forever until absorbed? This research explores the tension between focusing students' conceptual change on specific scientific phenomena and their development of integrated understanding. I focus on the importance of connecting students' everyday experiences and intuitions to their science learning. The work reported here characterizes how students see the world through a filter of their own understanding. It explores how individual and social mechanisms in instruction support students as they expand the range of ideas under consideration and distinguish between these ideas using scientific criteria. Instruction supported students as they engaged in argumentation and debate on a set of multimedia evidence items from the World-Wide-Web. An argument editor called SenseMaker was designed and studied with the intent of making individual and group thinking visible during instruction. Over multiple classroom trials, different student cohorts were increasingly supported in scientific argumentation involving systematic coordination of evidence with theoretical ideas about light. Students' knowledge representations were used as mediating "learning artifacts" during classroom debate. Two argumentation conditions were investigated. The Full Scope group prepared to defend either theoretical position in the debate. 
These students created arguments that included more theoretical conjectures and made more conceptual progress in understanding light. The Personal Scope group prepared to defend their original opinion about the debate. These students produced more acausal descriptions of evidence and theorized less in their arguments. Regardless of students' prior knowledge of light, the Full Scope condition resulted in a more integrated understanding. Results from the research were synthesized into design principles geared towards helping future designers. Sharing and refining cognitive design principles offers a productive focus for developing a design science for education.
Basic Principles of Animal Science. Reprinted.
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee.
The reference book is designed to fulfill the need for organized subject matter dealing with basic principles of animal science to be incorporated into the high school agriculture curriculum. The material presented is scientific knowledge basic to livestock production. Five units contain specific information on the following topics: anatomy and…
Environmental Science. An Experimental Programme for Primary Teachers.
ERIC Educational Resources Information Center
Linke, R. D.
An experimental course covering some of the fundamental principles and terminology associated with environmental science and the application of these principles to various contemporary problems is summarized in this report. The course involved a series of lectures together with a program of specific seminar and discussion topics presented by the…
Improving FCS Accountability: Increasing STEM Awareness with Interior Design Modules
ERIC Educational Resources Information Center
Etheredge, Jessica; Moody, Dana; Cooper, Ashley
2014-01-01
This paper demonstrates ways in which family and consumer sciences (FCS) educators can explore more opportunities to integrate Science, Technology, Engineering, and Math (STEM) principles into secondary education curriculum. Interior design is used as a case study for creating learning modules that incorporate STEM principles in a creative and…
Personalization vs. How People Learn
ERIC Educational Resources Information Center
Riley, Benjamin
2017-01-01
Riley asserts that some findings of cognitive science conflict with key principles of personalized learning--that students should control the content of their learning and that they should control the pace of their learning. A personalized approach is in conflict with the cognitive science principle that committing key facts in a discipline to…
ERIC Educational Resources Information Center
Huang, Tzu-Hua; Liu, Yuan-Chen
2017-01-01
This paper reflects thorough consideration of cultural perspectives in the establishment of science curriculum development principles in Taiwan. The authority explicitly states that education measures and activities of aboriginal peoples' ethnic group should be implemented consistently to incorporate their history, language, art, living customs,…
21 CFR 570.20 - General principles for evaluating the safety of food additives.
Code of Federal Regulations, 2010 CFR
2010-04-01
... of Sciences-National Research Council. A petition will not be denied, however, by reason of the... of Sciences-National Research Council if, from available evidence, the Commissioner finds that the... purposes of this section, the principles for evaluating safety of additives set forth in the above...
Rockets: An Educator's Guide with Activities in Science, Mathematics, and Technology.
ERIC Educational Resources Information Center
National Aeronautics and Space Administration, Washington, DC.
This educational guide discusses rockets and includes activities in science, mathematics, and technology. It begins with background information on the history of rocketry, scientific principles, and practical rocketry. The sections on scientific principles and practical rocketry focus on Sir Isaac Newton's Three Laws of Motion. These laws explain…
First-Principles Thermodynamics Study of Spinel MgAl 2 O 4 Surface Stability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Qiuxia; Wang, Jian-guo; Wang, Yong
The surface stability of all possible terminations for three low-index (111, 110, 100) structures of the spinel MgAl2O4 has been studied using a first-principles-based thermodynamic approach. The surface Gibbs free energy results indicate that the 100_AlO2 termination is the most stable surface structure under ultra-high vacuum at T = 1100 K in both Al-poor and Al-rich environments. With increasing oxygen pressure, the 111_O2(Al) termination becomes the most stable surface in the Al-rich environment. Oxygen vacancy formation is thermodynamically favorable over the 100_AlO2, 111_O2(Al), and the (111) structure with Mg/O connected terminations. On the basis of surface Gibbs free energies for both perfect and defective surface terminations, the 100_AlO2 and 111_O2(Al) are the most dominant surfaces in the Al-rich environment under atmospheric conditions. This is also consistent with our previously reported experimental observation. This work was supported by a Laboratory Directed Research and Development (LDRD) project of the Pacific Northwest National Laboratory (PNNL). The computing time was granted by the National Energy Research Scientific Computing Center (NERSC). Part of the computing time was also granted by a scientific theme user proposal in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL), which is a U.S. Department of Energy national scientific user facility located at PNNL in Richland, Washington.
NASA Astrophysics Data System (ADS)
Blikstein, Paulo
The goal of this dissertation is to explore relations between content, representation, and pedagogy, so as to understand the impact of the nascent field of complexity sciences on science, technology, engineering and mathematics (STEM) learning. Wilensky & Papert coined the term "structurations" to express the relationship between knowledge and its representational infrastructure. A change from one representational infrastructure to another they call a "restructuration." The complexity sciences have introduced a novel and powerful structuration: agent-based modeling. In contradistinction to traditional mathematical modeling, which relies on equational descriptions of macroscopic properties of systems, agent-based modeling focuses on a few archetypical micro-behaviors of "agents" to explain emergent macro-behaviors of the agent collective. Specifically, this dissertation is about a series of studies of undergraduate students' learning of materials science, in which two structurations are compared (equational and agent-based), consisting of both design research and empirical evaluation. I have designed MaterialSim, a constructionist suite of computer models, supporting materials and learning activities designed within the approach of agent-based modeling, and over four years conducted an empirical investigation of an undergraduate materials science course. The dissertation comprises three studies: Study 1 - diagnosis. I investigate current representational and pedagogical practices in engineering classrooms. Study 2 - laboratory studies. I investigate the cognition of students engaging in scientific inquiry through programming their own scientific models. Study 3 - classroom implementation. 
I investigate the characteristics, advantages, and trajectories of scientific content knowledge that is articulated in epistemic forms and representational infrastructures unique to the complexity sciences, as well as the feasibility of integrating constructionist, agent-based learning environments into engineering classrooms. Data sources include classroom observations, interviews, videotaped sessions of model-building, questionnaires, analysis of computer-generated logfiles, and quantitative and qualitative analysis of artifacts. Results show that (1) current representational and pedagogical practices in engineering classrooms were not up to the challenge of the complex content being taught, (2) by building their own scientific models, students developed a deeper understanding of core scientific concepts and learned how to better identify unifying principles and behaviors in materials science, and (3) programming computer models was feasible within a regular engineering classroom.
Applying principles from safety science to improve child protection.
Cull, Michael J; Rzepnicki, Tina L; O'Day, Kathryn; Epstein, Richard A
2013-01-01
Child Protective Services Agencies (CPSAs) share many characteristics with other organizations operating in high-risk, high-profile industries. Over the past 50 years, industries as diverse as aviation, nuclear power, and healthcare have applied principles from safety science to improve practice. The current paper describes the rationale, characteristics, and challenges of applying concepts from the safety culture literature to CPSAs. Preliminary efforts to apply key principles aimed at improving child safety and well-being in two states are also presented.
Teaching Climate Social Science and Its Practices: A Two-Pronged Approach to Climate Literacy
NASA Astrophysics Data System (ADS)
Shwom, R.; Isenhour, C.; McCright, A.; Robinson, J.; Jordan, R.
2014-12-01
The Essential Principles of Climate Science Literacy states that a climate-literate individual can: "understand the essential principles of Earth's climate system, assess scientifically credible information about climate change, communicate about climate and climate change in a meaningful way, and make informed and responsible decisions with regard to actions that may affect climate." We argue that further integration of the social science dimensions of climate change will advance the climate literacy goals of communication and responsible actions. The underlying rationale for this argues: 1) teaching the habits of mind and scientific practices that have synergies across the social and natural sciences can strengthen students ability to understand and assess science in general and that 2) understanding the empirical research on the social, political, and economic processes (including climate science itself) that are part of the climate system is an important step for enabling effective action and communication. For example, while climate literacy has often identified the public's faulty mental models of climate processes as a partial explanation of complacency, emerging research suggests that the public's mental models of the social world are equally or more important in leading to informed and responsible climate decisions. Building student's ability to think across the social and natural sciences by understanding "how we know what we know" through the sciences and a scientific understanding of the social world allows us to achieve climate literacy goals more systematically and completely. To enable this integration we first identify the robust social science insights for the climate science literacy principles that involve social systems. We then briefly identify significant social science contributions to climate science literacy that do not clearly fit within the seven climate literacy principles but arguably could advance climate literacy goals. 
We conclude with suggestions on how the identified social science insights could be integrated into climate literacy efforts.
Making objective decisions in mechanical engineering problems
NASA Astrophysics Data System (ADS)
Raicu, A.; Oanta, E.; Sabau, A.
2017-08-01
The decision-making process has a great influence on the development of a given project, the goal being to select an optimal choice in a given context. Because of its great importance, decision making has been studied with the methods of various sciences, culminating in game theory, which is considered the foundation of the science of logical decision making in many fields. The paper presents some basic ideas of game theory in order to offer the information necessary to understand multiple-criteria decision making (MCDM) problems in engineering. The solution is to transform the multiple-criteria problem into a single-criterion decision problem using the notion of utility, together with the weighted sum model or the weighted product model. The weighted importance of the criteria is computed using the so-called Step method applied to a relation of preferences between the criteria. Two relevant examples from engineering are also presented. Future directions of research include the use of other types of criteria, the development of computer-based instruments for general decision-making problems, and a software module based on expert-system principles to be included in the already operational Wiki software applications for polymeric materials.
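The weighted sum aggregation described in this abstract can be sketched in a few lines; the alternatives, criteria, and weights below are hypothetical illustrations, not values from the paper.

```python
# Weighted sum model (WSM): each alternative's score is the sum over
# criteria of (criterion weight) * (normalized utility of that criterion).
# All names and numbers here are hypothetical.

def weighted_sum_score(values, weights):
    """Aggregate normalized criterion utilities into a single score."""
    return sum(w * v for w, v in zip(weights, values))

# Normalized utilities (0..1, higher is better) for three design
# alternatives against three criteria (e.g. cost, reliability, weight).
alternatives = {
    "A": [0.8, 0.6, 0.9],
    "B": [0.5, 0.9, 0.7],
    "C": [0.9, 0.4, 0.6],
}
weights = [0.5, 0.3, 0.2]  # criterion weights, summing to 1

scores = {name: weighted_sum_score(v, weights) for name, v in alternatives.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 2))  # A 0.76
```

The weighted product model differs only in the aggregation step (a product of utilities raised to their weights instead of a weighted sum).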
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James
2012-01-01
Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.
Factors influencing exemplary science teachers' levels of computer use
NASA Astrophysics Data System (ADS)
Hakverdi, Meral
This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching Award (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, which made a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. 
The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability at using computers. The teachers' use of computer-related applications/tools during class, along with their personal self-efficacy, age, and gender, was highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and their gender were related to their use of computer-related applications/tools during class and to their students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.
Guidelines for developing vectorizable computer programs
NASA Technical Reports Server (NTRS)
Miner, E. W.
1982-01-01
Some fundamental principles for developing computer programs which are compatible with array-oriented computers are presented. The emphasis is on basic techniques for structuring computer codes which are applicable in FORTRAN and do not require a special programming language or exact a significant penalty on a scalar computer. Researchers who are using numerical techniques to solve problems in engineering can apply these basic principles and thus develop transportable computer programs (in FORTRAN) which contain much vectorizable code. The vector architecture of the ASC is discussed so that the requirements of array processing can be better appreciated. The "vectorization" of a finite-difference viscous shock-layer code is used as an example to illustrate the benefits and some of the difficulties involved. Increases in computing speed with vectorization are illustrated with results from the viscous shock-layer code and from a finite-element shock tube code. The applicability of these principles was substantiated through running programs on other computers with array-associated computing characteristics, such as the Hewlett-Packard (H-P) 1000-F.
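The structuring principle behind these guidelines, writing loops whose iterations are independent so an array processor can execute them in parallel, can be illustrated with a modern array language. The NumPy sketch below is an analogue of the FORTRAN style the report describes, not code from it.

```python
import numpy as np

# Scalar-style loop: each output element depends only on the *input*
# array, not on earlier outputs, so the loop has no carried dependence
# and is vectorizable in principle.
def smooth_loop(u):
    out = u.copy()
    for i in range(1, len(u) - 1):
        out[i] = 0.25 * u[i - 1] + 0.5 * u[i] + 0.25 * u[i + 1]
    return out

# Array-style equivalent: the same stencil written as one whole-array
# expression, the form an array-oriented compiler (or NumPy) executes
# without an explicit loop.
def smooth_vec(u):
    out = u.copy()
    out[1:-1] = 0.25 * u[:-2] + 0.5 * u[1:-1] + 0.25 * u[2:]
    return out

u = np.linspace(0.0, 1.0, 9)
assert np.allclose(smooth_loop(u), smooth_vec(u))
```

Had the loop read `out[i - 1]` instead of `u[i - 1]`, the iterations would depend on each other and the two forms would no longer agree, which is exactly the kind of structure the guidelines caution against.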
NASA Astrophysics Data System (ADS)
Lu, Deyu; Liu, Ping
2014-03-01
The DFT+U method has been widely employed in theoretical studies of various ceria systems to correct the delocalization bias of local and semi-local DFT functionals at moderate computational cost. To rationalize the Hubbard U of Ce 4f, we employed the first-principles linear response method to compute Hubbard U for Ce in ceria clusters, bulk phases, and surfaces. We found that, in contrast to the common approach of treating U as a constant, the Hubbard U varies over a wide range, from 4.1 eV to 6.7 eV, and exhibits a strong correlation with the Ce coordination numbers and Ce-O bond lengths rather than with the Ce 4f valence state. The variation of the Hubbard U can be explained by changes in the strength of local screening due to the O → Ce intersite transition. Our study represents a systematic, quantitative investigation of the relationship between the Hubbard U and the local atomic arrangement, enabling a DFT+environment-dependent-U scheme that can have potential impact on catalysis research in strongly correlated systems. This work is supported by the U.S. Department of Energy, Office of Basic Energy Sciences, under Contract No. DE-AC02-98CH10886.
Multiscale Simulations of Dynamics of Ferroelectric Domains
NASA Astrophysics Data System (ADS)
Liu, Shi
Ferroelectrics with switchable polarization have many important technological applications, which heavily rely on the interactions between the polarization and external perturbations. Understanding the dynamical response of ferroelectric materials is crucial for the discovery and development of new design principles and engineering strategies for optimized and breakthrough applications of ferroelectrics. We developed a multiscale computational approach that combines methods at different length and time scales to elucidate the connection between local structures, domain dynamics, and macroscopic finite-temperature properties of ferroelectrics. We started from first-principles calculations of ferroelectrics to build a model interatomic potential, enabling large-scale molecular dynamics (MD) simulations. The atomistic insights of nucleation and growth at the domain wall obtained from MD were then incorporated into a continuum model within the framework of Landau-Ginzburg-Devonshire theory. This progressive theoretical framework allows for the first time an efficient and accurate estimation of macroscopic properties such as the coercive field for a broad range of ferroelectrics from first principles. This multiscale approach has also been applied to explore the effect of dipolar defects on ferroelectric switching and to understand the origin of giant electro-strain coupling. ONR, NSF, Carnegie Institution for Science.
Peterson, J P S; Sarthour, R S; Souza, A M; Oliveira, I S; Goold, J; Modi, K; Soares-Pinto, D O; Céleri, L C
2016-04-01
Landauer's principle sets fundamental thermodynamical constraints for classical and quantum information processing, thus affecting not only various branches of physics, but also of computer science and engineering. Despite its importance, this principle was only recently experimentally considered for classical systems. Here we employ a nuclear magnetic resonance set-up to experimentally address the information to energy conversion in a quantum system. Specifically, we consider a three nuclear spins [Formula: see text] (qubits) molecule-the system, the reservoir and the ancilla-to measure the heat dissipated during the implementation of a global system-reservoir unitary interaction that changes the information content of the system. By employing an interferometric technique, we were able to reconstruct the heat distribution associated with the unitary interaction. Then, through quantum state tomography, we measured the relative change in the entropy of the system. In this way, we were able to verify that an operation that changes the information content of the system must necessarily generate heat in the reservoir, exactly as predicted by Landauer's principle. The scheme presented here allows for the detailed study of irreversible entropy production in quantum information processors.
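Landauer's principle referenced in this abstract bounds the heat dissipated by erasing information: at least kT ln 2 per bit at temperature T. A short numerical sketch (the temperature is an arbitrary example, not a value from the experiment):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)

def landauer_bound(T, bits=1):
    """Minimum heat (J) that must be dissipated to erase `bits` of
    information in a reservoir at temperature T (kelvin)."""
    return bits * k_B * T * math.log(2)

# Erasing one bit at room temperature (300 K) costs at least ~3e-21 J,
# far below the dissipation of present-day logic gates.
q = landauer_bound(300.0)
print(f"{q:.3e} J")
```

The bound scales linearly in both temperature and the number of erased bits, which is why it constrains classical and quantum information processing alike.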
Emerging Nanophotonic Applications Explored with Advanced Scientific Parallel Computing
NASA Astrophysics Data System (ADS)
Meng, Xiang
The domain of nanoscale optical science and technology is a combination of the classical world of electromagnetics and the quantum mechanical regime of atoms and molecules. Recent advancements in fabrication technology allow optical structures to be scaled down to nanoscale size or even to the atomic level, far smaller than the wavelength they are designed for. These nanostructures can have unique, controllable, and tunable optical properties, and their interactions with quantum materials can have important near-field and far-field optical responses. These optical properties have many important applications, ranging from efficient and tunable light sources, detectors, filters, modulators, and high-speed all-optical switches to next-generation classical and quantum computation and biophotonic medical sensors. This emerging field of nanoscience, known as nanophotonics, is highly interdisciplinary, requiring expertise in materials science, physics, electrical engineering, and scientific computing, modeling, and simulation. It has also become an important research field for investigating the science and engineering of light-matter interactions that take place on wavelength and subwavelength scales, where the nature of the nanostructured matter controls the interactions. In addition, fast advancements in computing capabilities, such as parallel computing, have become a critical element for investigating advanced nanophotonic devices. This role has taken on even greater urgency with the scale-down of device dimensions, as the design of these devices requires extensive memory and extremely long core hours. Thus distributed computing platforms associated with parallel computing are required for faster design processes. Scientific parallel computing constructs mathematical models and quantitative analysis techniques, and uses computing machines to analyze and solve otherwise intractable scientific challenges. 
In particular, parallel computing is a form of computation operating on the principle that large problems can often be divided into smaller ones, which are then solved concurrently. In this dissertation, we report a series of new nanophotonic developments using advanced parallel computing techniques. The applications include structure optimizations at the nanoscale both to control the electromagnetic response of materials and to manipulate nanoscale structures for enhanced field concentration, which enable breakthroughs in imaging and sensing systems (chapters 3 and 4) and improve the spatio-temporal resolution of spectroscopies (chapter 5). We also report investigations of the confinement of optical-matter interactions in the quantum mechanical regime, where size-dependent novel properties enhance a wide range of technologies, from tunable and efficient light sources and detectors to other nanophotonic elements with enhanced functionality (chapters 6 and 7).
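The divide-into-smaller-problems principle can be sketched with Python's standard multiprocessing module; the chunked summation below is a generic illustration of the pattern, not code from the dissertation.

```python
from multiprocessing import Pool

# Divide-and-conquer parallelism: split a large sum into index ranges,
# sum each range in a separate worker process, then combine the partials.

def partial_sum(bounds):
    """Sum the integers in the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    """Sum 0..n-1 by farming `workers` chunks out to a process pool."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    assert parallel_sum(1_000_000) == sum(range(1_000_000))
    print("ok")
```

The same decomposition idea underlies MPI-style domain decomposition in electromagnetic solvers: each worker owns a sub-problem, and a final combine step assembles the global result.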
Biomechanical Concepts for the Physical Educator
ERIC Educational Resources Information Center
Strohmeyer, H. Scott
2004-01-01
The concepts and principles of biomechanics are familiar to the teacher of physical science as well as to the physical educator. The difference between the two instructors, however, is that one knows the language of science and the other provides an experientially rich environment to support acquisition of these concepts and principles. Use of…
Does the Modality Principle for Multimedia Learning Apply to Science Classrooms?
ERIC Educational Resources Information Center
Harskamp, Egbert G.; Mayer, Richard E.; Suhre, Cor
2007-01-01
This study demonstrated that the modality principle applies to multimedia learning of regular science lessons in school settings. In the first field experiment, 27 Dutch secondary school students (age 16-17) received a self-paced, web-based multimedia lesson in biology. Students who received lessons containing illustrations and narration performed…
Presenting Science to the Public. The Professional Writing Series.
ERIC Educational Resources Information Center
Gastel, Barbara
This book introduces scientists, health professionals, and engineers to principles of communicating with the public and to practical aspects of dealing with the press. Part I focuses on the advantages, principles, and problems of communicating with the public. Part II discusses the nature of science reporting and offers advice for presenting…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-24
... principles and science-based practices--for reducing the negative impacts of climate change on fish, wildlife... develop a draft Strategy. The adverse impacts of climate change transcend political and administrative... climate change. This Strategy will provide a unified approach--reflecting shared principles and science...
ERIC Educational Resources Information Center
Howe, Christine; Ilie, Sonia; Guardia, Paula; Hofmann, Riikka; Mercer, Neil; Riga, Fran
2015-01-01
In response to continuing concerns about student attainment and participation in science and mathematics, the "epiSTEMe" project took a novel approach to pedagogy in these two disciplines. Using principles identified as effective in the research literature (and combining these in a fashion not previously attempted), the project developed…
A narrative study of novice elementary teachers' perceptions of science instruction
NASA Astrophysics Data System (ADS)
Harrell, Roberta
It is hoped that, once implemented, the Next Generation Science Standards (NGSS) will engage students more deeply in science learning and build science knowledge sequentially beginning in kindergarten (NRC, 2013). Early instruction is encouraged but must be delivered by qualified elementary teachers who have both the science content knowledge and the instructional skills necessary to teach science effectively to young children (Ejiwale, 2012; Spencer & Vogel, 2009; Walker, 2011). The purpose of this research study is to gain insight into novice elementary teachers' perceptions of science instruction. This research suggests that infusion of constructivist teaching in the elementary classroom is beneficial to the teacher's instruction of science concepts to elementary students. Constructivism is the theory that learning is centered on the learner constructing new ideas or concepts built upon current or past knowledge (Bruner, 1966). Based on this theory, it is recommended that the instructor encourage students to discover principles independently; essentially, the instructor presents the problem and lets students go (Good & Brophy, 2004). Discovery learning and hands-on, experimental, collaborative, and project-based learning are all approaches that use constructivist principles. The NGSS are based on constructivist principles. This narrative study provides insight into novice elementary teachers' perceptions of science instruction considered through the lens of constructivist theory (Bruner, 1960).
Kell, Douglas B
2006-03-01
The newly emerging field of systems biology involves a judicious interplay between high-throughput 'wet' experimentation, computational modelling and technology development, coupled to the world of ideas and theory. This interplay involves iterative cycles, such that systems biology is not at all confined to hypothesis-dependent studies, with intelligent, principled, hypothesis-generating studies being of high importance and consequently very far from aimless fishing expeditions. I seek to illustrate each of these facets. Novel technology development in metabolomics can increase substantially the dynamic range and number of metabolites that one can detect, and these can be exploited as disease markers and in the consequent and principled generation of hypotheses that are consistent with the data, achieving this in a value-free manner. Much of classical biochemistry and signalling pathway analysis has concentrated on the analyses of changes in the concentrations of intermediates, with 'local' equations, such as that of Michaelis and Menten, v = (Vmax · S)/(S + Km), that describe individual steps based solely on the instantaneous values of these concentrations. Recent work using single cells (which are not subject to the intellectually unsupportable averaging of the variability displayed by heterogeneous cells possessing nonlinear kinetics) has led to the recognition that some protein signalling pathways may encode their signals not (just) as concentrations (AM, or amplitude-modulated, in a radio analogy) but via changes in the dynamics of those concentrations (the signals are FM, or frequency-modulated). This contributes in principle to a straightforward solution of the crosstalk problem, leads to a profound reassessment of how to understand the downstream effects of dynamic changes in the concentrations of elements in these pathways, and stresses the role of signal processing (and not merely the intermediates) in biological signalling. 
It is this signal processing that lies at the heart of understanding the languages of cells. The resolution of many of the modern and postgenomic problems of biochemistry requires the development of a myriad of new technologies (and maybe a new culture), and thus regular input from the physical sciences, engineering, mathematics and computer science. One solution, that we are adopting in the Manchester Interdisciplinary Biocentre (http://www.mib.ac.uk/) and the Manchester Centre for Integrative Systems Biology (http://www.mcisb.org/), is thus to colocate individuals with the necessary combinations of skills. Novel disciplines that require such an integrative approach continue to emerge. These include fields such as chemical genomics, synthetic biology, distributed computational environments for biological data and modelling, single cell diagnostics/bionanotechnology, and computational linguistics/text mining.
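The Michaelis-Menten rate law quoted in the abstract is easy to evaluate directly; a minimal sketch with illustrative parameter values (not taken from the paper):

```python
def michaelis_menten(S, Vmax, Km):
    """Instantaneous reaction rate v = (Vmax * S) / (S + Km)
    for substrate concentration S, maximal rate Vmax, and
    Michaelis constant Km (all in consistent units)."""
    return (Vmax * S) / (S + Km)

# Standard check on the equation: at S = Km the rate is half-maximal.
assert abs(michaelis_menten(S=2.0, Vmax=10.0, Km=2.0) - 5.0) < 1e-12

print(michaelis_menten(S=8.0, Vmax=10.0, Km=2.0))  # 8.0
```

This is exactly the "local" description the abstract criticizes: v depends only on the instantaneous concentration S, with no memory of the concentration dynamics that frequency-modulated signalling would exploit.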
ERIC Educational Resources Information Center
Jameson, A. Keith
Presented are the teacher's guide and student materials for one of a series of self-instructional, computer-based learning modules for an introductory, undergraduate chemistry course. The student manual for this unit on Le Chatelier's principle includes objectives, prerequisites, pretest, instructions for executing the computer program, and…
CADD medicine: design is the potion that can cure my disease
NASA Astrophysics Data System (ADS)
Manas, Eric S.; Green, Darren V. S.
2017-03-01
The acronym "CADD" is often used interchangeably to refer to "Computer Aided Drug Discovery" and "Computer Aided Drug Design". While the former definition implies the use of a computer to impact one or more aspects of discovering a drug, in this paper we contend that computational chemists are most effective when they enable teams to apply true design principles as they strive to create medicines to treat human disease. We argue that teams must bring to bear multiple sub-disciplines of computational chemistry in an integrated manner in order to utilize these principles to address the multi-objective nature of the drug discovery problem. Impact, resourcing principles, and future directions for the field are also discussed, including areas of future opportunity as well as a cautionary note about hype and hubris.
NASA Astrophysics Data System (ADS)
Schnase, J. L.; Duffy, D.; Tamkin, G. S.; Nadeau, D.; Thompson, J. H.; Grieg, C. M.; McInerney, M.; Webster, W. P.
2013-12-01
Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications.
In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
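As a generic illustration of the MapReduce pattern that MERRA/AS applies at scale, the sketch below computes per-variable means with a map phase and a reduce phase; the variable names and values are invented toy data, not MERRA's actual schema or API:

```python
from functools import reduce
from itertools import groupby

# Toy records: (variable_name, value) pairs standing in for
# reanalysis grid-point samples. Names are illustrative only.
records = [("T2M", 288.0), ("T2M", 290.0), ("PRECTOT", 2.5), ("PRECTOT", 3.5)]

def mapper(record):
    """Emit (key, (partial_sum, partial_count)) for each record."""
    var, value = record
    return (var, (value, 1))

def reducer(a, b):
    """Combine two (sum, count) partials; associative and commutative."""
    return (a[0] + b[0], a[1] + b[1])

# Shuffle phase: group mapped pairs by key (groupby needs sorted input).
mapped = sorted(map(mapper, records))
means = {}
for var, group in groupby(mapped, key=lambda kv: kv[0]):
    total, count = reduce(reducer, (kv for _, kv in group))
    means[var] = total / count
# means -> {"PRECTOT": 3.0, "T2M": 289.0}
```

Because the reducer is associative, the same map/reduce pair can be distributed across many nodes, which is the property a cloud-hosted analytic service exploits.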
NASA Technical Reports Server (NTRS)
Schnase, John L.; Duffy, Daniel Quinn; Tamkin, Glenn S.; Nadeau, Denis; Thompson, John H.; Grieg, Christina M.; McInerney, Mark A.; Webster, William P.
2014-01-01
Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications.
In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
NASA Astrophysics Data System (ADS)
Sonis, M.
Socio-ecological dynamics emerged from the field of Mathematical Social Sciences and opened up avenues for re-examination of classical problems of collective behavior in the social and spatial sciences. The "engine" of this collective behavior is the subjective mental evaluation of the level of utilities in the future, presenting sets of composite socio-economic-temporal-locational advantages. These dynamics present new laws of collective multi-population behavior which are the meso-level counterparts of utility-optimizing individual behavior. The central core of socio-ecological choice dynamics includes the following first principle of the collective choice behavior of "Homo Socialis", based on the existence of "collective consciousness": the choice behavior of "Homo Socialis" is a collective meso-level choice behavior such that the relative changes in choice frequencies depend on the distribution of innovation alternatives between adopters of innovations. The mathematical basis of socio-ecological dynamics includes two complementary analytical approaches, both based on the use of computer modeling as a theoretical and simulation tool. The first is the "continuous approach": systems of ordinary and partial differential equations reflecting the continuous-time Volterra ecological formalism in the form of antagonistic and/or cooperative collective hyper-games between different subsets of choice alternatives. The second is the "discrete approach": systems of difference equations presenting a new branch of non-linear discrete dynamics, the Discrete Relative m-population/n-innovations Socio-Spatial Dynamics (Dendrinos and Sonis, 1990).
The generalization of the Volterra formalism leads further to the meso-level variational principle of collective choice behavior determining the balance between the resulting cumulative social spatio-temporal interactions among the population of adopters susceptible to the choice alternatives and the cumulative equalization of the power of elites supporting different choice alternatives. This balance governs the dynamic innovation choice process and constitutes the dynamic meso-level counterpart of the micro-economic individual utility maximization principle.
A Multidisciplined Teaching Reform of Biomaterials Course for Undergraduate Students
NASA Astrophysics Data System (ADS)
Li, Xiaoming; Zhao, Feng; Pu, Fang; Liu, Haifeng; Niu, Xufeng; Zhou, Gang; Li, Deyu; Fan, Yubo; Feng, Qingling; Cui, Fu-zhai; Watari, Fumio
2015-12-01
Biomaterials science has advanced at high speed with global science and technology development during recent decades, a trend that experts predict will become even more pronounced in the near future, giving it a more significant position in medicine and health care. Although the three traditional subjects (medical science, materials science and biology) that act as a scaffold to support the structure of biomaterials science are still essential for the research and education of biomaterials, other subjects, such as mechanical engineering, mechanics, computer science, automation science, nanotechnology, and Bio-MEMS, are playing increasingly important roles in modern biomaterials science. Thus, the research and education of modern biomaterials science require a logical integration of interdisciplinary science and technology, concerning not only medical science, materials science and biology, but also the other subjects stated above. This article focuses on the multidisciplinary nature of biomaterials, awareness of which is currently lacking in education at the undergraduate stage. To meet this educational challenge, we present a multidisciplinary course covering not only traditional sciences but also frontier sciences, lasting a whole academic year for senior biomaterials undergraduate students, with the aims of providing a better understanding of modern biomaterials science and meeting the requirements of future development in this area. The course has been shown to gain the recognition of the participants through questionnaires and specific "before and after" comments, and has also gained high recognition and persistent support from our university. The idea of this course might also suit the education and construction of some other disciplines.
Quantitative prediction of solute strengthening in aluminium alloys.
Leyson, Gerard Paul M; Curtin, William A; Hector, Louis G; Woodward, Christopher F
2010-09-01
Despite significant advances in computational materials science, a quantitative, parameter-free prediction of the mechanical properties of alloys has been difficult to achieve from first principles. Here, we present a new analytic theory that, with input from first-principles calculations, is able to predict the strengthening of aluminium by substitutional solute atoms. Solute-dislocation interaction energies in and around the dislocation core are first calculated using density functional theory and a flexible-boundary-condition method. An analytic model for the strength, or stress to move a dislocation, owing to the random field of solutes, is then presented. The theory, which has no adjustable parameters and is extendable to other metallic alloys, predicts both the energy barriers to dislocation motion and the zero-temperature flow stress, allowing for predictions of finite-temperature flow stresses. Quantitative comparisons with experimental flow stresses at temperature T=78 K are made for Al-X alloys (X=Mg, Si, Cu, Cr) and good agreement is obtained.
A hybrid computational-experimental approach for automated crystal structure solution
NASA Astrophysics Data System (ADS)
Meredig, Bryce; Wolverton, C.
2013-02-01
Crystal structure solution from diffraction experiments is one of the most fundamental tasks in materials science, chemistry, physics and geology. Unfortunately, numerous factors render this process labour intensive and error prone. Experimental conditions, such as high pressure or structural metastability, often complicate characterization. Furthermore, many materials of great modern interest, such as batteries and hydrogen storage media, contain light elements such as Li and H that only weakly scatter X-rays. Finally, structural refinements generally require significant human input and intuition, as they rely on good initial guesses for the target structure. To address these many challenges, we demonstrate a new hybrid approach, first-principles-assisted structure solution (FPASS), which combines experimental diffraction data, statistical symmetry information and first-principles-based algorithmic optimization to automatically solve crystal structures. We demonstrate the broad utility of FPASS to clarify four important crystal structure debates: the hydrogen storage candidates MgNH and NH3BH3; Li2O2, relevant to Li-air batteries; and high-pressure silane, SiH4.
JDFTx: Software for joint density-functional theory
Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.; ...
2017-11-14
Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.
Chersi, Fabian; Ferro, Marcello; Pezzulo, Giovanni; Pirrelli, Vito
2014-07-01
A growing body of evidence in cognitive psychology and neuroscience suggests a deep interconnection between sensory-motor and language systems in the brain. Based on recent neurophysiological findings on the anatomo-functional organization of the fronto-parietal network, we present a computational model showing that language processing may have reused or co-developed organizing principles, functionality, and learning mechanisms typical of premotor circuits. The proposed model combines principles of Hebbian topological self-organization and prediction learning. Trained on sequences of either motor or linguistic units, the network develops independent neuronal chains, formed by dedicated nodes encoding only context-specific stimuli. Moreover, neurons responding to the same stimulus or class of stimuli tend to cluster together to form topologically connected areas similar to those observed in the brain cortex. Simulations support a unitary explanatory framework reconciling neurophysiological motor data with established behavioral evidence on lexical acquisition, access, and recall. Copyright © 2014 Cognitive Science Society, Inc.
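As background for the Hebbian component mentioned in the abstract, the basic Hebbian update rule ("cells that fire together wire together") can be sketched as follows; this is a generic textbook illustration, not the authors' implementation:

```python
import numpy as np

def hebbian_update(w, x, y, eta=0.1):
    """Plain Hebbian rule: strengthen connection w[i, j] when
    post-synaptic activity y[i] and pre-synaptic activity x[j]
    co-occur: dw = eta * outer(y, x)."""
    return w + eta * np.outer(y, x)

w = np.zeros((2, 2))
x = np.array([1.0, 0.0])   # pre-synaptic activity pattern
y = np.array([0.0, 1.0])   # post-synaptic activity pattern
w = hebbian_update(w, x, y)
# Only the connection from x[0] to y[1] grows: w[1, 0] == 0.1
```

The model described in the abstract couples such correlation-driven weight growth with a prediction-learning term, which is what lets chains of context-specific nodes form.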
JDFTx: Software for joint density-functional theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.
Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.
Embodied cognition for autonomous interactive robots.
Hoffman, Guy
2012-10-01
In the past, notions of embodiment have been applied to robotics mainly in the realm of very simple robots, and supporting low-level mechanisms such as dynamics and navigation. In contrast, most human-like, interactive, and socially adept robotic systems turn away from embodiment and use amodal, symbolic, and modular approaches to cognition and interaction. At the same time, recent research in Embodied Cognition (EC) is spanning an increasing number of complex cognitive processes, including language, nonverbal communication, learning, and social behavior. This article suggests adopting a modern EC approach for autonomous robots interacting with humans. In particular, we present three core principles from EC that may be applicable to such robots: (a) modal perceptual representation, (b) action/perception and action/cognition integration, and (c) a simulation-based model of top-down perceptual biasing. We describe a computational framework based on these principles, and its implementation on two physical robots. This could provide a new paradigm for embodied human-robot interaction based on recent psychological and neurological findings. Copyright © 2012 Cognitive Science Society, Inc.
First-principles simulations of transition metal ions in silicon as potential quantum bits
NASA Astrophysics Data System (ADS)
Ma, He; Seo, Hosung; Galli, Giulia
Optically active spin defects in semiconductors have gained increasing attention in recent years for use as potential solid-state quantum bits (or qubits). Examples include the nitrogen-vacancy center in diamond, transition metal impurities, and rare earth ions. In this talk, we present first-principles theoretical results on group 6 transition metal ion (Chromium, Molybdenum and Tungsten) impurities in silicon, and we investigate their potential use as qubits. We used density functional theory (DFT) to calculate defect formation energies and we found that transition metal ions have lower formation energies at interstitial than substitutional sites. We also computed the electronic structure of the defects with particular attention to the position of the defect energy levels with respect to the silicon band edges. Based on our results, we will discuss the possibility of implementing qubits in silicon using group 6 transition metal ions. This work is supported by the National Science Foundation (NSF) through the University of Chicago MRSEC under Award Number DMR-1420709.
Efficient 3D kinetic Monte Carlo method for modeling of molecular structure and dynamics.
Panshenskov, Mikhail; Solov'yov, Ilia A; Solov'yov, Andrey V
2014-06-30
Self-assembly of molecular systems is an important and general problem that intertwines physics, chemistry, biology, and materials science. Through understanding of the physical principles of self-organization, it often becomes feasible to control the process and to obtain complex structures with tailored properties, for example, bacterial cell colonies or nanodevices with desired properties. Theoretical studies and simulations provide an important tool for unraveling the principles of self-organization and, therefore, have recently gained increasing interest. The present article features an extension of the popular code MBN EXPLORER (MesoBioNano Explorer) aiming to provide a universal approach to study self-assembly phenomena in biology and nanoscience. In particular, this extension involves a highly parallelized module of MBN EXPLORER that allows simulating stochastic processes using the kinetic Monte Carlo approach in a three-dimensional space. We describe the computational side of the developed code, discuss its efficiency, and apply it to study an exemplary system. Copyright © 2014 Wiley Periodicals, Inc.
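The abstract does not detail MBN EXPLORER's algorithm, but the core of a standard rejection-free kinetic Monte Carlo step (the Gillespie-type scheme such codes build on) can be sketched as follows; this is a generic textbook sketch, not the package's actual code:

```python
import math
import random

def kmc_step(rates, rng=random):
    """One rejection-free kinetic Monte Carlo step: choose an event
    with probability proportional to its rate, then advance the clock
    by dt = -ln(u) / R_total with u uniform in (0, 1]."""
    total = sum(rates)
    r = rng.random() * total
    cumulative = 0.0
    for event, rate in enumerate(rates):
        cumulative += rate
        if r < cumulative:
            break
    # 1 - random() lies in (0, 1], so the log is always finite.
    dt = -math.log(1.0 - rng.random()) / total
    return event, dt
```

In a full 3D simulation the `rates` list would hold the diffusion, attachment, and detachment rates of every possible move of every particle, recomputed locally after each executed event.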
The experimental nuclear reaction data (EXFOR): Extended computer database and Web retrieval system
Zerkin, V. V.; Pritychenko, B.
2018-02-04
The EXchange FORmat (EXFOR) experimental nuclear reaction database and the associated Web interface provide access to the wealth of low- and intermediate-energy nuclear reaction physics data. This resource is based on numerical data sets and bibliographical information of ~22,000 experiments since the beginning of nuclear science. The principles of the computer database organization, its extended contents and Web applications development are described. New capabilities for the data sets uploads, renormalization, covariance matrix, and inverse reaction calculations are presented in this paper. The EXFOR database, updated monthly, provides an essential support for nuclear data evaluation, application development, and research activities. Finally, it is publicly available at the websites of the International Atomic Energy Agency Nuclear Data Section, http://www-nds.iaea.org/exfor, the U.S. National Nuclear Data Center, http://www.nndc.bnl.gov/exfor, and the mirror sites in China, India and Russian Federation.
NASA Astrophysics Data System (ADS)
Gaitho, Francis M.; Mola, Genene T.; Pellicane, Giuseppe
2018-02-01
Organic solar cells have the ability to transform solar energy efficiently and have a promising energy balance. Producing these cells is economical and makes use of printing methods that use inks based on solvents compatible with a variety of cheap materials such as flexible plastic or paper. The primary materials used to manufacture organic solar cells include carbon-based semiconductors, which are good light absorbers and efficient charge generators. In this article, we review previous research of interest on the morphology of polymer blends used in bulk heterojunction (BHJ) solar cells and introduce their basic principles. We further review computational models used in the analysis of the surface behavior of polymer blends in BHJ devices, as well as trends in the field of polymer surface science as applied to BHJ photovoltaics. We also briefly outline the opportunities and challenges in the area of polymer blends in BHJ organic solar cells.
The experimental nuclear reaction data (EXFOR): Extended computer database and Web retrieval system
NASA Astrophysics Data System (ADS)
Zerkin, V. V.; Pritychenko, B.
2018-04-01
The EXchange FORmat (EXFOR) experimental nuclear reaction database and the associated Web interface provide access to the wealth of low- and intermediate-energy nuclear reaction physics data. This resource is based on numerical data sets and bibliographical information of ∼22,000 experiments since the beginning of nuclear science. The principles of the computer database organization, its extended contents and Web applications development are described. New capabilities for the data sets uploads, renormalization, covariance matrix, and inverse reaction calculations are presented. The EXFOR database, updated monthly, provides an essential support for nuclear data evaluation, application development, and research activities. It is publicly available at the websites of the International Atomic Energy Agency Nuclear Data Section, http://www-nds.iaea.org/exfor, the U.S. National Nuclear Data Center, http://www.nndc.bnl.gov/exfor, and the mirror sites in China, India and Russian Federation.
Learning general phonological rules from distributional information: a computational model.
Calamaro, Shira; Jarosz, Gaja
2015-04-01
Phonological rules create alternations in the phonetic realizations of related words. These rules must be learned by infants in order to identify the phonological inventory, the morphological structure, and the lexicon of a language. Recent work proposes a computational model for the learning of one kind of phonological alternation, allophony (Peperkamp, Le Calvez, Nadal, & Dupoux, 2006). This paper extends the model to account for learning of a broader set of phonological alternations and the formalization of these alternations as general rules. In Experiment 1, we apply the original model to new data in Dutch and demonstrate its limitations in learning nonallophonic rules. In Experiment 2, we extend the model to allow it to learn general rules for alternations that apply to a class of segments. In Experiment 3, the model is further extended to allow for generalization by context; we argue that this generalization must be constrained by linguistic principles. Copyright © 2014 Cognitive Science Society, Inc.
Bassett, Danielle S; Sporns, Olaf
2017-01-01
Despite substantial recent progress, our understanding of the principles and mechanisms underlying complex brain function and cognition remains incomplete. Network neuroscience proposes to tackle these enduring challenges. Approaching brain structure and function from an explicitly integrative perspective, network neuroscience pursues new ways to map, record, analyze and model the elements and interactions of neurobiological systems. Two parallel trends drive the approach: the availability of new empirical tools to create comprehensive maps and record dynamic patterns among molecules, neurons, brain areas and social systems; and the theoretical framework and computational tools of modern network science. The convergence of empirical and computational advances opens new frontiers of scientific inquiry, including network dynamics, manipulation and control of brain networks, and integration of network processes across spatiotemporal domains. We review emerging trends in network neuroscience and attempt to chart a path toward a better understanding of the brain as a multiscale networked system. PMID:28230844
The experimental nuclear reaction data (EXFOR): Extended computer database and Web retrieval system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zerkin, V. V.; Pritychenko, B.
The EXchange FORmat (EXFOR) experimental nuclear reaction database and the associated Web interface provide access to the wealth of low- and intermediate-energy nuclear reaction physics data. This resource is based on numerical data sets and bibliographical information of ~22,000 experiments since the beginning of nuclear science. The principles of the computer database organization, its extended contents and Web applications development are described. New capabilities for the data sets uploads, renormalization, covariance matrix, and inverse reaction calculations are presented in this paper. The EXFOR database, updated monthly, provides an essential support for nuclear data evaluation, application development, and research activities. Finally, it is publicly available at the websites of the International Atomic Energy Agency Nuclear Data Section, http://www-nds.iaea.org/exfor, the U.S. National Nuclear Data Center, http://www.nndc.bnl.gov/exfor, and the mirror sites in China, India and Russian Federation.
How Climate Science got to be in the Next Generation Science Standards (Invited)
NASA Astrophysics Data System (ADS)
Wysession, M. E.
2013-12-01
Climate science plays a prominent role in the new national K-12 Next Generation Science Standards (NGSS). This represents the culmination of a significant amount of effort by many different organizations that have worked hard to educate the public on one of the most interesting, complex, complicated, and societally important aspects of geoscience. While there are significant challenges to the full implementation of the NGSS, especially those aspects that relate to climate change, the fact that so many states are currently adopting the NGSS represents a significant milestone in geoscience education. When grade 6-12 textbooks were written ten years ago, such as Pearson's high school Physical Science: Concepts in Action (Wysession et al., 2004), very little mention of climate change was incorporated because it did not appear in state standards. Now, climate and climate change are an integral part of the middle school and high school NGSS standards, and textbook companies are fully incorporating this content into their programs. There are many factors that have helped the shift toward teaching about climate, such as the IPCC report, Al Gore's 'An Inconvenient Truth,' and the many reports on climate change published by the National Research Council (NRC). However, four major community-driven literacy documents (The Essential Principles of Ocean Science, Essential Principles and Fundamental Concepts for Atmospheric Science Literacy, The Earth Science Literacy Principles, and The Essential Principles of Climate Science) were essential in that they directly informed the construction of the Earth and Space Science (ESS) content of the NRC's 'Framework for K-12 Science Education' by the ESS Design Team. The actual performance expectations of the NGSS were then informed directly by the disciplinary core ideas of the NRC Framework, which were motivated by the community-driven literacy documents and the significant credentials these bore. 
The work in getting climate science into classrooms has just begun: having standards that address climate science does not ensure that it will reach students. However, the fact that climate science plays an important role in the nation's first attempt at a national K-12 science program represents a significant advancement.
How Climate Science got to be in the Next Generation Science Standards (Invited)
NASA Astrophysics Data System (ADS)
Westnedge, K. L.; Dallimore, A.; Salish Sea Expedition Team
2011-12-01
Engineering and physical sciences in oncology: challenges and opportunities
Mitchell, Michael J.; Jain, Rakesh K.; Langer, Robert
2017-01-01
The principles of engineering and physics have been applied to oncology for nearly 50 years. Engineers and physical scientists have made contributions to all aspects of cancer biology, from quantitative understanding of tumour growth and progression to improved detection and treatment of cancer. Many early efforts focused on experimental and computational modelling of drug distribution, cell cycle kinetics and tumour growth dynamics. In the past decade, we have witnessed exponential growth at the interface of engineering, physics and oncology that has been fuelled by advances in fields including materials science, microfabrication, nanomedicine, microfluidics and imaging, and catalysed by new programmes at the National Institutes of Health (NIH), including the National Institute of Biomedical Imaging and Bioengineering (NIBIB), Physical Sciences in Oncology, and the National Cancer Institute (NCI) Alliance for Nanotechnology. Here, we review the advances made at the interface of engineering and physical sciences and oncology in four important areas: the physical microenvironment of the tumour; technological advances in drug delivery; cellular and molecular imaging; and microfluidics and microfabrication. We discuss the research advances, opportunities and challenges for integrating engineering and physical sciences with oncology to develop new methods to study, detect and treat cancer, and we also describe the future outlook for these emerging areas. PMID:29026204
ERIC Educational Resources Information Center
Gabrielson, Curtis A.; Hsi, Sherry
2012-01-01
This paper articulates and illustrates design principles that guided the development of a set of hands-on teaching activities for the national science and mathematics curricula at junior-high and high-school level education in Timor-Leste, a small, low-income nation in Southeast Asia. A partnership between a university, an international science…
Pre-Service Science Teachers' Perception of the Principles of Scientific Research
ERIC Educational Resources Information Center
Can, Sendil; Kaymakci, Güliz
2016-01-01
The purpose of the current study employing the survey method is to determine the pre-service science teachers' perceptions of the principles of scientific research and to investigate the effects of gender, grade level and the state of following scientific publications on their perceptions. The sampling of the current research is comprised of 125…
ERIC Educational Resources Information Center
Slater, Timothy F.; Jones, Lauren V.
2004-01-01
This project explores the effectiveness of learner-centered education (LCE) principles and practices on student learning and attitudes in an online interactive introductory astronomy course for non-science majors by comparing a high-quality Internet-delivered course with a high-quality on-campus course, both of which are based on the principles of…
ERIC Educational Resources Information Center
Cheng, Ming-Chang; Chou, Pei-I; Wang, Ya-Ting; Lin, Chih-Ho
2015-01-01
This study investigates how the illustrations in a science textbook, with their design modified according to cognitive process principles, affected students' learning performance. The quasi-experimental design recruited two Grade 5 groups (N = 58) as the research participants. The treatment group (n = 30) used the modified version of the textbook,…
41 CFR Appendix A to Subpart E of... - 3-Key Points and Principles
Code of Federal Regulations, 2012 CFR
2012-01-01
... Academy of Sciences or the National Academy of Public Administration? Pt. 102-3, Subpt. E, App. A Appendix... principles Section(s) Question(s) Guidance I. Section 15 of the Act allows the National Academy of Sciences... these circumstances, neither the existence of the funding agreement nor the fact that it contemplates...
41 CFR Appendix A to Subpart E of... - 3-Key Points and Principles
Code of Federal Regulations, 2011 CFR
2011-01-01
... Academy of Sciences or the National Academy of Public Administration? Pt. 102-3, Subpt. E, App. A Appendix... principles Section(s) Question(s) Guidance I. Section 15 of the Act allows the National Academy of Sciences... these circumstances, neither the existence of the funding agreement nor the fact that it contemplates...
41 CFR Appendix A to Subpart E of... - 3-Key Points and Principles
Code of Federal Regulations, 2010 CFR
2010-07-01
... Academy of Sciences or the National Academy of Public Administration? Pt. 102-3, Subpt. E, App. A Appendix... principles Section(s) Question(s) Guidance I. Section 15 of the Act allows the National Academy of Sciences... these circumstances, neither the existence of the funding agreement nor the fact that it contemplates...
41 CFR Appendix A to Subpart E of... - 3-Key Points and Principles
Code of Federal Regulations, 2014 CFR
2014-01-01
... Academy of Sciences or the National Academy of Public Administration? Pt. 102-3, Subpt. E, App. A Appendix... principles Section(s) Question(s) Guidance I. Section 15 of the Act allows the National Academy of Sciences... these circumstances, neither the existence of the funding agreement nor the fact that it contemplates...
41 CFR Appendix A to Subpart E of... - 3-Key Points and Principles
Code of Federal Regulations, 2013 CFR
2013-07-01
... Academy of Sciences or the National Academy of Public Administration? Pt. 102-3, Subpt. E, App. A Appendix... principles Section(s) Question(s) Guidance I. Section 15 of the Act allows the National Academy of Sciences... these circumstances, neither the existence of the funding agreement nor the fact that it contemplates...
ERIC Educational Resources Information Center
Flener-Lovitt, Charity
2014-01-01
A thematic course called "Climate Change: Chemistry and Controversy" was developed for upper-level non-STEM students. This course used the socioscientific context of climate change to teach chemical principles and the nature of science. Students used principles of agnotology (direct study of misinformation) to debunk climate change…
Three Principles to Improve Outcomes for Children and Families. Science to Policy and Practice
ERIC Educational Resources Information Center
Cohen, Steven D.
2017-01-01
The science of child development and the core capabilities of adults point to a set of "design principles" that policymakers and practitioners in many different sectors can use to improve outcomes for children and families. That is, to be maximally effective, policies and services should: (1) support responsive relationships for children…
ERIC Educational Resources Information Center
Halpin, Myra J.; Hoeffler, Leanne; Schwartz-Bloom, Rochelle D.
2005-01-01
To help students learn science concepts, Pharmacology Education Partnership (PEP)--a science education program that incorporates relevant topics related to drugs and drug abuse into standard biology and chemistry curricula was developed. The interdisciplinary PEP curriculum provides six modules to teach biology and chemistry principles within the…
ERIC Educational Resources Information Center
Seeman, Jeffrey I.
2005-01-01
The chemical and physical properties of nicotine and its carboxylic acid salts found in tobacco provided an interesting example for understanding basic principles of complex science. The results showed that the experimental data used were inconsistent with the conclusions drawn, and that the transfer of nicotine from tobacco to smoke cannot be…
Science and technology convergence: with emphasis for nanotechnology-inspired convergence
NASA Astrophysics Data System (ADS)
Bainbridge, William S.; Roco, Mihail C.
2016-07-01
Convergence offers a new universe of discovery, innovation, and application opportunities through specific theories, principles, and methods to be implemented in research, education, production, and other societal activities. Using a holistic approach with shared goals, convergence seeks to transcend existing human limitations to achieve improved conditions for work, learning, aging, and physical and cognitive wellness. This paper outlines ten key theories that offer complementary perspectives on this complex dynamic. Principles and methods are proposed to facilitate and enhance science and technology convergence. Several convergence success stories in the first part of the 21st century—including nanotechnology and other emerging technologies—are discussed in parallel with case studies focused on the future. The formulation of relevant theories, principles, and methods aims to establish the science of convergence.
The Psychology of Close Relationships: Fourteen Core Principles.
Finkel, Eli J; Simpson, Jeffry A; Eastwick, Paul W
2017-01-03
Relationship science is a theory-rich discipline, but there have been no attempts to articulate the broader themes or principles that cut across the theories themselves. We have sought to fill that void by reviewing the psychological literature on close relationships, particularly romantic relationships, to extract its core principles. This review reveals 14 principles, which collectively address four central questions: (a) What is a relationship? (b) How do relationships operate? (c) What tendencies do people bring to their relationships? (d) How does the context affect relationships? The 14 principles paint a cohesive and unified picture of romantic relationships that reflects a strong and maturing discipline. However, the principles afford few of the sorts of conflicting predictions that can be especially helpful in fostering novel theory development. We conclude that relationship science is likely to benefit from simultaneous pushes toward both greater integration across theories (to reduce redundancy) and greater emphasis on the circumstances under which existing (or not-yet-developed) principles conflict with one another.
The Role of Metaphysical Naturalism in Science
NASA Astrophysics Data System (ADS)
Mahner, Martin
2012-10-01
This paper defends the view that metaphysical naturalism is a constitutive ontological principle of science in that the general empirical methods of science, such as observation, measurement and experiment, and thus the very production of empirical evidence, presuppose a no-supernature principle. It examines the consequences of metaphysical naturalism for the testability of supernatural claims, and it argues that explanations involving supernatural entities are pseudo-explanatory due to the many semantic and ontological problems of supernatural concepts. The paper also addresses the controversy about metaphysical versus methodological naturalism.
NASA Astrophysics Data System (ADS)
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-12-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are more interested in the use of computers than in programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they view computer science as a career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and patterns of student interaction. Neither female nor male students had any major issues in using computers. In computer programming, however, female students were much less involved in programming activities, whereas male students were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of working with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.
Gamification: What It Is and Why It Matters to Digital Health Behavior Change Developers
2013-01-01
This editorial provides a behavioral science view on gamification and health behavior change, describes its principles and mechanisms, and reviews some of the evidence for its efficacy. Furthermore, this editorial explores the relation between gamification and behavior change frameworks used in the health sciences and shows how gamification principles are closely related to principles that have been proven to work in health behavior change technology. Finally, this editorial provides criteria that can be used to assess when gamification provides a potentially promising framework for digital health interventions. PMID:25658754
Gamification: what it is and why it matters to digital health behavior change developers.
Cugelman, Brian
2013-12-12
This editorial provides a behavioral science view on gamification and health behavior change, describes its principles and mechanisms, and reviews some of the evidence for its efficacy. Furthermore, this editorial explores the relation between gamification and behavior change frameworks used in the health sciences and shows how gamification principles are closely related to principles that have been proven to work in health behavior change technology. Finally, this editorial provides criteria that can be used to assess when gamification provides a potentially promising framework for digital health interventions.
Hayes, A Wallace
2005-06-01
The Precautionary Principle in its simplest form states: "When an activity raises threats of harm to human health or the environment, precautionary measures should be taken even if some cause-and-effect relationships are not fully established scientifically". This Principle is the basis for European environmental law, and plays an increasing role in developing environmental health policies as well. It also is used in environmental decision-making in Canada and in several European countries, especially in Denmark, Sweden, and Germany. The Precautionary Principle has been used in the environmental decision-making process and in regulating drugs and other consumer products in the United States. The Precautionary Principle enhances the collection of risk information for, among other items, high production volume chemicals and risk-based analyses in general. It does not eliminate the need for good science or for science-based risk assessments. Public participation is encouraged in both the review process and the decision-making process. The Precautionary Principle encourages, and in some cases may require, transparency of the risk assessment process on health risk of chemicals both for public health and the environment. A debate continues on whether the Principle should embrace the "polluter pays" directive and place the responsibility for providing risk assessment on industry. The best elements of a precautionary approach demand good science and challenge the scientific community to improve methods used for risk assessment.
ERIC Educational Resources Information Center
Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2013-01-01
Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…
Quantum Gauss-Jordan Elimination and Simulation of Accounting Principles on Quantum Computers
NASA Astrophysics Data System (ADS)
Diep, Do Ngoc; Giang, Do Hoang; Van Minh, Nguyen
2017-06-01
The paper is devoted to a version of Quantum Gauss-Jordan Elimination and its applications. In the first part, we construct the Quantum Gauss-Jordan Elimination (QGJE) algorithm and estimate the complexity of computing the Reduced Row Echelon Form (RREF) of N × N matrices. The main result asserts that QGJE has computation time of order 2^(N/2). The second part is devoted to a new idea of simulating accounting by quantum computing. We first express the actual accounting principles in purely mathematical language. Then, we simulate the accounting principles on quantum computers. We show that all accounting actions are exhausted by the described basic actions. The main problems of accounting are reduced to a system of linear equations in the economic model of Leontief. In this simulation, we use our Quantum Gauss-Jordan Elimination to solve the problems, and the complexity of the quantum computation is a square-root order faster than that of classical computing.
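The quantum algorithm itself is beyond a short sketch, but its classical counterpart, reduction of a matrix to Reduced Row Echelon Form by Gauss-Jordan elimination, can be written in a few lines. A minimal illustrative sketch in Python using exact rational arithmetic; this is the classical baseline the abstract refers to, not the quantum algorithm:

```python
from fractions import Fraction

def rref(matrix):
    """Reduce a matrix to Reduced Row Echelon Form via classical
    Gauss-Jordan elimination (the classical counterpart of QGJE)."""
    m = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(m), len(m[0])
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # find a row at or below pivot_row with a non-zero entry in this column
        pr = next((r for r in range(pivot_row, rows) if m[r][col] != 0), None)
        if pr is None:
            continue
        m[pivot_row], m[pr] = m[pr], m[pivot_row]
        # scale the pivot row so the pivot entry becomes 1
        piv = m[pivot_row][col]
        m[pivot_row] = [x / piv for x in m[pivot_row]]
        # eliminate this column from every other row
        for r in range(rows):
            if r != pivot_row and m[r][col] != 0:
                factor = m[r][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m

print(rref([[1, 2, 3], [4, 5, 6]]))  # equals [[1, 0, -1], [0, 1, 2]] (as Fractions)
```

Exact rational arithmetic sidesteps floating-point pivoting issues in this toy setting; the classical cost grows as roughly N^3 operations, which is the baseline against which a quantum speedup would be measured.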
Computational principles of working memory in sentence comprehension.
Lewis, Richard L; Vasishth, Shravan; Van Dyke, Julie A
2006-10-01
Understanding a sentence requires a working memory of the partial products of comprehension, so that linguistic relations between temporally distal parts of the sentence can be rapidly computed. We describe an emerging theoretical framework for this working memory system that incorporates several independently motivated principles of memory: a sharply limited attentional focus, rapid retrieval of item (but not order) information subject to interference from similar items, and activation decay (forgetting over time). A computational model embodying these principles provides an explanation of the functional capacities and severe limitations of human processing, as well as accounts of reading times. The broad implication is that the detailed nature of cross-linguistic sentence processing emerges from the interaction of general principles of human memory with the specialized task of language comprehension.
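The activation-decay principle summarized above is commonly formalized in the ACT-R framework, on which this model builds, as base-level activation: each past retrieval of an item contributes a power-law-decaying trace. A minimal sketch, assuming the conventional decay parameter d = 0.5; the numbers are illustrative, not fitted values from the paper:

```python
import math

def base_level_activation(retrieval_times, now, d=0.5):
    """ACT-R-style base-level activation: each past use at time t
    contributes (now - t)^(-d), so items decay over time but
    frequent and recent use keeps them retrievable."""
    return math.log(sum((now - t) ** (-d) for t in retrieval_times))

# An item used recently and often is more active, hence easier to retrieve
recent = base_level_activation([8.0, 9.0, 9.5], now=10.0)
stale = base_level_activation([1.0, 2.0, 3.0], now=10.0)
print(recent > stale)  # True
```

The same machinery, combined with similarity-based interference at retrieval, is what lets such models predict longer reading times when a distal dependency must be resolved across intervening similar items.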
NASA Astrophysics Data System (ADS)
Tilley, Richard J. D.
2003-05-01
Colour is an important and integral part of everyday life, and an understanding and knowledge of the scientific principles behind colour, with its many applications and uses, is becoming increasingly important to a wide range of academic disciplines, from the physical, medical and biological sciences through to the arts. Colour and the Optical Properties of Materials carefully introduces the science behind the subject, along with many modern and cutting-edge applications chosen to appeal to today's students. For science students, it provides a broad introduction to the subject and the many applications of colour. For more applied students, such as engineering and arts students, it provides the essential scientific background to colour and its many applications. Features: * Introduces the science behind the subject whilst closely connecting it to modern applications, such as colour displays, optical amplifiers and colour centre lasers * Richly illustrated with full-colour plates * Includes many worked examples, along with problems and exercises at the end of each chapter and selected answers at the back of the book * A Web site, including additional problems and full solutions to all the problems, which may be accessed at: www.cardiff.ac.uk/uwcc/engin/staff/rdjt/colour Written for students taking an introductory course in colour in a wide range of disciplines such as physics, chemistry, engineering, materials science, computer science, design, photography, architecture and textiles.
ERIC Educational Resources Information Center
Trifonas, Peter
2003-01-01
The principle of reason "as principle of grounding, foundation or institution" has tended to guide the science of research toward techno-practical ends. From this epistemic superintendence of the terms of knowledge and inquiry, there has arisen the traditional notion of academic responsibility that is tied to the pursuit of truth via a conception…
ERIC Educational Resources Information Center
Ruthven, Kenneth; Mercer, Neil; Taber, Keith S.; Guardia, Paula; Hofmann, Riikka; Ilie, Sonia; Luthman, Stefanie; Riga, Fran
2017-01-01
The "Effecting Principled Improvement in STEM Education" ["epiSTEMe"] project undertook pedagogical research aimed at improving pupil engagement and learning in early secondary school physical science and mathematics. Using principles identified as effective in the research literature and drawing on a range of existing…
Six Increasingly Higher Levels of Wellness Based on Holistic Principles and Risk Factor Science.
ERIC Educational Resources Information Center
Cassel, Russell N.
1987-01-01
Describes program for achievement of higher wellness levels based on holistic principles and risk factor science. Levels focus on (1) heart disease risk factors and how to reverse them; (2) unconscious needs at conflict with one's conscious goals; (3) identity status, meaning to love and to be loved; (4) autogenics; and (5) full ego development…
Content vs. Learning: An Old Dichotomy in Science Courses
ERIC Educational Resources Information Center
Bergtrom, Gerald
2011-01-01
The principles of course redesign that were applied to a gateway Cell Biology course at the University of Wisconsin-Milwaukee are applicable to courses large and small, and to institutions of any size. The challenge was to design a content-rich science course that kept pace with present and future content and at the same time use principles of…
Academic computer science and gender: A naturalistic study investigating the causes of attrition
NASA Astrophysics Data System (ADS)
Declue, Timothy Hall
Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before entering college, but it is at the college level that the "brain drain" is the most evident numerically, especially in the first class taken by most computer science majors called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture which is hostile to females. Two unanticipated themes related to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects the females' ability to succeed in CS-I.
NASA Astrophysics Data System (ADS)
Cook, J.
2016-12-01
MOOCs (Massive Open Online Courses) are a powerful tool, making educational content available to a large and diverse audience. The MOOC "Making Sense of Climate Science Denial" applied science communication principles derived from cognitive psychology and misconception-based learning in the design of video lectures covering many aspects of climate change. As well as teaching fundamental climate science, the course also presented psychological research into climate science denial, teaching students the most effective techniques for responding to misinformation. A number of enrolled students were secondary and tertiary educators, who adopted the course content in their own classes as well as adapted their teaching techniques based on the science communication principles presented in the lectures. I will outline how we integrated cognitive psychology, educational research and climate science in an interdisciplinary online course that has had over 25,000 enrolments from over 160 countries.
"One hundred percent efficiency": Technology and the pursuit of scientific literacy
NASA Astrophysics Data System (ADS)
King, Kenneth Paul
This dissertation examined the role of technology in science education during the twentieth century. A historical approach was taken to examine teacher practices in the use of technology. The three technologies considered in this study were the motion picture, the television, and the computer. As an organizing principle, historical definitions of "scientific literacy" were used to examine the goals of using technology within science education. The evolution of the concept of science literacy is traced from the early part of the twentieth century to the late 1990s. Documentation examined revealed the "best practices" associated with the use of technology. The use of the motion picture was traced from the silent film through film loops, videotape, videodisc and the advent of the digital video disc, and the means by which teachers used this technology were considered. The instructional use of television was examined from several different approaches: commercial broadcasts, educational and instructional programming, closed-circuit approaches and the use of cable and satellite programming. The manner in which these approaches were used to achieve goals of scientific literacy was considered. The use of the computer was examined in terms of the purpose of the software involved. Teaching practice to achieve scientific literacy was addressed through the use of computers as a means of accessing information, as an analytical tool, as a creativity tool, and as a means of communication. Similar implementation trends were present within each of these technologies: the literature supporting each technology described first a focus on the hardware, followed by the development of appropriate pedagogy, and then by the proliferation of software supporting its use. Suggestions for additional study were offered, as well as speculation as to future practices with technology in science teaching.
Investigations using expectation-value theory suggest particular promise with regard to staff development needs among teachers using technology. The convergence of the various technologies into a single entity represents one likely scenario for the use of technology within science teaching. Further developments with telecommunications may provide simple and direct delivery systems for national and/or state curricula.
NASA Astrophysics Data System (ADS)
Furuya, Haruhisa; Hiratsuka, Mitsuyoshi
This article reviews the historical transition of legal protection for computer software contracts in the United States and presents how such contracts should function under the Uniform Commercial Code and its amended Article 2B, the Uniform Computer Information Transactions Act, and the recently approved “Principles of the Law of Software Contracts”.
Computer-Game Construction: A Gender-Neutral Attractor to Computing Science
ERIC Educational Resources Information Center
Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan
2010-01-01
Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…
Deep hierarchies in the primate visual cortex: what can we learn for computer vision?
Krüger, Norbert; Janssen, Peter; Kalkan, Sinan; Lappe, Markus; Leonardis, Ales; Piater, Justus; Rodríguez-Sánchez, Antonio J; Wiskott, Laurenz
2013-08-01
Computational modeling of the primate visual system yields insights of potential relevance to some of the challenges that computer vision is facing, such as object recognition and categorization, motion detection and activity recognition, or vision-based navigation and manipulation. This paper reviews some functional principles and structures that are generally thought to underlie the primate visual cortex, and attempts to extract biological principles that could further advance computer vision research. Organized for a computer vision audience, we present functional principles of the processing hierarchies present in the primate visual system considering recent discoveries in neurophysiology. The hierarchical processing in the primate visual system is characterized by a sequence of different levels of processing (on the order of 10) that constitute a deep hierarchy in contrast to the flat vision architectures predominantly used in today's mainstream computer vision. We hope that the functional description of the deep hierarchies realized in the primate visual system provides valuable insights for the design of computer vision algorithms, fostering increasingly productive interaction between biological and computer vision research.
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter
2012-01-01
The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…
76 FR 50759 - National Science Board; Sunshine Act Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-16
... NATIONAL SCIENCE FOUNDATION National Science Board; Sunshine Act Meeting The National Science Board's Task Force on Merit Review, pursuant to NSF regulations (45 CFR Part 614), the National Science....T. SUBJECT MATTER: Discussion of proposed revisions to the draft principles and review criteria...
NASA Astrophysics Data System (ADS)
Zhuo, Zhao; Cai, Shi-Min; Tang, Ming; Lai, Ying-Cheng
2018-04-01
One of the most challenging problems in network science is to accurately detect communities at distinct hierarchical scales. Most existing methods are based on structural analysis and manipulation, which are NP-hard. We articulate an alternative, dynamical evolution-based approach to the problem. The basic principle is to computationally implement a nonlinear dynamical process on all nodes in the network with a general coupling scheme, creating a networked dynamical system. Under a proper system setting and with an adjustable control parameter, the community structure of the network would "come out" or emerge naturally from the dynamical evolution of the system. As the control parameter is systematically varied, the community hierarchies at different scales can be revealed. As a concrete example of this general principle, we exploit clustered synchronization as a dynamical mechanism through which the hierarchical community structure can be uncovered. In particular, for quite arbitrary choices of the nonlinear nodal dynamics and coupling scheme, decreasing the coupling parameter from the global synchronization regime, in which the dynamical states of all nodes are perfectly synchronized, can lead to a weaker type of synchronization organized as clusters. We demonstrate the existence of optimal choices of the coupling parameter for which the synchronization clusters encode accurate information about the hierarchical community structure of the network. We test and validate our method using a standard class of benchmark modular networks with two distinct hierarchies of communities and a number of empirical networks arising from the real world. Our method is computationally extremely efficient, eliminating completely the NP-hard difficulty associated with previous methods. The basic principle of exploiting dynamical evolution to uncover hidden community organizations at different scales represents a "game-change" type of approach to addressing the problem of community detection in complex networks.
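The clustered-synchronization mechanism described above can be sketched with a toy Kuramoto-style simulation (an illustrative sketch, not the authors' implementation; the 8-node network, natural frequencies, and coupling strength below are invented for demonstration). At an intermediate coupling, each densely connected group phase-locks internally, while the single bridge edge is too weak to lock the groups to each other, so grouping nodes by effective frequency recovers the two planted communities:

```python
import numpy as np

# Hypothetical two-community toy network: nodes 0-3 and 4-7 are fully
# connected internally; a single "bridge" edge (3,4) links the groups.
A = np.zeros((8, 8))
for block in [range(0, 4), range(4, 8)]:
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1.0
A[3, 4] = A[4, 3] = 1.0  # weak inter-community bridge

# Natural frequencies: small spread within each community, large gap between.
omega = np.array([0.9, 1.0, 1.1, 1.0, 1.9, 2.0, 2.1, 2.0])

K = 0.5          # coupling: strong enough to lock each community internally,
                 # too weak (via one edge) to lock across the frequency gap
dt, steps = 0.01, 2000
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 8)

theta_mid = None
for step in range(steps):
    # Kuramoto dynamics: dtheta_i/dt = omega_i + K * sum_j A_ij sin(theta_j - theta_i)
    dtheta = omega + K * (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta = theta + dt * dtheta          # phases kept unwrapped
    if step == steps // 2:
        theta_mid = theta.copy()

# Effective frequency of each node over the second half of the run
f_eff = (theta - theta_mid) / (dt * steps / 2)

# Nodes sharing an effective frequency form a synchronization cluster
clusters = [np.where(f_eff < 1.5)[0].tolist(), np.where(f_eff >= 1.5)[0].tolist()]
print(clusters)  # expected to recover the two planted communities
```

Sweeping `K` plays the role of the adjustable control parameter: raising it toward the bridge-locking threshold merges the clusters into global synchronization, while lowering it fragments them further.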
The Value of Methodical Management: Optimizing Science Results
NASA Astrophysics Data System (ADS)
Saby, Linnea
2016-01-01
As science progresses, making new discoveries in radio astronomy becomes increasingly complex. Instrumentation must be incredibly fine-tuned and well-understood, scientists must consider the skills and schedules of large research teams, and inter-organizational projects sometimes require coordination between observatories around the globe. Structured and methodical management allows scientists to work more effectively in this environment and leads to optimal science output. This report outlines the principles of methodical project management in general, and describes how those principles are applied at the National Radio Astronomy Observatory (NRAO) in Charlottesville, Virginia.
Data-driven Applications for the Sun-Earth System
NASA Astrophysics Data System (ADS)
Kondrashov, D. A.
2016-12-01
Advances in observational and data mining techniques allow extracting information from the large volume of Sun-Earth observational data that can be assimilated into first principles physical models. However, equations governing Sun-Earth phenomena are typically nonlinear, complex, and high-dimensional. The high computational demand of solving the full governing equations over a large range of scales precludes the use of a variety of useful assimilative tools that rely on applied mathematical and statistical techniques for quantifying uncertainty and predictability. Effective use of such tools requires the development of computationally efficient methods to facilitate fusion of data with models. This presentation will provide an overview of various existing as well as newly developed data-driven techniques adopted from atmospheric and oceanic sciences that proved to be useful for space physics applications, such as computationally efficient implementation of Kalman Filter in radiation belts modeling, solar wind gap-filling by Singular Spectrum Analysis, and low-rank procedure for assimilation of low-altitude ionospheric magnetic perturbations into the Lyon-Fedder-Mobarry (LFM) global magnetospheric model. Reduced-order non-Markovian inverse modeling and novel data-adaptive decompositions of Sun-Earth datasets will be also demonstrated.
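As a minimal illustration of the assimilative machinery mentioned above (a sketch of a generic scalar Kalman filter, not the radiation-belt or LFM implementation; the random-walk state model and noise variances are assumptions chosen for demonstration), the predict/update cycle that such computationally efficient schemes build on looks like this:

```python
import random

def kalman_1d(zs, q, r, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state x_k = x_{k-1} + w,
    Var(w) = q, observed as z_k = x_k + v, Var(v) = r."""
    x, p = x0, p0
    estimates = []
    for z in zs:
        p = p + q                # predict: prior variance grows by q
        k = p / (p + r)          # Kalman gain balances prior vs measurement
        x = x + k * (z - x)      # update toward the measurement (innovation)
        p = (1.0 - k) * p        # posterior variance shrinks after the update
        estimates.append(x)
    return estimates

# Noisy measurements of a constant signal (true value 5.0, unit noise)
rng = random.Random(1)
zs = [5.0 + rng.gauss(0.0, 1.0) for _ in range(200)]
est = kalman_1d(zs, q=1e-5, r=1.0)
print(round(est[-1], 2))  # settles near the true value 5.0
```

The production systems cited in the abstract replace this scalar recursion with low-rank or ensemble approximations precisely because the full covariance update is infeasible at Sun-Earth model dimensions.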
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katz, Daniel S; Jha, Shantenu; Weissman, Jon
2017-01-31
This is the final technical report for the AIMES project. Many important advances in science and engineering are due to large-scale distributed computing. Notwithstanding this reliance, we are still learning how to design and deploy large-scale production Distributed Computing Infrastructures (DCI). This is evidenced by missing design principles for DCI, and an absence of generally acceptable and usable distributed computing abstractions. The AIMES project was conceived against this backdrop, following on the heels of a comprehensive survey of scientific distributed applications. AIMES laid the foundations to address the tripartite challenge of dynamic resource management, integrating information, and portable and interoperable distributed applications. Four abstractions were defined and implemented: skeleton, resource bundle, pilot, and execution strategy. The four abstractions were implemented into software modules and then aggregated into the AIMES middleware. This middleware successfully integrates information across the application layer (skeletons) and resource layer (Bundles), derives a suitable execution strategy for the given skeleton and enacts its execution by means of pilots on one or more resources, depending on the application requirements, and resource availabilities and capabilities.
The Schizophrenic Theme in Science Fiction
1965-06-01
science fiction, that certain themes such as super powers, telepathy, being influenced by external agencies, conspiracy, etc., bear only a... prohibition. This war of the instinctual drives (pleasure principle) with the reality principle was manifested in every aspect of the personality. Dreams, F. D... 2. Somniomorph: Literary productions depicting events of such a nature and connected in such a manner as they typically occur in dreams. Much of the
A Financial Technology Entrepreneurship Program for Computer Science Students
ERIC Educational Resources Information Center
Lawler, James P.; Joseph, Anthony
2011-01-01
Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…
Designing integrated computational biology pipelines visually.
Jamil, Hasan M
2013-01-01
The long-term cost of developing and maintaining a computational pipeline that depends upon data integration and sophisticated workflow logic is too high to even contemplate "what if" or ad hoc type queries. In this paper, we introduce a novel application building interface for computational biology research, called VizBuilder, by leveraging a recent query language called BioFlow for life sciences databases. Using VizBuilder, it is now possible to develop ad hoc complex computational biology applications at throwaway costs. The underlying query language supports data integration and workflow construction almost transparently and fully automatically, using a best effort approach. Users express their application by drawing it with VizBuilder icons and connecting them in a meaningful way. Completed applications are compiled and translated as BioFlow queries for execution by the data management system LifeDB, for which VizBuilder serves as a front end. We discuss VizBuilder features and functionalities in the context of a real life application after we briefly introduce BioFlow. The architecture and design principles of VizBuilder are also discussed. Finally, we outline future extensions of VizBuilder. To our knowledge, VizBuilder is a unique system that allows visually designing computational biology pipelines involving distributed and heterogeneous resources in an ad hoc manner.
ERIC Educational Resources Information Center
Menekse, Muhsin
2015-01-01
While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…
NASA Technical Reports Server (NTRS)
Charles, H. K. Jr; Beck, T. J.; Feldmesser, H. S.; Magee, T. C.; Spisz, T. S.; Pisacane, V. L.
2001-01-01
An advanced, multiple projection, dual energy x-ray absorptiometry (AMPDXA) scanner system is under development. The AMPDXA is designed to make precision bone and muscle loss measurements necessary to determine the deleterious effects of microgravity on astronauts as well as develop countermeasures to stem their bone and muscle loss. To date, a full size test system has been developed to verify principles and the results of computer simulations. Results indicate that accurate predictions of bone mechanical properties can be determined from as few as three projections, while more projections are needed for a complete, three-dimensional reconstruction. © 2001 Elsevier Science Ltd. All rights reserved.
Impact of a process improvement program in a production software environment: Are we any better?
NASA Technical Reports Server (NTRS)
Heller, Gerard H.; Page, Gerald T.
1990-01-01
For the past 15 years, Computer Sciences Corporation (CSC) has participated in a process improvement program as a member of the Software Engineering Laboratory (SEL), which is sponsored by GSFC. The benefits CSC has derived from involvement in this program are analyzed. In the environment studied, it shows that improvements were indeed achieved, as evidenced by a decrease in error rates and costs over a period in which both the size and the complexity of the developed systems increased substantially. The principles and mechanics of the process improvement program, the lessons CSC has learned, and how CSC has capitalized on these lessons are also discussed.
1998-08-07
cognitive flexibility theory and generative learning theory which focus primarily on the individual student's cognitive development, collaborative... develop "Handling Transfusion Hazards," a computer program based upon cognitive flexibility theory principles. The Program: Handling Transfusion Hazards... computer program was developed according to cognitive flexibility theory principles. A generative version was then developed by embedding
Theme: Physical Science in Agriscience--The New Ag Mech.
ERIC Educational Resources Information Center
Buriak, Phil; And Others
1992-01-01
Seven theme articles discuss strategies for teaching mechanics, physical sciences in the study of foods, scientific principles in the agricultural curriculum, environmental issues in agriculture, and applied physical sciences. (SK)
From academic to applied: Operationalising resilience in river systems
NASA Astrophysics Data System (ADS)
Parsons, Melissa; Thoms, Martin C.
2018-03-01
The concept of resilience acknowledges the ability of societies to live and develop with dynamic environments. Given the recognition of the need to prepare for anticipated and unanticipated shocks, applications of resilience are increasing as the guiding principle of public policy and programs in areas such as disaster management, urban planning, natural resource management, and climate change adaptation. River science is an area in which the adoption of resilience is increasing, leading to the proposition that resilience may become a guiding principle of river policy and programs. Debate about the role of resilience in rivers is part of the scientific method, but disciplinary disunity about the ways to approach resilience application in policy and programs may leave river science out of the policy process. We propose six elements that need to be considered in the design and implementation of resilience-based river policy and programs: rivers as social-ecological systems; the science-policy interface; principles, capacities, and characteristics of resilience; cogeneration of knowledge; adaptive management; and the state of the science of resilience.
ERIC Educational Resources Information Center
Freudenrich, Craig C.
2000-01-01
Recommends using science fiction television episodes, novels, and films for teaching science and motivating students. Studies Newton's Law of Motion, principles of relativity, journey to Mars, interplanetary trajectories, artificial gravity, and Martian geology. Discusses science fiction's ability to capture student interest and the advantages of…
Local Production: Principles and Practice
ERIC Educational Resources Information Center
Whittell, J. M. S.
1975-01-01
Presents the problems of Third World countries in acquiring science equipment to augment their science curriculum development plans. Outlines an attempt by Kenya Science Teachers College to produce and supply science equipment. Describes the approach to production, quality control, and costing and sales. (GS)
Long-time atomistic simulations with the Parallel Replica Dynamics method
NASA Astrophysics Data System (ADS)
Perez, Danny
Molecular Dynamics (MD) -- the numerical integration of atomistic equations of motion -- is a workhorse of computational materials science. Indeed, MD can in principle be used to obtain any thermodynamic or kinetic quantity, without introducing any approximation or assumptions beyond the adequacy of the interaction potential. It is therefore an extremely powerful and flexible tool to study materials with atomistic spatio-temporal resolution. These enviable qualities however come at a steep computational price, hence limiting the system sizes and simulation times that can be achieved in practice. While the size limitation can be efficiently addressed with massively parallel implementations of MD based on spatial decomposition strategies, allowing for the simulation of trillions of atoms, the same approach usually cannot extend the timescales much beyond microseconds. In this article, we discuss an alternative parallel-in-time approach, the Parallel Replica Dynamics (ParRep) method, that aims at addressing the timescale limitation of MD for systems that evolve through rare state-to-state transitions. We review the formal underpinnings of the method and demonstrate that it can provide arbitrarily accurate results for any definition of the states. When an adequate definition of the states is available, ParRep can simulate trajectories with a parallel speedup approaching the number of replicas used. We demonstrate the usefulness of ParRep by presenting different examples of materials simulations where access to long timescales was essential to access the physical regime of interest and discuss practical considerations that must be addressed to carry out these simulations. Work supported by the United States Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences, Materials Sciences and Engineering Division.
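The ParRep bookkeeping described above can be illustrated with a toy memoryless-escape model (a sketch under simplifying assumptions, not the production algorithm: real ParRep also requires dephasing and decorrelation stages, and the escape rate here is an invented parameter). Each of the replicas explores the same metastable state independently, and when the first one makes a transition, the simulation clock advances by the per-replica time multiplied by the number of replicas:

```python
import random

def parrep_escape(rate, n_replicas, dt, rng):
    """One ParRep cycle for a single metastable state (toy model):
    each replica escapes in a time step with probability rate*dt.
    When the first replica escapes, the accumulated simulation time is
    n_replicas * per-replica time -- the bookkeeping that gives ParRep
    its near-linear parallel speedup for rare state-to-state transitions."""
    t_replica = 0.0
    while True:
        t_replica += dt
        for _ in range(n_replicas):
            if rng.random() < rate * dt:
                return n_replicas * t_replica

# Averaged over many cycles, the parallel estimate reproduces the
# serial mean escape time 1/rate, while each cycle's wall-clock cost
# scales with t_replica, i.e. roughly 1/n_replicas of the serial cost.
rng = random.Random(42)
times = [parrep_escape(rate=0.5, n_replicas=8, dt=0.01, rng=rng)
         for _ in range(2000)]
mean_escape = sum(times) / len(times)
print(round(mean_escape, 2))  # statistically consistent with 1/rate = 2.0
```

The validity of this time rescaling rests on the escape process being memoryless, which is exactly the property the dephasing stage of the full method is designed to establish.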
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Qiuxia; Wang, Jianguo; Wang, Yang-Gang
The effects of structure and size on the selectivity of catalytic furfural conversion over supported Pt catalysts in the presence of hydrogen have been studied using first principles density functional theory (DFT) calculations and microkinetic modeling. Four Pt model systems, i.e., periodic Pt(111), Pt(211) surfaces, as well as small nanoclusters (Pt13 and Pt55) are chosen to represent the terrace, step, and corner sites of Pt nanoparticles. Our DFT results show that the reaction routes for furfural hydrogenation and decarbonylation are strongly dependent on the type of reactive sites, which lead to the different selectivity. On the basis of the size-dependent site distribution rule, we correlate the site distributions as a function of the Pt particle size. Our microkinetic results indicate the critical particle size that controls the furfural selectivity is about 1.0 nm, which is in good agreement with the reported experimental value under reaction conditions. This work was supported by National Basic Research Program of China (973 Program) (2013CB733501) and the National Natural Science Foundation of China (NSFC-21306169, 21176221, 21136001, 21101137 and 91334103). This work was also partially supported by the US Department of Energy (DOE), the Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL). EMSL is a national scientific user facility located at Pacific Northwest National Laboratory (PNNL) and sponsored by DOE's Office of Biological and Environmental Research.
A Monte Carlo Simulation of Brownian Motion in the Freshman Laboratory
ERIC Educational Resources Information Center
Anger, C. D.; Prescott, J. R.
1970-01-01
Describes a "dry-lab" experiment for the college freshman laboratory, in which the essential features of Brownian motion are demonstrated from first principles using the Monte Carlo technique. Calculations are carried out by a computation scheme based on computer language. Bibliography. (LC)
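The essence of such a Monte Carlo treatment of Brownian motion can be sketched as follows (a modern illustrative re-creation, not the 1970 computation scheme; the particle and step counts are arbitrary): each particle takes unit random steps, and the mean squared displacement grows linearly with the number of steps, reproducing the diffusive signature students are meant to observe.

```python
import random

def brownian_msd(n_particles, n_steps, seed=0):
    """Monte Carlo estimate of the mean squared displacement (MSD) of
    1-D random walkers taking unit +/-1 steps; random-walk theory
    predicts MSD = n_steps."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_particles):
        x = 0
        for _ in range(n_steps):
            x += rng.choice((-1, 1))  # one unbiased unit step
        total += x * x
    return total / n_particles

msd = brownian_msd(n_particles=5000, n_steps=100)
print(round(msd, 1))  # close to the theoretical value of 100
```

Plotting the MSD against `n_steps` for several step counts yields the straight line whose slope encodes the diffusion coefficient, which is the usual takeaway of the freshman exercise.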
Using Multimedia for E-Learning
ERIC Educational Resources Information Center
Mayer, R. E.
2017-01-01
This paper reviews 12 research-based principles for how to design computer-based multimedia instructional materials to promote academic learning, starting with the multimedia principle (yielding a median effect size of d = 1.67 based on five experimental comparisons), which holds that people learn better from computer-based instruction containing…
Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?
ERIC Educational Resources Information Center
Schrock, John Richard
1984-01-01
Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" related to laboratory computer and "extended…
Selin, Cynthia; Rawlings, Kelly Campbell; de Ridder-Vignone, Kathryn; Sadowski, Jathan; Altamirano Allende, Carlo; Gano, Gretchen; Davies, Sarah R; Guston, David H
2017-08-01
Public engagement with science and technology is now widely used in science policy and communication. Touted as a means of enhancing democratic discussion of science and technology, analysis of public engagement with science and technology has shown that it is often weakly tied to scientific governance. In this article, we suggest that the notion of capacity building might be a way of reframing the democratic potential of public engagement with science and technology activities. Drawing on literatures from public policy and administration, we outline how public engagement with science and technology might build citizen capacity, before using the notion of capacity building to develop five principles for the design of public engagement with science and technology. We demonstrate the use of these principles through a discussion of the development and realization of the pilot for a large-scale public engagement with science and technology activity, the Futurescape City Tours, which was carried out in Arizona in 2012.
Lessons from a broad view of science: a response to Dr Robergs’ article
Pires, Flavio Oliveira
2018-01-01
Dr Robergs suggested that the central governor model (CGM) is not a well-worded theory, as it deviated from the tenet of falsification criteria. According to his view of science, exercise research with the intent to prove rather than disprove the theory contributes little to new knowledge and condemns the theory to the label of pseudoscience. However, exercise scientists should be aware of limitations of the falsification criteria. First, the number of potential falsifiers for a given hypothesis is always infinite so that there is no means to ensure asymmetric comparison between theories. Thus, assuming a competition between CGM and dichotomised central versus peripheral fatigue theories, scientists guided by the falsification principle should know, a priori, all possible falsifiers between these two theories in order to choose the finest one, thereby leading to an oversimplification of the theories. Second, the failure to formulate a refutable hypothesis may be a simple consequence of the lack of instruments to make crucial measurements. The use of refutation principles to test the CGM theory requires capable technology for online feedback and feedforward measures integrated in the central nervous system, in a real-time exercise. Consequently, the falsification principle is currently impracticable to test CGM theory. The falsification principle must be applied with equilibrium, as we should do with positive induction process, otherwise Popperian philosophy will be incompatible with the actual practice in science. Rather than driving the scientific debate on a biased single view of science, researchers in the field of exercise sciences may benefit more from different views of science. PMID:29629188
Application of Cognitive Science Principles: Instructional Heuristics and Mechanisms for Use.
ERIC Educational Resources Information Center
Montague, William E.
Cognitive science is briefly reviewed, and its implications for instructional design are discussed. The application of cognitive science to instruction requires knowledge of cognitive science, the subject content taught, and the system in which the instruction is imbedded. The central concept of cognitive science is mental representation--the…
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April, 1986 through September 30, 1986 is summarized.
78 FR 10180 - Annual Computational Science Symposium; Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-13
Annual Computational Science Symposium; Conference. AGENCY: Food and Drug Administration, HHS. ACTION... Computational Science Symposium.'' The purpose of the conference is to help the broader community align and share experiences to advance computational science. At the conference, which will bring together FDA...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hules, John
This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review of the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.
Principles of effective USA federal fire management plans
Meyer, Marc D.; Roberts, Susan L.; Wills, Robin; Brooks, Matthew L.; Winford, Eric M.
2015-01-01
Federal fire management plans are essential implementation guides for the management of wildland fire on federal lands. Recent changes in federal fire policy implementation guidance and fire science information suggest the need for substantial changes in federal fire management plans of the United States. Federal land management agencies are also undergoing land management planning efforts that will initiate revision of fire management plans across the country. Using the southern Sierra Nevada as a case study, we briefly describe the underlying framework of fire management plans, assess their consistency with guiding principles based on current science information and federal policy guidance, and provide recommendations for the development of future fire management plans. Based on our review, we recommend that future fire management plans be: (1) consistent and compatible, (2) collaborative, (3) clear and comprehensive, (4) spatially and temporally scalable, (5) informed by the best available science, and (6) flexible and adaptive. In addition, we identify and describe several strategic guides or “tools” that can enhance these core principles and benefit future fire management plans in the following areas: planning and prioritization, science integration, climate change adaptation, partnerships, monitoring, education and communication, and applied fire management. These principles and tools are essential to successfully realize fire management goals and objectives in a rapidly changing world.
1993-03-25
application of Object-Oriented Programming (OOP) and Human-Computer Interface (HCI) design principles. Knowledge gained from each topic has been incorporated... Knowledge gained from each is applied to the design of a Form-based interface for database data
Top 20 Psychological Principles for PK-12 Education
ERIC Educational Resources Information Center
Lucariello, Joan M.; Nastasi, Bonnie K.; Dwyer, Carol; Skiba, Russell; DeMarie, Darlene; Anderman, Eric M.
2016-01-01
This article describes an initiative undertaken by a coalition of psychologists (Coalition for Psychology in Schools and Education) from the American Psychological Association (APA) to identify the top 20 principles from psychological science relevant to teaching and learning in the classroom. This article identifies these principles and their…
The Jet Principle: Technologies Provide Border Conditions for Global Learning
ERIC Educational Resources Information Center
Ahamer, Gilbert
2012-01-01
Purpose: The purpose of this paper is to first define the "jet principle" of (e-)learning as providing dynamically suitable framework conditions for enhanced learning procedures that combine views from multiple cultures of science. Second it applies this principle to the case of the "Global Studies" curriculum, a unique…
Cognitive Science Implications for Enhancing Training Effectiveness in a Serious Gaming Context
ERIC Educational Resources Information Center
Greitzer, Frank L.; Kuchar, Olga Anna; Huston, Kristy
2007-01-01
Serious games use entertainment principles, creativity, and technology to meet government or corporate training objectives, but these principles alone will not guarantee that the intended learning will occur. To be effective, serious games must incorporate sound cognitive, learning, and pedagogical principles into their design and structure. In…
ERIC Educational Resources Information Center
Howe, Christine; Luthman, Stefanie; Ruthven, Kenneth; Mercer, Neil; Hofmann, Riikka; Ilie, Sonia; Guardia, Paula
2015-01-01
Reflecting concerns about student attainment and participation in mathematics and science, the Effecting Principled Improvement in STEM Education ("epiSTEMe") project attempted to support pedagogical advancement in these two disciplines. Using principles identified as effective in the research literature (and combining these in a novel…
Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations
ERIC Educational Resources Information Center
Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa
2013-01-01
The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…
A Web of Resources for Introductory Computer Science.
ERIC Educational Resources Information Center
Rebelsky, Samuel A.
As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…
NASA Technical Reports Server (NTRS)
1988-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nasseh, Bizhan
Ball State University (BSU) was the recipient of a U.S. Department of Energy award to develop educational games teaching science and math. The Science Media Program will merge Ball State University's nationally recognized capabilities in education, technology, and communication to develop new, interactive, game-based media for the teaching and learning of science and scientific principles for K-12 students. BSU established a team of educators, researchers, scientists, animators, designers, technology specialists, and hired a professional media developer company (Outside Source Design) from Indianapolis. After six months discussions and assessments the project team selected the following 8 games in Math, Physics, Chemistry, and Biology, 2 from each discipline. The assembled teams were innovative and unique. This new model of development and production included a process that integrated all needed knowledge and expertise for the development of high quality science and math games for K-12 students. This new model has potential to be used by others for the development of the educational games. The uniqueness of the model is to integrate domain experts' knowledge with researchers/quality control group, and combine a professional development team from the game development company with the academic game development team from Computer Science and Art departments at Ball State University.
Innovative approach towards understanding optics
NASA Astrophysics Data System (ADS)
Garg, Amit; Bharadwaj, Sadashiv Raj; Kumar, Raj; Shudhanshu, Avinash Kumar; Verma, Deepak Kumar
2016-01-01
Over the last few years, there has been a decline in the students' interest towards Science and Optics. Use of technology in the form of various types of sensors and data acquisition systems has come as a saviour. To date, manual routine tools and techniques are used to perform various experimental procedures in most of the science/optics laboratories in our country. The manual tools are cumbersome whereas the automated ones are costly. It does not enthuse young researchers towards the science laboratories. There is a need to develop applications which can be easily integrated, tailored at school and undergraduate level laboratories and are economical at the same time. Equipment with advanced technologies is available but is uneconomical and has a complicated working principle with a black-box approach. The present work describes development of portable tools and applications which are user-friendly. This is being implemented using an open-source physical computing platform based on a simple low cost microcontroller board and a development environment for writing software. The present paper reports the development of an automated spectrometer, an instrument used in almost all optics experiments at undergraduate level, and students' response to this innovation. These tools will inspire young researchers towards science and facilitate development of advanced low-cost equipment, making life easier for Indian as well as developing nations.
Tools for open geospatial science
NASA Astrophysics Data System (ADS)
Petras, V.; Petrasova, A.; Mitasova, H.
2017-12-01
Open science uses open source to deal with reproducibility challenges in the data and computational sciences. However, just using open source software or making the code public does not make the research reproducible. Moreover, scientists face the challenge of learning new, unfamiliar tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools which make validation and reuse by a wider professional community possible. For novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and lab notebook. We will see how Git helps us not to get lost in revisions and how Docker is used to wrap all the parts together using a single text file so that figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain which use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.
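A minimal sketch of the fully scripted analysis style this syllabus advocates: the summary is a pure function of its inputs, so a rerun (locally, in a notebook, or inside a Docker container) yields identical numbers. The data values are made up for illustration:

```python
import statistics

def summarize(elevations):
    """Pure function of its inputs: rerunning on the same data
    always yields the same numbers, the core of a reproducible,
    single-command workflow."""
    return {"n": len(elevations),
            "mean": round(statistics.mean(elevations), 2),
            "stdev": round(statistics.stdev(elevations), 2)}

# made-up elevation samples; the same call always prints the same summary
print(summarize([102.0, 98.5, 101.2, 99.8]))
```

The same idea scales up: when every figure and table is produced by a script with no manual steps, wrapping the scripts in a container makes the whole paper regenerable with one command.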
High school computer science education paves the way for higher education: the Israeli case
NASA Astrophysics Data System (ADS)
Armoni, Michal; Gal-Ezer, Judith
2014-07-01
The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to computer science in high school and pursuing computing in higher education. We also examine the gender gap, in the context of high school computer science education. We show that in Israel, students who took the high-level computer science matriculation exam were more likely to pursue computing in higher education. Regarding the issue of gender, we will show that, in general, in Israel the difference between males and females who take computer science in high school is relatively small, and a larger, though still not very large difference exists only for the highest exam level. In addition, exposing females to high-level computer science in high school has more relative impact on pursuing higher education in computing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mei, Donghai
2013-05-20
Molecular adsorption of formate and carboxyl on the stoichiometric CeO2(111) and CeO2(110) surfaces was studied using periodic density functional theory (DFT+U) calculations. Two distinguishable adsorption modes (strong and weak) of formate are identified. The bidentate configuration is more stable than the monodentate adsorption configuration. Both formate and carboxyl bind more strongly at the more open CeO2(110) surface. The calculated vibrational frequencies of the two adsorbed species are consistent with experimental measurements. Finally, the effects of the U parameter on the adsorption of formate and carboxyl over both CeO2 surfaces were investigated. We found that the geometrical configurations of the two adsorbed species are not affected by using different U parameters (U = 0, 5, and 7). However, the calculated adsorption energy of carboxyl increases markedly with the U value, while the adsorption energy of formate changes only slightly (<0.2 eV). Bader charge analysis shows that opposite charge transfer occurs for formate and carboxyl adsorption: the adsorbed formate is negatively charged while the adsorbed carboxyl is positively charged. Interestingly, with increasing U parameter, the amount of transferred charge also increases. This work was supported by the Laboratory Directed Research and Development (LDRD) project of the Pacific Northwest National Laboratory (PNNL) and by a Cooperative Research and Development Agreement (CRADA) with General Motors. The computations were performed using the Molecular Science Computing Facility in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL), which is a U.S. Department of Energy national scientific user facility located at PNNL in Richland, Washington. Part of the computing time was also granted by the National Energy Research Scientific Computing Center (NERSC).
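The binding-strength comparisons above rest on the standard adsorption-energy bookkeeping, which can be sketched as follows; the total energies are hypothetical placeholders, not values from this study:

```python
def adsorption_energy(e_slab_plus_ads, e_slab, e_ads):
    """E_ads = E(surface+adsorbate) - E(surface) - E(adsorbate);
    a more negative value means stronger binding."""
    return e_slab_plus_ads - e_slab - e_ads

# hypothetical DFT total energies (eV) for one adsorbate on two surfaces
e_110 = adsorption_energy(-1052.75, -1048.10, -2.40)
e_111 = adsorption_energy(-1050.95, -1046.85, -2.40)
print(e_110 < e_111)  # placeholder numbers chosen so (110) binds more strongly
```

In a DFT+U sensitivity study, this same difference would be recomputed at each U value to see how much the adsorption energy shifts.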
Defining Computational Thinking for Mathematics and Science Classrooms
NASA Astrophysics Data System (ADS)
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-02-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.
NASA Astrophysics Data System (ADS)
Britton, Lynda A.
1998-12-01
Exploration of meaningful learning of the polymerase chain reaction (PCR) followed instruction by a researcher-developed hypermedia computer program that incorporated human constructivist principles and a "science-in-fiction" chapter of a novel that described PCR. Human constructivism is the Ausubel-Novak-Gowin (1997) meaningful learning theory that supports science learning through graphic representations and multiple examples. Science-in-fiction is a new genre of fiction introduced by the prominent scientist Carl Djerassi to engender an appreciation for science and its ethical dilemmas. Chapter 19 of Djerassi's 1994 novel, The Bourbaki Gambit, was placed into hypermedia format to standardize the presentation. As part of a clinical microbiology course in the medical technology curriculum at a major medical center in the Deep South, 10 undergraduates participated in this study. Each participant first read The Bourbaki Gambit; half then experienced the human constructivist approach first (the PCR group), while the others first encountered the science-in-fiction approach (the Chapter 19 group). The order was then reversed, so that all participants experienced both programs. Students' explanations while using the computer were videotaped. Students were tested and interviewed before experiencing either program, after their first instructional session, and again after the second instructional session. These students were also assessed on their knowledge of the nature of science by taking the Nature of Science Questionnaire (Roach, 1993) before and after instruction, and were interviewed as a cross-check on its reliability. Students' preferred learning approaches were determined using Schmeck's Inventory of Learning Processes (Schmeck, Ribich, & Ramanaiah, 1977). Data were collected and analyzed both qualitatively and quantitatively using appropriate verbal analysis techniques (Chi, 1997).
All but three students reached a structural level of PCR biological literacy. A mean of 79% of the concepts identified as necessary was attained by participants after experiencing both approaches. The Chapter 19 science-in-fiction group scored slightly better than those who experienced the PCR program first, indicating that the chapter served as an advance organizer when used first, but inhibited mastery when used second. Significant conceptual change about the nature of science was not detected, even though most students demonstrated deep and/or elaborative learning styles.
The Case of the Soft-Shelled Egg.
ERIC Educational Resources Information Center
Cocanour, Barbara; Bruce, Alease S.
1986-01-01
Offers suggestions for activities that demonstrate the principles of osmosis. Explains how decalcified chicken eggs can be used to give students practice with measurement, experimental procedures, and science principles. (ML)
ShunLi Shang; Louis G. Hector Jr.; Paul Saxe; Zi-Kui Liu; Robert J. Moon; Pablo D. Zavattieri
2014-01-01
Anisotropy and temperature dependence of structural, thermodynamic and elastic properties of crystalline cellulose Iβ were computed with first-principles density functional theory (DFT) and a semi-empirical correction for van der Waals interactions. Specifically, we report the computed temperature variation (up to 500...
An Undergraduate Course on Operating Systems Principles.
ERIC Educational Resources Information Center
National Academy of Engineering, Washington, DC. Commission on Education.
This report is from Task Force VIII of the COSINE Committee of the Commission on Education of the National Academy of Engineering. The task force was established to formulate subject matter for an elective undergraduate subject on computer operating systems principles for students whose major interest is in the engineering of computer systems and…
"Citizen Jane": Rethinking Design Principles for Closing the Gender Gap in Computing.
ERIC Educational Resources Information Center
Raphael, Chad
This paper identifies three rationales in the relevant literature for closing the gender gap in computing: economic, cultural and political. Each rationale implies a different set of indicators of present inequalities, disparate goals for creating equality, and distinct principles for software and web site design that aims to help girls overcome…
Extinction from a Rationalist Perspective
Gallistel, C. R.
2012-01-01
The merging of the computational theory of mind and evolutionary thinking leads to a kind of rationalism, in which enduring truths about the world have become implicit in the computations that enable the brain to cope with the experienced world. The dead reckoning computation, for example, is implemented within the brains of animals as one of the mechanisms that enables them to learn where they are (Gallistel, 1990, 1995). It integrates a velocity signal with respect to a time signal. Thus, the manner in which position and velocity relate to one another in the world is reflected in the manner in which signals representing those variables are processed in the brain. I use principles of information theory and Bayesian inference to derive from other simple principles explanations for: 1) the failure of partial reinforcement to increase reinforcements to acquisition; 2) the partial reinforcement extinction effect; 3) spontaneous recovery; 4) renewal; 5) reinstatement; 6) resurgence (aka facilitated reacquisition). Like the principle underlying dead-reckoning, these principles are grounded in analytic considerations. They are the kind of enduring truths about the world that are likely to have shaped the brain's computations. PMID:22391153
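The dead-reckoning computation described here, integrating a velocity signal with respect to time, can be sketched as a discrete sum; the sample values are invented for illustration:

```python
def dead_reckon(start, samples):
    """Update position by integrating (velocity, duration) samples,
    mirroring how position and velocity relate in the world."""
    position = start
    for velocity, dt in samples:
        position += velocity * dt
    return position

# start at 0 m: 2 m/s for 3 s, then -1 m/s for 1 s -> 5 m
print(dead_reckon(0.0, [(2.0, 3.0), (-1.0, 1.0)]))
```

The point of the example is the one Gallistel makes: the arithmetic relation between velocity and position in the world is mirrored in the processing of the signals that represent them.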
NASA Astrophysics Data System (ADS)
Hypolite, Christine Collins
The purpose of this research was to determine how an inquiry-based, whole-plant instructional strategy would affect preservice elementary teachers' understanding of plant science principles. This study probed: what preservice teachers know about plant biology concepts before and after instruction, their views of the interrelatedness of plant parts and the environment, how growing a plant affects preservice teachers' understanding, and which types of activity-rich plant themes studies, if any, affect preservice elementary teachers' understandings. The participants in the study were enrolled in two elementary science methods class sections at a state university. Each group was administered a preinstructional test at the beginning of the study. The treatment group participated in inquiry-based activities related to the Principles of Plant Biology (American Society of Plant Biologists, 2001), while the comparison group studied those same concepts through traditional instructional methods. A focus group was formed from the treatment group to participate in co-concept mapping sessions. The participants' understandings were assessed through artifacts from activities, a comparison of pre- and postinstructional tests, and the concept maps generated by the focus group. Results of the research indicated that the whole-plant, inquiry-based instructional strategy can be applied to teach preservice elementary teachers plant biology while modeling the human constructivist approach. The results further indicated that this approach enhanced their understanding of plant science content knowledge, as well as pedagogical knowledge. The results also showed that a whole-plant approach to teaching plant science concepts is an instructional strategy that is feasible for the elementary school. The theoretical framework for this study was Human Constructivist learning theory (Mintzes & Wandersee, 1998). 
The content knowledge and instructional strategy were informed by the Principles of Plant Biology (American Society of Plant Biologists, 2001) and Botany for the Next Millennium (Botanical Society of America, 1995). As a result of this study, a better understanding of the factors that influence preservice elementary teachers' knowledge of plant science principles may benefit elementary science educators in preparing teachers who are "highly qualified."
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Bernier, David
2011-01-01
Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…
NASA Technical Reports Server (NTRS)
1989-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.
Forensic facial comparison in South Africa: State of the science.
Steyn, M; Pretorius, M; Briers, N; Bacci, N; Johnson, A; Houlton, T M R
2018-06-01
Forensic facial comparison (FFC) is a scientific technique used to link suspects to a crime scene based on the analysis of photos or video recordings from that scene. While basic guidelines on practice and training are provided by the Facial Identification Scientific Working Group, details of how these are applied across the world are scarce. FFC is frequently used in South Africa, with more than 700 comparisons conducted in the last two years alone. In this paper the standards of practice are outlined, with new proposed levels of agreement/conclusions. We outline three levels of training that were established, with training in facial anatomy, terminology, principles of image comparison, image science, facial recognition and computer skills being aimed at developing general competency. Training in generating court charts and understanding court case proceedings are being specifically developed for the South African context. Various shortcomings still exist, specifically with regard to knowledge of the reliability of the technique. These need to be addressed in future research. Copyright © 2018 Elsevier B.V. All rights reserved.
The precautionary principle in environmental science.
Kriebel, D; Tickner, J; Epstein, P; Lemons, J; Levins, R; Loechler, E L; Quinn, M; Rudel, R; Schettler, T; Stoto, M
2001-01-01
Environmental scientists play a key role in society's responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of an activity; exploring a wide range of alternatives to possibly harmful actions; and increasing public participation in decision making. In this paper we examine the implications of the precautionary principle for environmental scientists, whose work often involves studying highly complex, poorly understood systems, while at the same time facing conflicting pressures from those who seek to balance economic growth and environmental protection. In this complicated and contested terrain, it is useful to examine the methodologies of science and to consider ways that, without compromising integrity and objectivity, research can be more or less helpful to those who would act with precaution. We argue that a shift to more precautionary policies creates opportunities and challenges for scientists to think differently about the ways they conduct studies and communicate results. There is a complicated feedback relation between the discoveries of science and the setting of policy. While maintaining their objectivity and focus on understanding the world, environmental scientists should be aware of the policy uses of their work and of their social responsibility to do science that protects human health and the environment. The precautionary principle highlights this tight, challenging linkage between science and policy. PMID:11673114
Professional Ethics in Astronomy: The AAS Ethics Statement
NASA Astrophysics Data System (ADS)
Marvel, Kevin B.
2013-01-01
It is fundamental to the advancement of science that practicing scientists adhere to a consistent set of professional ethical principles. Recent violations of these principles have led to decreased trust in the process of science and in scientific results. Although astronomy is less in the spotlight on these issues than medical science or climate change research, it is still incumbent on the field to follow a sound scientific process guided by basic ethical guidelines. The American Astronomical Society developed a set of such guidelines in 2010. This contribution summarizes the motivation and process by which the AAS Ethics Statement was produced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, W. E.
2004-08-16
Computational science plays a major role in research and development in mathematics, science, engineering, and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE, which supported computational science activities at the university. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Laboratory, on-campus research involving undergraduates, participation of undergraduates and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one in a master's program and two in doctoral programs).
Visualizations and Mental Models - The Educational Implications of GEOWALL
NASA Astrophysics Data System (ADS)
Rapp, D.; Kendeou, P.
2003-12-01
Work in the earth sciences has outlined many of the faulty beliefs that students possess concerning particular geological systems and processes. Evidence from educational and cognitive psychology has demonstrated that students often have difficulty overcoming their naïve beliefs about science. Prior knowledge is often remarkably resistant to change, particularly when students' existing mental models for geological principles may be faulty or inaccurate. Figuring out how to help students revise their mental models to include appropriate information is a major challenge. Up until this point, research has tended to focus on whether 2-dimensional computer visualizations are useful tools for helping students develop scientifically correct models. Research suggests that when students are given the opportunity to use dynamic computer-based visualizations, they are more likely to recall the learned information, and are more likely to transfer that knowledge to novel settings. Unfortunately, 2-dimensional visualization systems are often inadequate representations of the material that educators would like students to learn. For example, a 2-dimensional image of the Earth's surface does not adequately convey particular features that are critical for visualizing the geological environment. This may limit the models that students can construct following these visualizations. GEOWALL is a stereo projection system that has attempted to address this issue. It can display multidimensional static geologic images and dynamic geologic animations in a 3-dimensional format. Our current research examines whether multidimensional visualization systems such as GEOWALL may facilitate learning by helping students to develop more complex mental models. This talk will address some of the cognitive issues that influence the construction of mental models, and the difficulty of updating existing mental models.
We will also discuss our current work that seeks to examine whether GEOWALL is an effective tool for helping students to learn geological information (and potentially restructure their naïve conceptions of geologic principles).
NASA Astrophysics Data System (ADS)
Koch, Melissa; Gorges, Torie
2016-10-01
Underrepresented populations such as women, African-Americans, and Latinos/as often come to STEM (science, technology, engineering, and mathematics) careers by less traditional paths than White and Asian males. To better understand how and why women might shift toward STEM, particularly computer science, careers, we investigated the education and career direction of afterschool facilitators, primarily women of color in their twenties and thirties, who taught Build IT, an afterschool computer science curriculum for middle school girls. Many of these women indicated that implementing Build IT had influenced their own interest in technology and computer science and in some cases had resulted in their intent to pursue technology and computer science education. We wanted to explore the role that teaching Build IT may have played in activating or reactivating interest in careers in computer science and to see whether in the years following implementation of Build IT, these women pursued STEM education and/or careers. We reached nine facilitators who implemented the program in 2011-12 or shortly after. Many indicated that while facilitating Build IT, they learned along with the participants, increasing their interest in and confidence with technology and computer science. Seven of the nine participants pursued further STEM or computer science learning or modified their career paths to include more of a STEM or computer science focus. Through interviews, we explored what aspects of Build IT influenced these facilitators' interest and confidence in STEM and when relevant their pursuit of technology and computer science education and careers.
The NASA computer science research program plan
NASA Technical Reports Server (NTRS)
1983-01-01
A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.
On teaching computer ethics within a computer science department.
Quinn, Michael J
2006-04-01
The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.
Hobbes on natural philosophy as "True Physics" and mixed mathematics.
Adams, Marcus P
2016-04-01
In this paper, I offer an alternative account of the relationship of Hobbesian geometry to natural philosophy by arguing that mixed mathematics provided Hobbes with a model for thinking about it. In mixed mathematics, one may borrow causal principles from one science and use them in another science without there being a deductive relationship between those two sciences. Natural philosophy for Hobbes is mixed because an explanation may combine observations from experience (the 'that') with causal principles from geometry (the 'why'). My argument shows that Hobbesian natural philosophy relies upon suppositions that bodies plausibly behave according to these borrowed causal principles from geometry, acknowledging that bodies in the world may not actually behave this way. First, I consider Hobbes's relation to Aristotelian mixed mathematics and to Isaac Barrow's broadening of mixed mathematics in Mathematical Lectures (1683). I show that for Hobbes maker's knowledge from geometry provides the 'why' in mixed-mathematical explanations. Next, I examine two explanations from De corpore Part IV: (1) the explanation of sense in De corpore 25.1-2; and (2) the explanation of the swelling of parts of the body when they become warm in De corpore 27.3. In both explanations, I show Hobbes borrowing and citing geometrical principles and mixing these principles with appeals to experience. Copyright © 2015 Elsevier Ltd. All rights reserved.
Computational Science News | Computational Science | NREL
2018-02-28
NREL Launches New Website for High-Performance Computing System Users: the National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) system.
Empirical Determination of Competence Areas to Computer Science Education
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter; Seitz, Cornelia
2014-01-01
The authors discuss empirically determined competence areas to K-12 computer science education, emphasizing the cognitive level of competence. The results of a questionnaire with 120 professors of computer science serve as a database. By using multi-dimensional scaling and cluster analysis, four competence areas to computer science education…
Factors Influencing Exemplary Science Teachers' Levels of Computer Use
ERIC Educational Resources Information Center
Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen
2011-01-01
The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…
Preparing Future Secondary Computer Science Educators
ERIC Educational Resources Information Center
Ajwa, Iyad
2007-01-01
Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…
Life Science Standards and Curriculum Development for 9-12.
ERIC Educational Resources Information Center
Speece, Susan P.; Andersen, Hans O.
1996-01-01
Proposes a design for a life science curriculum following the National Research Council National Science Education Standards. The overarching theme is that science as inquiry should be recognized as a basic and controlling principle in the ultimate organization and experiences in students' science education. Six-week units include Matter, Energy,…
OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing
NASA Astrophysics Data System (ADS)
Strayer, Michael
2005-01-01
Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines.
There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a base line employing Common Component Architectures and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20 year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have peta-flop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. 
We must not lose sight of our overarching goal—that of scientific discovery. Science does not stand still and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.
Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools
NASA Astrophysics Data System (ADS)
Boe, Bryce A.
There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
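The kind of static check a framework like Hairball enables can be sketched as follows; the Script/Block model and the hat-block check below are illustrative stand-ins for this abstract, not Hairball's actual API:

```python
# Minimal sketch of a block-language static check, in the spirit of
# Hairball-style analysis of Scratch projects. The data model and the
# "dead script" check are hypothetical illustrations.
from dataclasses import dataclass, field


@dataclass
class Block:
    opcode: str                      # e.g. "whenGreenFlag", "say"


@dataclass
class Script:
    blocks: list = field(default_factory=list)


def starts_with_hat(script: Script) -> bool:
    """A runnable Scratch script must begin with an event ('hat') block."""
    hats = {"whenGreenFlag", "whenKeyPressed", "whenIReceive"}
    return bool(script.blocks) and script.blocks[0].opcode in hats


def dead_scripts(project: list) -> list:
    """Flag scripts that can never run because they lack a hat block."""
    return [s for s in project if not starts_with_hat(s)]


project = [
    Script([Block("whenGreenFlag"), Block("say")]),
    Script([Block("moveSteps")]),    # no hat block: unreachable
]
print(len(dead_scripts(project)))    # 1
```

A check like this runs without executing the student's program, which is what makes rapid, large-scale assessment of many submissions feasible.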
ETHICS AND JUSTICE IN ENVIRONMENTAL SCIENCE AND ENGINEERING
Science and engineering are built on trust. C.P. Snow's famous quote, "the only ethical principle which has made science possible is that the truth shall be told all the time" underscores the importance of honesty in science. Environmental scientists must do work that is useful...
ERIC Educational Resources Information Center
School Science Review, 1989
1989-01-01
Twenty-two activities are presented. Topics include: acid rain, microcomputers, fish farming, school-industry research projects, enzymes, equilibrium, assessment, science equipment, logic, Archimedes principle, electronics, optics, and statistics. (CW)
A Principle for Network Science
2011-02-01
…we consider is the sound of splashing water from a leaky faucet. This sequence of water drops can set your teeth on edge and leads to tossing and… intermittent sequence of water drops from a leaky faucet is described by a Lévy stable distribution that is an asymptotically inverse power-law with index… universality of physics: the conservation of energy, symmetry principles, and the laws of thermodynamics have no analogs in the soft sciences. This
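The heavy-tailed (asymptotically inverse power-law) waiting times described above can be illustrated with a short sampling sketch; the tail index used here is a hypothetical value, not one taken from the report:

```python
# Illustrative sketch only: inter-drop intervals drawn from a heavy-tailed
# distribution whose survival function is an inverse power law, in contrast
# to the exponential waiting times of a Poisson process.
import random


def pareto_interval(alpha: float, xm: float = 1.0) -> float:
    """Inverse-CDF sample: P(X > x) = (xm / x)**alpha for x >= xm."""
    u = 1.0 - random.random()        # u in (0, 1]
    return xm / (u ** (1.0 / alpha))


random.seed(0)
alpha = 1.5                          # hypothetical tail index
intervals = [pareto_interval(alpha) for _ in range(100_000)]

# Heavy tail: a non-negligible fraction of intervals far exceeds the minimum,
# which would be vanishingly rare for exponential waiting times.
tail_fraction = sum(t > 10.0 for t in intervals) / len(intervals)
print(round(tail_fraction, 3))       # close to (1/10)**1.5 ≈ 0.032
```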
Principles of Instruction: Research-Based Strategies That All Teachers Should Know
ERIC Educational Resources Information Center
Rosenshine, Barak
2012-01-01
This article presents 10 research-based principles of instruction, along with suggestions for classroom practice. These principles come from three sources: (a) research in cognitive science, (b) research on master teachers, and (c) research on cognitive supports. Each is briefly explained in this article. Even though these are three very different…
ERIC Educational Resources Information Center
Magnavita, Jeffrey J.
2006-01-01
The search for the principles of unified psychotherapy is an important stage in the advancement of the field. Converging evidence from various streams of clinical science allows the identification of some of the major domains of human functioning, adaptation, and dysfunction. These principles, supported by animal modeling, neuroscience, and…
Morrison, Geoffrey Stewart
2014-05-01
In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four-and-a-half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle. © 2013.
Charlton, Bruce G
2008-01-01
Crick and Watson gave complementary advice to the aspiring scientist based on the insight that to do your best work you need to make your greatest possible effort. Crick made the positive suggestion to work on the subject which most deeply interests you, the thing about which you spontaneously gossip - Crick termed this 'the gossip test'. Watson made the negative suggestion of avoiding topics and activities that bore you - which I have termed 'the boredom principle'. This is good advice because science is tough and the easy things have already been done. Solving the harder problems that remain requires a lot of effort. But in modern biomedical science individual effort does not necessarily correlate with career success as measured by salary, status, job security, etc. This is because Crick and Watson are talking about revolutionary science - using Thomas Kuhn's distinction between paradigm-shifting 'revolutionary' science and incremental 'normal' science. There are two main problems with pursuing a career in revolutionary science. The first is that revolutionary science is intrinsically riskier than normal science, the second that even revolutionary success in a scientific backwater may be less career-enhancing than mundane work in a trendy field. So, if you pick your scientific problem using the gossip test and the boredom principle, you might also be committing career suicide. This may explain why so few people follow Crick and Watson's advice. The best hope for future biomedical science is that it will evolve towards a greater convergence between individual effort and career success.
Increasing Diversity in Computer Science: Acknowledging, yet Moving Beyond, Gender
NASA Astrophysics Data System (ADS)
Larsen, Elizabeth A.; Stubbs, Margaret L.
Lack of diversity within the computer science field has, thus far, been examined most fully through the lens of gender. This article is based on a follow-on to Margolis and Fisher's (2002) study and includes interviews with 33 Carnegie Mellon University students from the undergraduate senior class of 2002 in the School of Computer Science. We found evidence of similarities among the perceptions of these women and men on definitions of computer science, explanations for the notoriously low proportion of women in the field, characterizations of a typical computer science student, impressions of recent curricular changes, a sense of the atmosphere/culture in the program, views of the Women@SCS campus organization, and suggestions for attracting and retaining well-rounded students in computer science. We conclude that efforts to increase diversity in the computer science field will benefit from a more broad-based approach that considers, but is not limited to, notions of gender difference.
Democratizing Computer Science
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Ryoo, Jean J.
2015-01-01
Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…
ERIC Educational Resources Information Center
Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu
2013-01-01
With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…
Charting a Course for Precision Oncology
Kusnezov, Dimitri; Paragas, Jason
2017-02-09
Here, the fields of science have undergone dramatic reorganizations as they have come to terms with the realities of the growing complexities of their problem set, the costs, and the breadth of skills needed to make major progress. A field such as particle physics transformed from principal investigator-driven research supported by an electron synchrotron in the basement of your physics building in the 1950s, to regional centers when costs became prohibitive to refresh technology everywhere, driving larger teams of scientists to cooperate in the 1970s, to international centers where multinational teams work together to achieve progress. The 2013 Nobel Prize-winning discovery of the Higgs boson would have been unlikely without such team science. Other fields such as the computational sciences are well on their way through such a transformation. Today, we see precision medicine as a field that will need to come to terms with new organizational principles in order to make major progress, including everyone from individual medical researchers to pharma. Interestingly, the Cancer Moonshot has helped move thinking in that direction for part of the community and now the initiative has been transformed into law.
Role of oxygen diffusion at Ni/Cr2O3 interface in intergranular oxidation of Ni-Cr alloy
NASA Astrophysics Data System (ADS)
Medasani, Bharat; Sushko, Maria; Schreiber, Daniel; Rosso, Kevin; Bruemmer, Stephen
Certain Ni-Cr alloys used in nuclear systems experience intergranular oxidation and stress corrosion cracking when exposed to high-temperature water leading to their degradation and unexpected failure. To develop a mechanistic understanding of grain boundary oxidation processes, we proposed a mesoscale metal alloy oxidation model that combines quantum Density Functional Theory (DFT) with mesoscopic Poisson-Nernst-Planck/classical DFT. This framework encompasses the chemical specificity of elementary diffusion processes and mesoscale reactive dynamics, and allows modeling oxidation processes on experimentally relevant length scales from first principles. As a proof of concept, a preliminary model was previously employed that limited oxygen diffusion pathways to those through the oxide phase and did not allow oxygen diffusion in the alloy or across oxide/alloy interfaces. In this work, we expand the model to include oxygen diffusion pathways along Ni/Cr2O3 interfaces and demonstrate the increasing importance of such pathways for intergranular oxidation of Ni-Cr alloys with high Cr content. This work is supported by the U.S. Dept. of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division. Simulations are performed using PNNL Institutional Computing facility.
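The effect of adding a fast interfacial diffusion pathway can be illustrated with a toy one-dimensional sketch; the diffusivities, grid, and explicit finite-difference scheme below are illustrative assumptions, not the paper's mesoscale PNP/DFT framework:

```python
# Toy 1D diffusion sketch (explicit FTCS finite differences) contrasting
# slow oxygen transport through the oxide with a faster interfacial
# pathway. All parameter values are illustrative placeholders.
import numpy as np


def diffuse(D, nx=100, nt=2000, dx=1.0, dt=0.1, c0=1.0):
    """Fixed-concentration oxygen source at x=0; explicit FTCS update."""
    assert D * dt / dx**2 <= 0.5     # FTCS stability criterion
    c = np.zeros(nx)
    c[0] = c0
    for _ in range(nt):
        c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[0] = c0                    # source held constant
    return c


bulk = diffuse(D=0.5)                # slow transport through the oxide
interface = diffuse(D=4.0)           # faster interfacial pathway

# Penetration depth: first grid point below 1% of the source concentration.
depth = lambda c: int(np.argmax(c < 0.01))
print(depth(interface) > depth(bulk))   # True
```

The qualitative point matches the abstract: opening an interfacial pathway with higher effective diffusivity carries oxygen much deeper into the material over the same time.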
NASA Astrophysics Data System (ADS)
Helbing, D.; Bishop, S.; Conte, R.; Lukowicz, P.; McCarthy, J. B.
2012-11-01
We have built particle accelerators to understand the forces that make up our physical world. Yet, we do not understand the principles underlying our strongly connected, techno-socio-economic systems. We have enabled ubiquitous Internet connectivity and instant, global information access. Yet we do not understand how it impacts our behavior and the evolution of society. To fill the knowledge gaps and keep up with the fast pace at which our world is changing, a Knowledge Accelerator must urgently be created. The financial crisis, international wars, global terror, the spreading of diseases and cyber-crime as well as demographic, technological and environmental change demonstrate that humanity is facing serious challenges. These problems cannot be solved within the traditional paradigms. Moving our attention from a component-oriented view of the world to an interaction-oriented view will allow us to understand the complex systems we have created and the emergent collective phenomena characterising them. This paradigm shift will enable new solutions to long-standing problems, very much as the shift from a geocentric to a heliocentric worldview has facilitated modern physics and the ability to launch satellites. The FuturICT flagship project will develop new science and technology to manage our future in a complex, strongly connected world. For this, it will combine the power of information and communication technology (ICT) with knowledge from the social and complexity sciences. ICT will provide the data to boost the social sciences into a new era. Complexity science will shed new light on the emergent phenomena in socially interactive systems, and the social sciences will provide a better understanding of the opportunities and risks of strongly networked systems, in particular future ICT systems. Hence, the envisaged FuturICT flagship will create new methods and instruments to tackle the challenges of the 21st century. 
FuturICT could indeed become one of the most important scientific endeavours ever, by revealing the principles that make socially interactive systems work well, by inspiring the creation of new platforms to explore our possible futures, and by initiating an era of social and socio-inspired innovations.
Computer Science and the Liberal Arts
ERIC Educational Resources Information Center
Shannon, Christine
2010-01-01
Computer science and the liberal arts have much to offer each other. Yet liberal arts colleges, in particular, have been slow to recognize the opportunity that the study of computer science provides for achieving the goals of a liberal education. After the precipitous drop in computer science enrollments during the first decade of this century,…
Marrying Content and Process in Computer Science Education
ERIC Educational Resources Information Center
Zendler, A.; Spannagel, C.; Klaudt, D.
2011-01-01
Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…
ERIC Educational Resources Information Center
Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.
2016-01-01
Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…
Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University
ERIC Educational Resources Information Center
Plane, Jandelyn
2010-01-01
This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…
Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.
ERIC Educational Resources Information Center
Turner, Judith Axler
1987-01-01
Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)
African-American males in computer science---Examining the pipeline for clogs
NASA Astrophysics Data System (ADS)
Stone, Daryl Bryant
The literature on African-American males (AAM) begins with a statement to the effect that "Today young Black men are more likely to be killed or sent to prison than to graduate from college." Why are the numbers of African-American male college graduates decreasing? Why are those enrolled in college not majoring in the science, technology, engineering, and mathematics (STEM) disciplines? This research explored why African-American males are not filling the well-recognized industry need for Computer Scientist/Technologists by choosing college tracks to these careers. The literature on STEM disciplines focuses largely on women in STEM, as opposed to minorities, and within minorities, there is a noticeable research gap in addressing the needs and opportunities available to African-American males. The primary goal of this study was therefore to examine the computer science "pipeline" from the African-American male perspective. The method included distributing a "Computer Science Degree Self-Efficacy Scale" to five groups of African-American male students, to include: (1) fourth graders, (2) eighth graders, (3) eleventh graders, (4) underclass undergraduate computer science majors, and (5) upperclass undergraduate computer science majors. In addition to a 30-question self-efficacy test, subjects from each group were asked to participate in a group discussion about "African-American males in computer science." The audio record of each group meeting provides qualitative data for the study. The hypotheses include the following: (1) There is no significant difference in "Computer Science Degree" self-efficacy between fourth and eighth graders. (2) There is no significant difference in "Computer Science Degree" self-efficacy between eighth and eleventh graders. (3) There is no significant difference in "Computer Science Degree" self-efficacy between eleventh graders and lower-level computer science majors.
(4) There is no significant difference in "Computer Science Degree" self-efficacy between lower-level computer science majors and upper-level computer science majors. (5) There is no significant difference in "Computer Science Degree" self-efficacy between each of the five groups of students. Finally, the researcher selected African-American male students attending six primary schools, including the predominately African-American elementary, middle and high school that the researcher attended during his own academic career. Additionally, a racially mixed elementary, middle and high school was selected from the same county in Maryland. Bowie State University provided both the underclass and upperclass computer science majors surveyed in this study. Of the five hypotheses, the sample provided enough evidence to support the claim that there are significant differences in the "Computer Science Degree" self-efficacy between each of the five groups of students. ANOVA analysis by question and total self-efficacy scores provided more results of statistical significance. Additionally, factor analysis and review of the qualitative data provide more insightful results. Overall, the data suggest 'a clog' may exist at the middle school level and students attending racially mixed schools were more confident in their computer, math and science skills. African-American males admit to spending lots of time on social networking websites and emailing, but are 'dis-aware' of the skills and knowledge needed to study in the computing disciplines. The majority of the subjects knew few, if any, AAMs in the 'computing discipline pipeline'. The collegian African-American males, in this study, agree that computer programming is a difficult area and serves as a 'major clog in the pipeline'.
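The one-way ANOVA comparison across the five student groups can be sketched as follows; the self-efficacy scores below are synthetic placeholders, not the study's data:

```python
# Sketch of a one-way ANOVA F statistic comparing self-efficacy scores
# across five student groups. All scores are invented for illustration.
from statistics import mean

groups = {
    "grade4":   [78, 82, 75, 80, 77],
    "grade8":   [70, 65, 72, 68, 66],   # hypothetical middle-school dip
    "grade11":  [74, 71, 76, 73, 75],
    "lower_cs": [85, 88, 84, 86, 87],
    "upper_cs": [90, 92, 89, 91, 88],
}


def f_statistic(samples):
    """Ratio of between-group to within-group mean squares."""
    k = len(samples)                       # number of groups
    n = sum(len(g) for g in samples)       # total observations
    grand = mean(x for g in samples for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in samples)
    ss_within = sum((x - mean(g)) ** 2 for g in samples for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))


F = f_statistic(list(groups.values()))
print(F > 4.0)    # True: far above the 5% critical value F(4, 20) ≈ 2.87
```

A large F means the between-group spread dwarfs the within-group spread, which is the statistical basis for rejecting the fifth null hypothesis above.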
A Bridge for Accelerating Materials by Design
Sumpter, Bobby G.; Vasudevan, Rama K.; Potok, Thomas E.; ...
2015-11-25
Recent technical advances in the area of nanoscale imaging, spectroscopy, and scattering/diffraction have led to unprecedented capabilities for investigating materials structural, dynamical and functional characteristics. In addition, recent advances in computational algorithms and computer capacities that are orders of magnitude larger/faster have enabled large-scale simulations of materials properties starting with nothing but the identity of the atomic species and the basic principles of quantum- and statistical-mechanics and thermodynamics. Along with these advances, an explosion of high-resolution data has emerged. This confluence of capabilities and rise of big data offer grand opportunities for advancing materials sciences but also introduce several challenges. In this editorial we identify challenges impeding progress towards advancing materials by design (e.g., the design/discovery of materials with improved properties/performance), possible solutions, and provide examples of scientific issues that can be addressed by using a tightly integrated approach where theory and experiments are linked through big-deep data.
[Computerization and robotics in medical practice].
Dervaderics, J
1997-10-26
The article outlines the principles used in computing, including non-electrical and analog computers and artificial intelligence, and cites examples of each. The principles and medical utilization of virtual reality are also mentioned. Discussed are: surgical planning, image-guided surgery, robotic surgery, telepresence and telesurgery, and telemedicine implemented partially via the Internet.
Some Principles for the Human Use of Computers in Education.
ERIC Educational Resources Information Center
Dwyer, Thomas A.
Several principles for the effective use of computers in education are identified as a result of experiences with Project Solo, an experiment in education patterned on the dual-solo example of flight instruction in allowing the student to eventually exert more influence on his learning than his instructor. First, the essential social character of…
Girls in computer science: A female only introduction class in high school
NASA Astrophysics Data System (ADS)
Drobnis, Ann W.
This study examined the impact of an all girls' classroom environment in a high school introductory computer science class on the student's attitudes towards computer science and their thoughts on future involvement with computer science. It was determined that an all girls' introductory class could impact the declining female enrollment and female students' efficacy towards computer science. This research was conducted in a summer school program through a regional magnet school for science and technology which these students attend during the school year. Three different groupings of students were examined for the research: female students in an all girls' class, female students in mixed-gender classes and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to obtain an understanding of the students' thoughts, preconceptions, attitude, knowledge of computer science, and future intentions around computer science, both in education and career. Students in all three groups were administered the ACCS prior to taking the class and upon completion of the class. In addition, students in the all girls' class wrote in a journal throughout the course, and some of those students were also interviewed upon completion of the course. The data was analyzed using quantitative and qualitative techniques. While there were no major differences found in the quantitative data, it was determined that girls in the all girls' class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.
EDITORIAL: Computational materials science
NASA Astrophysics Data System (ADS)
Kahl, Gerhard; Kresse, Georg
2011-10-01
Special issue in honour of Jürgen Hafner On 30 September 2010, Jürgen Hafner, one of the most prominent and influential members within the solid state community, retired. His remarkably broad scientific oeuvre has made him one of the founding fathers of modern computational materials science: more than 600 scientific publications, numerous contributions to books, and a highly cited monograph, which has become a standard reference in the theory of metals, witness not only the remarkable productivity of Jürgen Hafner but also his impact in theoretical solid state physics. In an effort to duly acknowledge Jürgen Hafner's lasting impact in this field, a Festsymposium was held on 27-29 September 2010 at the Universität Wien. The organizers of this symposium (and authors of this editorial) are proud to say that a large number of highly renowned scientists in theoretical condensed matter theory—co-workers, friends and students—accepted the invitation to this celebration of Hafner's jubilee. Some of these speakers also followed our invitation to submit their contribution to this Festschrift, published in Journal of Physics: Condensed Matter, a journal which Jürgen Hafner served in 2000-2003 and 2003-2006 as a member of the Advisory Editorial Board and member of the Executive Board, respectively. In the subsequent article, Volker Heine, friend and co-worker of Jürgen Hafner over many decades, gives an account of Hafner's impact in the field of theoretical condensed matter physics. 
Computational materials science contents:
- Theoretical study of structural, mechanical and spectroscopic properties of boehmite (γ-AlOOH) (D Tunega, H Pašalić, M H Gerzabek and H Lischka)
- Ethylene epoxidation catalyzed by chlorine-promoted silver oxide (M O Ozbek, I Onal and R A Van Santen)
- First-principles study of Cu2ZnSnS4 and the related band offsets for photovoltaic applications (A Nagoya, R Asahi and G Kresse)
- Renormalization group study of random quantum magnets (István A Kovács and Ferenc Iglói)
- Ordering effects in disordered systems: the Au-Si system (N Jakse, T L T Nguyen and A Pasturel)
- On the stability of Archimedean tilings formed by patchy particles (Moritz Antlanger, Günther Doppelbauer and Gerhard Kahl)
Designing Serious Game Interventions for Individuals with Autism.
Whyte, Elisabeth M; Smyth, Joshua M; Scherf, K Suzanne
2015-12-01
The design of "Serious games" that use game components (e.g., storyline, long-term goals, rewards) to create engaging learning experiences has increased in recent years. We examine the core principles of serious game design and the current use of these principles in computer-based interventions for individuals with autism. Participants who undergo these computer-based interventions often show little evidence of the ability to generalize such learning to novel, everyday social communicative interactions. This lack of generalized learning may result, in part, from the limited use of fundamental elements of serious game design that are known to maximize learning. We suggest that future computer-based interventions should consider the full range of serious game design principles that promote generalization of learning.
Greenhalgh, Trisha; Jackson, Claire; Shaw, Sara; Janamian, Tina
2016-06-01
Co-creation-collaborative knowledge generation by academics working alongside other stakeholders-is an increasingly popular approach to aligning research and service development. It has potential for "moving beyond the ivory towers" to deliver significant societal impact via dynamic, locally adaptive community-academic partnerships. Principles of successful co-creation include a systems perspective, a creative approach to research focused on improving human experience, and careful attention to governance and process. If these principles are not followed, co-creation efforts may fail. Co-creation-collaborative knowledge generation by academics working alongside other stakeholders-reflects a "Mode 2" relationship (knowledge production rather than knowledge translation) between universities and society. Co-creation is widely believed to increase research impact. We undertook a narrative review of different models of co-creation relevant to community-based health services. We contrasted their diverse disciplinary roots and highlighted their common philosophical assumptions, principles of success, and explanations for failures. We applied these to an empirical case study of a community-based research-service partnership led by the Centre of Research Excellence in Quality and Safety in Integrated Primary-Secondary Care at the University of Queensland, Australia. Co-creation emerged independently in several fields, including business studies ("value co-creation"), design science ("experience-based co-design"), computer science ("technology co-design"), and community development ("participatory research"). These diverse models share some common features, which were also evident in the case study. 
Key success principles included (1) a systems perspective (assuming emergence, local adaptation, and nonlinearity); (2) the framing of research as a creative enterprise with human experience at its core; and (3) an emphasis on process (the framing of the program, the nature of relationships, and governance and facilitation arrangements, especially the style of leadership and how conflict is managed). In both the literature review and the case study, co-creation "failures" could often be tracked back to abandoning (or never adopting) these principles. All co-creation models made strong claims for significant and sustainable societal impacts as a result of the adaptive and developmental research process; these were illustrated in the case study. Co-creation models have high potential for societal impact but depend critically on key success principles. To capture the nonlinear chains of causation in the co-creation pathway, impact metrics must reflect the dynamic nature and complex interdependencies of health research systems and address processes as well as outcomes. © 2016 Milbank Memorial Fund.
Jackson, Claire; Shaw, Sara; Janamian, Tina
2016-01-01
Policy Points: Co‐creation—collaborative knowledge generation by academics working alongside other stakeholders—is an increasingly popular approach to aligning research and service development. It has potential for “moving beyond the ivory towers” to deliver significant societal impact via dynamic, locally adaptive community‐academic partnerships. Principles of successful co‐creation include a systems perspective, a creative approach to research focused on improving human experience, and careful attention to governance and process. If these principles are not followed, co‐creation efforts may fail. Context Co‐creation—collaborative knowledge generation by academics working alongside other stakeholders—reflects a “Mode 2” relationship (knowledge production rather than knowledge translation) between universities and society. Co‐creation is widely believed to increase research impact. Methods We undertook a narrative review of different models of co‐creation relevant to community‐based health services. We contrasted their diverse disciplinary roots and highlighted their common philosophical assumptions, principles of success, and explanations for failures. We applied these to an empirical case study of a community‐based research‐service partnership led by the Centre of Research Excellence in Quality and Safety in Integrated Primary‐Secondary Care at the University of Queensland, Australia. Findings Co‐creation emerged independently in several fields, including business studies (“value co‐creation”), design science (“experience‐based co‐design”), computer science (“technology co‐design”), and community development (“participatory research”). These diverse models share some common features, which were also evident in the case study. 
Key success principles included (1) a systems perspective (assuming emergence, local adaptation, and nonlinearity); (2) the framing of research as a creative enterprise with human experience at its core; and (3) an emphasis on process (the framing of the program, the nature of relationships, and governance and facilitation arrangements, especially the style of leadership and how conflict is managed). In both the literature review and the case study, co‐creation “failures” could often be tracked back to abandoning (or never adopting) these principles. All co‐creation models made strong claims for significant and sustainable societal impacts as a result of the adaptive and developmental research process; these were illustrated in the case study. Conclusions Co‐creation models have high potential for societal impact but depend critically on key success principles. To capture the nonlinear chains of causation in the co‐creation pathway, impact metrics must reflect the dynamic nature and complex interdependencies of health research systems and address processes as well as outcomes. PMID:27265562
Berger, Robert F
2018-02-09
In the current decade, perovskite solar cell research has emerged as a remarkably active, promising, and rapidly developing field. Alongside breakthroughs in synthesis and device engineering, halide perovskite photovoltaic materials have been the subject of predictive and explanatory computational work. In this Minireview, we focus on a subset of this computation: density functional theory (DFT)-based work highlighting the ways in which the electronic structure and band gap of this class of materials can be tuned via changes in atomic structure. We distill this body of computational literature into a set of underlying design principles for the band gap engineering of these materials, and rationalize these principles from the viewpoint of band-edge orbital character. We hope that this perspective provides guidance and insight toward the rational design and continued improvement of perovskite photovoltaics. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Science Learning and Instruction: Taking Advantage of Technology to Promote Knowledge Integration
ERIC Educational Resources Information Center
Linn, Marcia C.; Eylon, Bat-Sheva
2011-01-01
"Science Learning and Instruction" describes advances in understanding the nature of science learning and their implications for the design of science instruction. The authors show how design patterns, design principles, and professional development opportunities coalesce to create and sustain effective instruction in each primary scientific…
Bringing computational science to the public.
McDonagh, James L; Barker, Daniel; Alderson, Rosanna G
2016-01-01
The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.
Nursing Needs Big Data and Big Data Needs Nursing.
Brennan, Patricia Flatley; Bakken, Suzanne
2015-09-01
Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives, and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.
Computer Science and Telecommunications Board summary of activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blumenthal, M.S.
1992-03-27
The Computer Science and Telecommunications Board (CSTB) considers technical and policy issues pertaining to computer science, telecommunications, and associated technologies. CSTB actively disseminates the results of its completed projects to those in a position to help implement their recommendations or otherwise use their insights. It provides a forum for the exchange of information on computer science, computing technology, and telecommunications. This report discusses the major accomplishments of CSTB.
Hispanic women overcoming deterrents to computer science: A phenomenological study
NASA Astrophysics Data System (ADS)
Herling, Lourdes
The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage of qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline, with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines; what attracts women and minorities to computer science; and the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender, as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determined whether being subjected to multiple marginalizations---female and Hispanic---played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field but also their persistence. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. 
The aptitudes participants commonly believed are needed for success in computer science are the twenty-first-century skills of problem solving, creativity, and critical thinking. While not all the participants had experience with computers or programming prior to attending college, such experience played a role in the self-confidence of those who did.
Research in applied mathematics, numerical analysis, and computer science
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.
Computational fluid mechanics utilizing the variational principle of modeling damping seals
NASA Technical Reports Server (NTRS)
Abernathy, J. M.
1986-01-01
A computational fluid dynamics code for application to traditional incompressible flow problems has been developed. The method is actually a slight compressibility approach which takes advantage of the bulk modulus and finite sound speed of all real fluids. The finite element numerical analog uses a dynamic differencing scheme based, in part, on a variational principle for computational fluid dynamics. The code was developed in order to study the feasibility of damping seals for high speed turbomachinery. Preliminary seal analyses have been performed.
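The slight-compressibility idea described above can be sketched in a generic textbook form (this is an illustration of the general technique, not necessarily the exact formulation used in the code): the incompressibility constraint is relaxed into a pressure evolution equation governed by the bulk modulus K, which restores the finite sound speed c of real fluids.

```latex
% Incompressible constraint relaxed via the bulk modulus K:
%   \nabla\cdot\mathbf{u} = 0  \;\longrightarrow\;
\frac{1}{K}\,\frac{\partial p}{\partial t} + \nabla\cdot\mathbf{u} = 0,
\qquad c = \sqrt{K/\rho}
% As K \to \infty (c \to \infty), the incompressible limit is recovered.
```

Because the pressure now evolves in time rather than being determined by a global elliptic solve, the scheme can be marched explicitly, which is what makes it attractive for transient turbomachinery problems like the damping-seal analyses described here.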
Politics of prevention: The emergence of prevention science.
Roumeliotis, Filip
2015-08-01
This article critically examines the political dimension of prevention science by asking how it constructs the problems for which prevention is seen as the solution and how it enables the monitoring and control of these problems. It also seeks to examine how prevention science has established a sphere for legitimate political deliberation and which kinds of statements are accepted as legitimate within this sphere. The material consists of 14 publications describing and discussing the goals, concepts, promises and problems of prevention science. The analysis covers the period from 1993 to 2012. The analysis shows that prevention science has established a narrow definition of "prevention", including only interventions aimed at the reduction of risks for clinical disorders. In publications from the U.S. National Institute on Drug Abuse, the principles of prevention science have enabled a commitment to a zero-tolerance policy on drugs. The drug-using subject has been constructed as a rational choice actor lacking in skills in exerting self-control in regard to drug use. Prevention science has also enabled the monitoring and control of expertise, risk groups and individuals through specific forms of data gathering. Through the juxtaposition of the concepts of "objectivity" and "morality", prevention science has constituted a principle of delineation, disqualifying statements not adhering to the principles of prevention science from the political field, rendering ethical and conflictual dimensions of problem representations invisible. The valorisation of scientific accounts of drugs has acted to naturalise specific political ideals. It simultaneously marginalises the public from the public policy process, giving precedence to experts who are able to provide information that policy-makers are demanding. Alternative accounts, such as those based on marginalisation, poverty or discrimination, are silenced within prevention science. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Huffman, L. T.; Blythe, D.; Dahlman, L. E.; Fischbein, S.; Johnson, K.; Kontar, Y.; Rack, F. R.; Kulhanek, D. K.; Pennycook, J.; Reed, J.; Youngman, B.; Reeves, M.; Thomas, R.
2010-12-01
The challenges of communicating climate change science to non-technical audiences present a daunting task, but one that is recognized in the science community as urgent and essential. ANDRILL's (ANtarctic geological DRILLing) international network of scientists, engineers, technicians and educators works together to convey a deeper understanding of current geoscience research as well as the process of science to non-technical audiences. One roadblock for educators who recognize the need to teach climate change has been the lack of a comprehensive, integrated set of resources and activities that are related to the National Science Education Standards. Pieces of the climate change puzzle can be found in the excellent work of the groups of science and education professionals who wrote the Essential Principles of Ocean Sciences, Climate Literacy: The Essential Principles of Climate Science, Earth Science Literacy Principles: The Big Ideas and Supporting Concepts of Earth Science, and the Essential Principles and Fundamental Concepts for Atmospheric Science Literacy, but teachers have precious little time to search out the climate change goals and objectives in those frameworks and then find the resources to teach them. Through NOAA funding, ANDRILL has created a new framework, The Environmental Literacy Framework with a Focus on Climate Change (ELF), drawing on the works of the aforementioned groups, and promoting an Earth Systems approach to teaching climate change through five units: Atmosphere, Biosphere, Geosphere, Hydrosphere/Cryosphere, and Energy as the driver of interactions within and between the “spheres.” Each key concept in the framework has a hands-on inquiry activity and matching NOAA resources for teaching the objectives. In its present form, the ELF provides a ‘road map’ for teaching climate change and a set of resources intended to continue to evolve over time.
ERIC Educational Resources Information Center
Howitt, Christine
2011-01-01
"Planting the Seeds of Science" is a new early childhood science resource developed through a collaboration between science/engineering academics, early childhood teacher educators and early childhood pre-service teachers, with funding from the Australian Learning and Teaching Council. Based on best practice early childhood principles,…
ERIC Educational Resources Information Center
Williams, Kerry Curtiss; Veomett, George E.
2006-01-01
Teaching science means doing science and involves three elements: knowing content, knowing children, and teachers knowing themselves as teachers and learners. The authors describe principles and requirements that reflect National Science Education Standards for the active learning of science. They identify key ingredients for primary students and…
ERIC Educational Resources Information Center
Roff, Lori; Stringer, Lola
The food science course developed in Missouri combines basic scientific and mathematics principles in a hands-on instructional format as a part of the family and consumer sciences education curriculum. Throughout the course, students conduct controlled experiments and use scientific laboratory techniques and information to explore the biological…
Science-Driven Computing: NERSC's Plan for 2006-2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Horst D.; Kramer, William T.C.; Bailey, David H.
NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.
Design and assessment of an interactive physics tutoring environment
NASA Astrophysics Data System (ADS)
Scott, Lisa Ann
2001-07-01
The application of scientific principles is an extremely important skill taught in undergraduate introductory science courses, yet many students emerge from such courses unable to reliably apply the scientific principles they have ostensibly learned. In an attempt to address this problem, the knowledge and thought processes needed to apply an important principle in introductory physics (Newton's law) were carefully analyzed. Reliable performance requires not only declarative knowledge but also corresponding procedural knowledge and the basic cognitive functions of deciding, implementing and assessing. Computer programs called guided-practice PALs (Personal Assistants for Learning) were developed to teach explicitly the knowledge and thought processes needed to apply Newton's law to solve problems. These programs employ a modified form of Palincsar and Brown's reciprocal-teaching strategy (1984) in which students and computers alternately coach each other, taking turns making decisions, implementing and assessing them. The computer programs make it practically feasible to provide students with individual guidance and feedback ordinarily unavailable in most courses. In a pilot study, the guided-practice PALs were found to be nearly as effective as individual tutoring by expert teachers and significantly more effective than the instruction provided in a well-taught physics course. This guided practice however is not sufficient to ensure that students develop the ability to perform independently. Accordingly, independent-performance PALs were developed which require students to work independently, receiving only the minimal feedback necessary to successfully complete the task. These independent-performance PALs are interspersed with guided-practice PALs to create an instructional environment which facilitates a gradual transition to independent performance. 
In a study designed to assess the efficacy of the PAL instruction, students in the PAL group used only guided-practice PALs and students in the PAL+ group used both guided-practice and independent-performance PALs. The performance of the PAL and PAL+ groups was compared to the performance of a Control group which received traditional instruction. The addition of the independent-performance PALs proved to be at least as effective as the guided-practice PALs alone, and both forms of PAL instruction were significantly more effective than traditional instruction.
ERIC Educational Resources Information Center
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-01-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…
Kato, Hideaki E.; Nureki, Osamu
2013-01-01
Channelrhodopsin (ChR) is a light-gated cation channel derived from green algae. Since the inward flow of cations triggers neuronal firing, neurons expressing ChRs can be optically controlled even within freely moving mammals. Although ChR has been broadly applied to neuroscience research, little is known about its molecular mechanisms. We determined the crystal structure of chimeric ChR at 2.3 Å resolution and revealed its molecular architecture. The integration of structural, electrophysiological, and computational analyses provided insight into the molecular basis for the channel function of ChR, and paved the way for the principled design of ChR variants with novel properties. PMID:27493541
Graduate program in biomedical communication.
Ryan, S M
1969-10-01
The need for harnessing the achievements of communication technology to the burgeoning mass of biomedical information is critical. Recognizing this problem and aware of the short supply of professionals with the skills necessary for the job, a group of leaders from the fields of medicine and communications formed a consortium in 1967 and have developed a twelve month graduate program in biomedical communication. Designed to ground the advanced student in the development and administration of biomedical communication programs, the curriculum focuses on the principles and practice of communication and the development of communications media. Courses are given in the control and communication of information; the printed and spoken word; visual media of photographic arts, television, and motion pictures; computer science; and administration and systems analysis.
Graduate Program in Biomedical Communication *
Ryan, Susan M.
1969-01-01
The need for harnessing the achievements of communication technology to the burgeoning mass of biomedical information is critical. Recognizing this problem and aware of the short supply of professionals with the skills necessary for the job, a group of leaders from the fields of medicine and communications formed a consortium in 1967 and have developed a twelve month graduate program in biomedical communication. Designed to ground the advanced student in the development and administration of biomedical communication programs, the curriculum focuses on the principles and practice of communication and the development of communications media. Courses are given in the control and communication of information; the printed and spoken word; visual media of photographic arts, television, and motion pictures; computer science; and administration and systems analysis. PMID:5823505
NASA Astrophysics Data System (ADS)
Carlowicz, Michael
If you have a computer and a grasp of algebra, you can learn physics. That is one of the messages behind the release of Physics—The Root Science, a new full-text version of a physics textbook available at no cost on the World Wide Web.The interactive textbook is the work of the International Institute of Theoretical and Applied Physics (IITAP) at Iowa State University, which was established in 1993 as a partnership with the United Nations Education, Scientific, and Cultural Organization (UNESCO). With subject matter equivalent to that of a 400-page volume, the text is designed to be completed in one school year. The textbook also will eventually include video clips of experiments and interactive learning modules, as well as links to appropriate cross-references about fundamental principles of physics.
Answering Schrödinger's question: A free-energy formulation
NASA Astrophysics Data System (ADS)
Ramstead, Maxwell James Désormeau; Badcock, Paul Benjamin; Friston, Karl John
2018-03-01
The free-energy principle (FEP) is a formal model of neuronal processes that is widely recognised in neuroscience as a unifying theory of the brain and biobehaviour. More recently, however, it has been extended beyond the brain to explain the dynamics of living systems, and their unique capacity to avoid decay. The aim of this review is to synthesise these advances with a meta-theoretical ontology of biological systems called variational neuroethology, which integrates the FEP with Tinbergen's four research questions to explain biological systems across spatial and temporal scales. We exemplify this framework by applying it to Homo sapiens, before translating variational neuroethology into a systematic research heuristic that supplies the biological, cognitive, and social sciences with a computationally tractable guide to discovery.
A unified approach to computational drug discovery.
Tseng, Chih-Yuan; Tuszynski, Jack
2015-11-01
It has been reported that a slowdown in the development of new medical therapies is affecting clinical outcomes. The FDA has thus initiated the Critical Path Initiative project investigating better approaches. We review the current strategies in drug discovery and focus on the advantages of the maximum entropy method being introduced in this area. The maximum entropy principle is derived from statistical thermodynamics and has been demonstrated to be an inductive inference tool. We propose a unified approach to drug discovery that hinges on robust information processing using entropic inductive inference. Increasingly, applications of maximum entropy in drug discovery employ this unified approach and demonstrate the usefulness of the concept in the area of pharmaceutical sciences. Copyright © 2015. Published by Elsevier Ltd.
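As a concrete illustration of the maximum entropy principle mentioned above (a minimal sketch of the general method, not the authors' drug-discovery pipeline), the maximum-entropy distribution over discrete outcomes subject to a mean constraint takes the exponential-family form p_i ∝ exp(λx_i), and the Lagrange multiplier λ can be found by bisection on the constraint:

```python
import math

def max_entropy_dist(values, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution over `values` with a prescribed mean.

    The solution has the exponential-family form p_i ∝ exp(lam * x_i);
    since the implied mean is monotonically increasing in lam, we can
    solve the constraint by bisection on lam.
    """
    def mean_for(lam):
        w = [math.exp(lam * v) for v in values]
        z = sum(w)  # partition function
        return sum(v * wi for v, wi in zip(values, w)) / z

    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]

# Classic "Brandeis dice" example: a die whose average roll is 4.5.
# The least-biased distribution tilts probability toward larger faces.
probs = max_entropy_dist([1, 2, 3, 4, 5, 6], 4.5)
```

When the constraint matches the unconstrained expectation (mean 3.5 for a fair die), λ = 0 and the uniform distribution is recovered, which is the sense in which maximum entropy adds no information beyond the stated constraints.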
Opportunities for Computational Discovery in Basic Energy Sciences
NASA Astrophysics Data System (ADS)
Pederson, Mark
2011-03-01
An overview of the broad-ranging support of computational physics and computational science within the Department of Energy Office of Science will be provided. Computation as the third branch of physics is supported by all six offices (Advanced Scientific Computing, Basic Energy, Biological and Environmental, Fusion Energy, High-Energy Physics, and Nuclear Physics). Support focuses on hardware, software and applications. Most opportunities within the fields of condensed-matter physics, chemical physics and materials sciences are supported by the Office of Basic Energy Sciences (BES) or through partnerships between BES and the Office for Advanced Scientific Computing. Activities include radiation sciences, catalysis, combustion, materials in extreme environments, energy-storage materials, light-harvesting and photovoltaics, solid-state lighting and superconductivity. A summary of two recent reports by the computational materials and chemical communities on the role of computation during the next decade will be provided. In addition to materials and chemistry challenges specific to energy sciences, issues identified include a focus on the role of the domain scientist in integrating, expanding and sustaining applications-oriented capabilities on evolving high-performance computing platforms and on the role of computation in accelerating the development of innovative technologies.
Research | Computational Science | NREL
NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. Enabling High-Impact Research: NREL's computational science capabilities enable high-impact research. Some recent examples
Is U.S. health care an appropriate system? A strategic perspective from systems science
Janecka, Ivo P
2009-01-01
Context Systems science provides organizational principles supported by biologic findings that can be applied to any organization; any incongruence indicates an incomplete or an already failing system. U.S. health care is commonly referred to as a system that consumes an ever-increasing percentage of the gross domestic product and delivers seemingly diminishing value. Objective To perform a comparative study of U.S. health care with the principles of systems science and, if feasible, propose solutions. Design General systems theory provides the theoretical foundation for this observational research. Main Outcome Measures A degree of compliance of U.S. health care with systems principles and its space-time functional location within the dynamic systems model. Results of comparative analysis U.S. health care is an incomplete system further threatened by the fact that it functions in the zone of chaos within the dynamic systems model. Conclusion Complying with systems science principles and the congruence of pertinent cycles, U.S. health care would likely dramatically improve its value creation for all of society as well as its resiliency and long-term sustainability. Immediate corrective steps could be taken: Prioritize and incentivize health over care; restore fiscal soundness by combining health and life insurance for the benefit of the insured and the payer; rebalance horizontal/providers and vertical/government hierarchies. PMID:19121210
Computer validation in toxicology: historical review for FDA and EPA good laboratory practice.
Brodish, D L
1998-01-01
The application of computer validation principles to Good Laboratory Practice is a fairly recent phenomenon. As automated data collection systems have become more common in toxicology facilities, the U.S. Food and Drug Administration and the U.S. Environmental Protection Agency have begun to focus inspections in this area. This historical review documents the development of regulatory guidance on computer validation in toxicology over the past several decades. An overview of the components of a computer life cycle is presented, including the development of systems descriptions, validation plans, validation testing, system maintenance, SOPs, change control, security considerations, and system retirement. Examples are provided for implementation of computer validation principles on laboratory computer systems in a toxicology facility.
ERIC Educational Resources Information Center
Coles, Mike; Nelms, Rick
1996-01-01
Describes a study that explores the depth and breadth of scientific facts, principles, and procedures which are required in the Advanced General National Vocational Qualifications (GNVQ) science through comparison with GCE Advanced level. The final report takes account of the updated 1996 version of GNVQ science. (DDR)
Earth Science Principles Pertinent to the General Education Programs in Junior High Schools
ERIC Educational Resources Information Center
Henson, Kenneth Tyrone
1970-01-01
Presents the procedures and findings of a study designed to identify principles in astronomy, geology, meteorology, oceanography, and physical geography pertinent to general education programs in junior high schools. (LC)
Integrating the Principles of Toxicology into a Chemistry Curriculum
Designing safer products, processes and materials requires a commitment to engaging a transdisciplinary, systems approach utilizing the principles of chemistry, toxicology, environmental sciences and other allied disciplines. Chemistry and toxicology are inherently complementary ...
High-Performance Java Codes for Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)
2001-01-01
The computational science community is reluctant to write large-scale, computationally intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.
NASA Astrophysics Data System (ADS)
Huhn, William Paul; Lange, Björn; Yu, Victor; Blum, Volker; Lee, Seyong; Yoon, Mina
Density-functional theory has been well established as the dominant quantum-mechanical computational method in the materials community. Large, accurate simulations become very challenging on small to mid-scale computers and require high-performance compute platforms to succeed. GPU acceleration is one promising approach. In this talk, we present a first implementation of all-electron density-functional theory in the FHI-aims code for massively parallel GPU-based platforms. Special attention is paid to the update of the density and to the integration of the Hamiltonian and overlap matrices, realized in a domain decomposition scheme on non-uniform grids. The initial implementation scales well across nodes on ORNL's Titan Cray XK7 supercomputer (8 to 64 nodes, 16 MPI ranks/node) and shows an overall runtime speedup of 1.4x from utilization of the K20X Tesla GPUs on each Titan node, with the charge density update showing a speedup of 2x. Further acceleration opportunities will be discussed. Work supported by the LDRD Program of ORNL managed by UT-Battelle, LLC, for the U.S. DOE and by the Oak Ridge Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.
NASA Astrophysics Data System (ADS)
Miyazato, Itsuki; Tanaka, Yuzuru; Takahashi, Keisuke
2018-02-01
Two-dimensional (2D) magnets are explored in terms of data science and first-principles calculations. Machine learning determines four descriptors for predicting the magnetic moments of 2D materials from reported data on 216 2D materials. With the trained model, a further 254 2D materials are predicted to have high magnetic moments. First-principles calculations are performed to evaluate these 254 predicted 2D materials, revealing eight previously undiscovered stable 2D materials with high magnetic moments. The approach taken in this work indicates that undiscovered materials can be surfaced by combining data science with materials data, leading to an innovative way of discovering hidden materials.
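The screening workflow summarized above (fit a model on descriptors of known materials, then flag candidates whose predicted magnetic moment is high, and pass those to first-principles evaluation) can be sketched roughly as follows. The descriptors, data values, threshold, and the linear least-squares model are invented stand-ins for illustration, not the study's actual features or learner.

```python
import numpy as np

def fit_linear_model(X, y):
    """Fit y ~ X @ w + b by ordinary least squares; returns (w, b)."""
    A = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[:-1], coef[-1]

def predict(X, w, b):
    return X @ w + b

# Toy training data: rows = known materials, columns = four descriptors
X_train = np.array([[1.0, 0.2, 3.1, 0.0],
                    [2.0, 0.1, 2.9, 1.0],
                    [0.5, 0.4, 3.3, 0.0],
                    [1.5, 0.3, 3.0, 1.0]])
y_train = np.array([0.8, 2.1, 0.4, 1.6])  # magnetic moments (mu_B), invented

w, b = fit_linear_model(X_train, y_train)
y_hat = predict(X_train, w, b)

# Candidates whose predicted moment exceeds a threshold would be flagged
# for follow-up first-principles calculations.
high = X_train[y_hat > 1.0]
```

In practice the flagged candidates, not the training set, would be screened, and the model would be validated before use; this sketch only shows the fit-predict-filter shape of the pipeline.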
The Stratigraphic Sandwich. An Inquiry-Based Lesson on Geologic Principles
ERIC Educational Resources Information Center
Hermann, Ronald S.; Miranda, Rommel J.
2013-01-01
This article describes an approach in which students develop and apply definitions prior to their formal introduction to new vocabulary. The example given is an inquiry-based lesson on geologic principles. This approach is illustrated with a lesson that has been used with high school Earth science students on the principles of stratigraphy, though…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansmann, Ulrich H.E.
2012-07-02
This report summarizes the outcome of the international workshop From Computational Biophysics to Systems Biology (CBSB12), which was held June 3-5, 2012, at the University of Tennessee Conference Center in Knoxville, TN, and supported by DOE through the Conference Support Grant 120174. The purpose of CBSB12 was to provide a forum for the interaction between a data-mining interested systems biology community and a simulation and first-principle oriented computational biophysics/biochemistry community. CBSB12 was the sixth in a series of workshops of the same name organized in recent years, and the second that has been held in the USA. As in previous years, it gave researchers from physics, biology, and computer science an opportunity to acquaint each other with current trends in computational biophysics and systems biology, to explore venues of cooperation, and to establish together a detailed understanding of cells at a molecular level. The conference grant of $10,000 was used to cover registration fees and provide travel fellowships to selected students and postdoctoral scientists. By educating graduate students and providing a forum for young scientists to perform research into the working of cells at a molecular level, the workshop adds to DOE's mission of paving the way to exploit the abilities of living systems to capture, store and utilize energy.
Materials Databases Infrastructure Constructed by First Principles Calculations: A Review
Lin, Lianshan
2015-10-13
First-principles calculations, especially those based on high-throughput density functional theory, have been widely accepted as the major tools in atomic-scale materials design. The emerging supercomputers, along with powerful first-principles calculations, have accumulated hundreds of thousands of crystal and compound records. The exponential growth of computational materials information urges the development of materials databases, which not only provide unlimited storage for the daily increasing data, but also keep efficiency in data storage, management, query, presentation, and manipulation. This review covers the most cutting-edge materials databases in materials design and their hot applications, such as in fuel cells. By comparing the advantages and drawbacks of these high-throughput first-principles materials databases, the optimized computational framework can be identified to fit the needs of fuel cell applications. The further development of high-throughput DFT materials databases, which in essence accelerates materials innovation, is discussed in the summary as well.
NASA's computer science research program
NASA Technical Reports Server (NTRS)
Larsen, R. L.
1983-01-01
Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.
Design and exploration of semiconductors from first principles: A review of recent advances
NASA Astrophysics Data System (ADS)
Oba, Fumiyasu; Kumagai, Yu
2018-06-01
Recent first-principles approaches to semiconductors are reviewed, with an emphasis on theoretical insight into emerging materials and in silico exploration of as-yet-unreported materials. As relevant theory and methodologies have developed, along with computer performance, it is now feasible to predict a variety of material properties ab initio at the practical level of accuracy required for detailed understanding and elaborate design of semiconductors; these material properties include (i) fundamental bulk properties such as band gaps, effective masses, dielectric constants, and optical absorption coefficients; (ii) the properties of point defects, including native defects, residual impurities, and dopants, such as donor, acceptor, and deep-trap levels, and formation energies, which determine the carrier type and density; and (iii) absolute and relative band positions, including ionization potentials and electron affinities at semiconductor surfaces, band offsets at heterointerfaces between dissimilar semiconductors, and Schottky barrier heights at metal–semiconductor interfaces, which are often discussed systematically using band alignment or lineup diagrams. These predictions from first principles have made it possible to elucidate the characteristics of semiconductors used in industry, including group III–V compounds such as GaN, GaP, and GaAs and their alloys with related Al and In compounds; amorphous oxides, represented by In–Ga–Zn–O; transparent conductive oxides (TCOs), represented by In2O3, SnO2, and ZnO; and photovoltaic absorber and buffer layer materials such as CdTe and CdS among group II–VI compounds and chalcopyrite CuInSe2, CuGaSe2, and CuIn1−xGaxSe2 (CIGS) alloys, in addition to the prototypical elemental semiconductors Si and Ge.
Semiconductors attracting renewed or emerging interest have also been investigated, for instance, divalent tin compounds, including SnO and SnS; wurtzite-derived ternary compounds such as ZnSnN2 and CuGaO2; perovskite oxides such as SrTiO3 and BaSnO3; and organic–inorganic hybrid perovskites, represented by CH3NH3PbI3. Moreover, the deployment of first-principles calculations allows us to predict the crystal structure, stability, and properties of as-yet-unreported materials. Promising materials have been explored via high-throughput screening within either publicly available computational databases or unexplored composition and structure space. Reported examples include the identification of nitride semiconductors, TCOs, solar cell photoabsorber materials, and photocatalysts, some of which have been experimentally verified. Machine learning in combination with first-principles calculations has emerged recently as a technique to accelerate and enhance in silico screening. A blend of computation and experimentation with data science toward the development of materials is often referred to as materials informatics and is currently attracting growing interest.
ERIC Educational Resources Information Center
Pruett, Sharon M.
2012-01-01
The objective of this study was to compare the relationships between the subtests of the Interactive Computer Interview System and the ETS "Praxis II" Principles of Learning and Teaching examination. In particular, this study compares scores on the ICIS instrument subtests to those gathered from the same classroom teachers on the…
ERIC Educational Resources Information Center
Hammonds, S. J.
1990-01-01
A technique for the numerical identification of bacteria using normalized likelihoods calculated from a probabilistic database is described, and the principles of the technique are explained. The listing of the computer program is included. Specimen results from the program, and examples of how they should be interpreted, are given. (KR)
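The identification principle described in this abstract can be illustrated with a minimal sketch: for each taxon, multiply the probabilities of the observed positive or negative test outcomes drawn from a probabilistic database, then normalize across taxa so the scores sum to one. The taxa, tests, and probabilities below are invented for the example and do not come from the program in question.

```python
# Hypothetical probability matrix: P(test is positive) for each taxon,
# tests ordered as [oxidase, glucose fermentation, motility].
DATABASE = {
    "E. coli":       [0.01, 0.99, 0.95],
    "P. aeruginosa": [0.99, 0.05, 0.90],
    "K. pneumoniae": [0.01, 0.98, 0.02],
}

def normalized_likelihoods(results):
    """results: list of True/False outcomes in the database's test order.

    Returns a dict mapping each taxon to its normalized likelihood.
    """
    likes = {}
    for taxon, probs in DATABASE.items():
        like = 1.0
        for p, positive in zip(probs, results):
            like *= p if positive else (1.0 - p)  # product of outcome probs
        likes[taxon] = like
    total = sum(likes.values())
    return {t: l / total for t, l in likes.items()}

# Observed profile: oxidase negative, glucose positive, motile.
scores = normalized_likelihoods([False, True, True])
best = max(scores, key=scores.get)  # most probable identification
```

Real identification schemes weigh many more tests and report the normalized score alongside a cutoff for an acceptable identification; the normalization step is what makes scores comparable across taxa.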
Girls Save the World through Computer Science
ERIC Educational Resources Information Center
Murakami, Christine
2011-01-01
It's no secret that fewer and fewer women are entering computer science fields. Attracting high school girls to computer science is only part of the solution. Retaining them while they are in higher education or the workforce is also a challenge. To solve this, there is a need to show girls that computer science is a wide-open field that offers…
ERIC Educational Resources Information Center
Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2015-01-01
The aim of this study was to explore Taiwanese college students' conceptions of and approaches to learning computer science and then explore the relationships between the two. Two surveys, Conceptions of Learning Computer Science (COLCS) and Approaches to Learning Computer Science (ALCS), were administered to 421 college students majoring in…
Hispanic Women Overcoming Deterrents to Computer Science: A Phenomenological Study
ERIC Educational Resources Information Center
Herling, Lourdes
2011-01-01
The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the…
ERIC Educational Resources Information Center
Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang
2015-01-01
This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…
Extinction from a rationalist perspective.
Gallistel, C R
2012-05-01
The merging of the computational theory of mind and evolutionary thinking leads to a kind of rationalism, in which enduring truths about the world have become implicit in the computations that enable the brain to cope with the experienced world. The dead reckoning computation, for example, is implemented within the brains of animals as one of the mechanisms that enables them to learn where they are (Gallistel, 1990, 1995). It integrates a velocity signal with respect to a time signal. Thus, the manner in which position and velocity relate to one another in the world is reflected in the manner in which signals representing those variables are processed in the brain. I use principles of information theory and Bayesian inference to derive from other simple principles explanations for: (1) the failure of partial reinforcement to increase reinforcements to acquisition; (2) the partial reinforcement extinction effect; (3) spontaneous recovery; (4) renewal; (5) reinstatement; (6) resurgence (aka facilitated reacquisition). Like the principle underlying dead-reckoning, these principles are grounded in analytic considerations. They are the kind of enduring truths about the world that are likely to have shaped the brain's computations. Copyright © 2012 Elsevier B.V. All rights reserved.
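The dead-reckoning computation mentioned above, integrating a velocity signal with respect to time to recover position, can be shown with a simple Euler-summation sketch; the function name, 2-D representation, and sample data are assumptions for illustration only.

```python
def dead_reckon(start, velocities, dt):
    """Integrate velocity samples over time to estimate position.

    start: (x, y) initial position; velocities: list of (vx, vy) samples
    taken every dt seconds. Returns the list of estimated positions.
    """
    x, y = start
    path = [(x, y)]
    for vx, vy in velocities:
        x += vx * dt  # Euler step: position += velocity * timestep
        y += vy * dt
        path.append((x, y))
    return path

# Constant eastward motion at 1 m/s for 3 s ends 3 m east of the start.
track = dead_reckon((0.0, 0.0), [(1.0, 0.0)] * 3, dt=1.0)
```

The point of the sketch is the relation the abstract highlights: the way position and velocity relate in the world (position is the time integral of velocity) is mirrored in how the signals are processed.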
Non-Determinism: An Abstract Concept in Computer Science Studies
ERIC Educational Resources Information Center
Armoni, Michal; Gal-Ezer, Judith
2007-01-01
Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…