Non-Determinism: An Abstract Concept in Computer Science Studies
ERIC Educational Resources Information Center
Armoni, Michal; Gal-Ezer, Judith
2007-01-01
Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…
Teaching and Learning Methodologies Supported by ICT Applied in Computer Science
ERIC Educational Resources Information Center
Capacho, Jose
2016-01-01
The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…
Theory-Guided Technology in Computer Science.
ERIC Educational Resources Information Center
Ben-Ari, Mordechai
2001-01-01
Examines the history of major achievements in computer science as portrayed by winners of the prestigious Turing award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. Discusses reasons why TGT is practical in computer science and…
On Evaluating Human Problem Solving of Computationally Hard Problems
ERIC Educational Resources Information Center
Carruthers, Sarah; Stege, Ulrike
2013-01-01
This article is concerned with how computer science, and more exactly computational complexity theory, can inform cognitive science. In particular, we suggest factors to be taken into account when investigating how people deal with computational hardness. This discussion will address the two upper levels of Marr's Level Theory: the computational…
ERIC Educational Resources Information Center
Singh, Gurmukh
2012-01-01
The present article is primarily targeted at advanced college/university undergraduate students of chemistry/physics education, computational physics/chemistry, and computer science. The recent software system MS Visual Studio .NET version 2010 is employed to perform computer simulations for modeling Bohr's quantum theory of…
Situated Learning in Computer Science Education
ERIC Educational Resources Information Center
Ben-Ari, Mordechai
2004-01-01
Sociocultural theories of learning such as Wenger and Lave's situated learning have been suggested as alternatives to cognitive theories of learning like constructivism. This article examines situated learning within the context of computer science (CS) education. Situated learning accurately describes some CS communities like open-source software…
Approaches to Classroom-Based Computational Science.
ERIC Educational Resources Information Center
Guzdial, Mark
Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…
The 6th International Conference on Computer Science and Computational Mathematics (ICCSCM 2017)
NASA Astrophysics Data System (ADS)
2017-09-01
The ICCSCM 2017 (The 6th International Conference on Computer Science and Computational Mathematics) aimed to provide a platform to discuss computer science and mathematics related issues, including Algebraic Geometry, Algebraic Topology, Approximation Theory, Calculus of Variations, Category Theory; Homological Algebra, Coding Theory, Combinatorics, Control Theory, Cryptology, Geometry, Difference and Functional Equations, Discrete Mathematics, Dynamical Systems and Ergodic Theory, Field Theory and Polynomials, Fluid Mechanics and Solid Mechanics, Fourier Analysis, Functional Analysis, Functions of a Complex Variable, Fuzzy Mathematics, Game Theory, General Algebraic Systems, Graph Theory, Group Theory and Generalizations, Image Processing, Signal Processing and Tomography, Information Fusion, Integral Equations, Lattices, Algebraic Structures, Linear and Multilinear Algebra; Matrix Theory, Mathematical Biology and Other Natural Sciences, Mathematical Economics and Financial Mathematics, Mathematical Physics, Measure Theory and Integration, Neutrosophic Mathematics, Number Theory, Numerical Analysis, Operations Research, Optimization, Operator Theory, Ordinary and Partial Differential Equations, Potential Theory, Real Functions, Rings and Algebras, Statistical Mechanics, Structure of Matter, Topological Groups, Wavelets and Wavelet Transforms, 3G/4G Network Evolutions, Ad-Hoc, Mobile, Wireless Networks and Mobile Computing, Agent Computing & Multi-Agent Systems, All topics related to Image/Signal Processing, Any topics related to Computer Networks, Any topics related to ISO SC-27 and SC-17 standards, Any topics related to PKI (Public Key Infrastructures), Artificial Intelligence (A.I.) & Pattern/Image Recognition, Authentication/Authorization Issues, Biometric authentication and algorithms, CDMA/GSM Communication Protocols, Combinatorics, Graph Theory, and Analysis of Algorithms, Cryptography and Foundation of Computer Security, Database (D.B.) Management & Information Retrieval, Data Mining, Web Image Mining, & Applications, Defining Spectrum Rights and Open Spectrum Solutions, E-Commerce, Ubiquitous, RFID, Applications, Fingerprint/Hand/Biometrics Recognition and Technologies, Foundations of High-performance Computing, IC-card Security, OTP, and Key Management Issues, IDS/Firewall, Anti-Spam mail, Anti-virus issues, Mobile Computing for E-Commerce, Network Security Applications, Neural Networks and Biomedical Simulations, Quality of Service and Communication Protocols, Quantum Computing, Coding, and Error Controls, Satellite and Optical Communication Systems, Theory of Parallel Processing and Distributed Computing, Virtual Visions, 3-D Object Retrievals, & Virtual Simulations, Wireless Access Security, etc. The success of ICCSCM 2017 is reflected in the papers received from authors in several countries around the world, allowing a highly multinational and multicultural exchange of ideas and experience. The accepted papers of ICCSCM 2017 are published in this book. Please check http://www.iccscm.com for further news. A conference such as ICCSCM 2017 can only become successful through a team effort, so we want to thank the International Technical Committee and the Reviewers for their efforts in the review process as well as for their valuable advice. We are thankful to all those who contributed to the success of ICCSCM 2017. The Secretary
Topics in Computational Learning Theory and Graph Algorithms.
ERIC Educational Resources Information Center
Board, Raymond Acton
This thesis addresses problems from two areas of theoretical computer science. The first area is that of computational learning theory, which is the study of the phenomenon of concept learning using formal mathematical models. The goal of computational learning theory is to investigate learning in a rigorous manner through the use of techniques…
ERIC Educational Resources Information Center
Giannakos, Michail N.
2014-01-01
Computer Science (CS) courses comprise both Programming and Information and Communication Technology (ICT) issues; however these two areas have substantial differences, inter alia the attitudes and beliefs of the students regarding the intended learning content. In this research, factors from the Social Cognitive Theory and Unified Theory of…
Semiotics, Information Science, Documents and Computers.
ERIC Educational Resources Information Center
Warner, Julian
1990-01-01
Discusses the relationship and value of semiotics to the established domains of information science. Highlights include documentation; computer operations; the language of computing; automata theory; linguistics; speech and writing; and the written language as a unifying principle for the document and the computer. (93 references) (LRW)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dang, Liem X.; Vo, Quynh N.; Nilsson, Mikael
We report one of the first simulations using a classical rate theory approach to predict the mechanism of the exchange process between water and aqueous uranyl ions. Using our water and ion-water polarizable force fields and molecular dynamics techniques, we computed the potentials of mean force for the uranyl ion-water pair as a function of pressure at ambient temperature. Subsequently, these simulated potentials of mean force were used to calculate rate constants using transition rate theory; the time-dependent transmission coefficients were also examined using the reactive flux method and Grote-Hynes treatments of the dynamic response of the solvent. The activation volumes computed using transition rate theory and the corrected rate constants are positive; thus the mechanism of this particular water exchange is a dissociative process. We discuss our rate theory results and compare them with previous studies in which non-polarizable force fields were used. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
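To make the transition-state-theory step concrete, the sketch below estimates a rate constant from a potential-of-mean-force barrier with a simplified Eyring-style expression. This is an illustration only, not the authors' code; the barrier height, temperature, and transmission coefficient are hypothetical values.

```python
import numpy as np

# Physical constants (SI units)
KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J*s
NA = 6.02214076e23   # Avogadro's number, 1/mol

def tst_rate(delta_g_barrier_kj_mol, temperature=298.15, kappa=1.0):
    """Simplified transition-state-theory estimate of a rate constant.

    delta_g_barrier_kj_mol : activation free energy read off a potential of
                             mean force (barrier top minus well bottom), kJ/mol
    kappa                  : transmission coefficient (e.g. from reactive flux
                             or Grote-Hynes theory); 1.0 recovers pure TST
    """
    delta_g_per_molecule = delta_g_barrier_kj_mol * 1e3 / NA  # J per molecule
    return kappa * (KB * temperature / H) * np.exp(-delta_g_per_molecule / (KB * temperature))

# Hypothetical numbers purely for illustration: a 30 kJ/mol barrier and a
# transmission coefficient of 0.5 give the corrected exchange rate in s^-1.
print(tst_rate(30.0, temperature=298.15, kappa=0.5))
```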
Implicit Theories of Creativity in Computer Science in the United States and China
ERIC Educational Resources Information Center
Tang, Chaoying; Baer, John; Kaufman, James C.
2015-01-01
To study implicit concepts of creativity in computer science in the United States and mainland China, we first asked 308 Chinese computer scientists for adjectives that would describe a creative computer scientist. Computer scientists and non-computer scientists from China (N = 1069) and the United States (N = 971) then rated how well those…
MaRIE theory, modeling and computation roadmap executive summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lookman, Turab
The confluence of MaRIE (Matter-Radiation Interactions in Extreme) and extreme (exascale) computing timelines offers a unique opportunity in co-designing the elements of materials discovery, with theory and high performance computing, itself co-designed by constrained optimization of hardware and software, and experiments. MaRIE's theory, modeling, and computation (TMC) roadmap efforts have paralleled 'MaRIE First Experiments' science activities in the areas of materials dynamics, irradiated materials and complex functional materials in extreme conditions. The documents that follow this executive summary describe in detail, for each of these areas, the current state of the art, the gaps that exist and the road map to MaRIE and beyond. Here we integrate the various elements to articulate an overarching theme related to the role and consequences of heterogeneities, which manifest as competing states in a complex energy landscape. MaRIE experiments will locate, measure and follow the dynamical evolution of these heterogeneities. Our TMC vision spans the various pillar sciences and highlights the key theoretical and experimental challenges. We also present a theory, modeling and computation roadmap of the path to and beyond MaRIE in each of the science areas.
20 CFR 901.11 - Enrollment procedures.
Code of Federal Regulations, 2011 CFR
2011-04-01
.... Examples include economics, computer programs, pension accounting, investment and finance, risk theory... Columbia responsible for the issuance of a license in the field of actuarial science, insurance, accounting... include economics, computer programming, pension accounting, investment and finance, risk theory...
2005-12-01
Computational Learning in the Department of Brain & Cognitive Sciences and in the Computer Science and Artificial Intelligence Laboratory at the Massachusetts...physiology and cognitive science... Detailed model implementation and...physiology to cognitive science. The original model [Riesenhuber and Poggio, 1999b] also made a few predictions ranging from biophysics to psychophysics
Examination and Implementation of a Proposal for a Ph.D. Program in Administrative Sciences
1992-03-01
Review of two proposals recently approved by the Academic Council (i.e., Computer Science and Mathematics Departments). C. SCOPE OF THE STUDY Since WWII...and through the computer age, the application of administrative science theory and methodologies from the behavioral sciences and quantitative...roles in the U.S. Navy and DoD, providing people who firmly understand the technical and organizational aspects of computer-based systems which support
ERIC Educational Resources Information Center
Ryoo, Jean; Goode, Joanna; Margolis, Jane
2015-01-01
This article describes the importance that high school computer science teachers place on a teachers' professional learning community designed around an inquiry- and equity-oriented approach for broadening participation in computing. Using grounded theory to analyze four years of teacher surveys and interviews from the Exploring Computer Science…
NASA Astrophysics Data System (ADS)
Rodriguez, Sarah L.; Lehman, Kathleen
2017-10-01
This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.
Education, Information Technology and Cognitive Science.
ERIC Educational Resources Information Center
Scaife, M.
1989-01-01
Discusses information technology and its effects on developmental psychology and children's education. Topics discussed include a theory of child-computer interaction (CCI); programing; communication and computers, including electronic mail; cognitive science; artificial intelligence; modeling the user-system interaction; and the future of…
Information processing, computation, and cognition.
Piccinini, Gualtiero; Scarantino, Andrea
2011-01-01
Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.
Complex systems and health behavior change: insights from cognitive science.
Orr, Mark G; Plaut, David C
2014-05-01
To provide proof-of-concept that quantum health behavior can be instantiated as a computational model that is informed by cognitive science, the Theory of Reasoned Action, and quantum health behavior theory. We conducted a synthetic review of the intersection of quantum health behavior change and cognitive science. We conducted simulations, using a computational model of quantum health behavior (a constraint satisfaction artificial neural network) and tested whether the model exhibited quantum-like behavior. The model exhibited clear signs of quantum-like behavior. Quantum health behavior can be conceptualized as constraint satisfaction: a mitigation between current behavioral state and the social contexts in which it operates. We outlined implications for moving forward with computational models of both quantum health behavior and health behavior in general.
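As a rough illustration of what a constraint satisfaction artificial neural network of the kind mentioned above looks like computationally, here is a generic Hopfield-style sketch. It is not the authors' model; the units, weights, and external inputs are invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric weight matrix encoding constraints among units (hypothetical values):
# positive weights mean two units "support" each other, negative weights mean
# they conflict. Units might stand for a current behavior, an intention, and a
# feature of the social context.
W = np.array([[ 0.0,  1.0, -1.0],
              [ 1.0,  0.0,  0.5],
              [-1.0,  0.5,  0.0]])
external_input = np.array([0.2, -0.1, 0.4])  # hypothetical context "clamping"

state = rng.choice([-1.0, 1.0], size=3)      # random initial activation pattern

def energy(s):
    """Hopfield energy: lower values mean more constraints are satisfied."""
    return -0.5 * s @ W @ s - external_input @ s

# Asynchronous updates: each visited unit takes whichever value lowers the
# energy, so the network settles into a locally consistent state.
for _ in range(20):
    i = rng.integers(0, len(state))
    state[i] = 1.0 if (W[i] @ state + external_input[i]) > 0 else -1.0

print(state, energy(state))
```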
LBNL Computational Research and Theory Facility Groundbreaking - Full Press Conference. Feb 1st, 2012
Yelick, Kathy
2018-01-24
Energy Secretary Steven Chu, along with Berkeley Lab and UC leaders, broke ground on the Lab's Computational Research and Theory (CRT) facility yesterday. The CRT will be at the forefront of high-performance supercomputing research and be DOE's most efficient facility of its kind. Joining Secretary Chu as speakers were Lab Director Paul Alivisatos, UC President Mark Yudof, Office of Science Director Bill Brinkman, and UC Berkeley Chancellor Robert Birgeneau. The festivities were emceed by Associate Lab Director for Computing Sciences, Kathy Yelick, and Berkeley Mayor Tom Bates joined in the shovel ceremony.
LBNL Computational Research and Theory Facility Groundbreaking. February 1st, 2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yelick, Kathy
2012-02-02
Energy Secretary Steven Chu, along with Berkeley Lab and UC leaders, broke ground on the Lab's Computational Research and Theory (CRT) facility yesterday. The CRT will be at the forefront of high-performance supercomputing research and be DOE's most efficient facility of its kind. Joining Secretary Chu as speakers were Lab Director Paul Alivisatos, UC President Mark Yudof, Office of Science Director Bill Brinkman, and UC Berkeley Chancellor Robert Birgeneau. The festivities were emceed by Associate Lab Director for Computing Sciences, Kathy Yelick, and Berkeley Mayor Tom Bates joined in the shovel ceremony.
An Enduring Dialogue between Computational and Empirical Vision.
Martinez-Conde, Susana; Macknik, Stephen L; Heeger, David J
2018-04-01
In the late 1970s, key discoveries in neurophysiology, psychophysics, computer vision, and image processing had reached a tipping point that would shape visual science for decades to come. David Marr and Ellen Hildreth's 'Theory of edge detection', published in 1980, set out to integrate the newly available wealth of data from behavioral, physiological, and computational approaches in a unifying theory. Although their work had wide and enduring ramifications, their most important contribution may have been to consolidate the foundations of the ongoing dialogue between theoretical and empirical vision science. Copyright © 2018 Elsevier Ltd. All rights reserved.
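A minimal sketch of the Laplacian-of-Gaussian zero-crossing idea at the heart of Marr and Hildreth's edge detector follows. This is an illustrative reimplementation rather than their original formulation; the test image and the smoothing scale sigma are arbitrary.

```python
import numpy as np
from scipy import ndimage

def marr_hildreth_edges(image, sigma=2.0):
    """Toy Marr-Hildreth-style edge detector: smooth with a Gaussian and apply
    the Laplacian (combined here as a Laplacian-of-Gaussian filter), then mark
    zero-crossings of the filtered image as candidate edges."""
    log = ndimage.gaussian_laplace(image.astype(float), sigma)
    edges = np.zeros_like(log, dtype=bool)
    # A pixel is a zero-crossing if the LoG changes sign against a neighbour.
    edges[:-1, :] |= np.signbit(log[:-1, :]) != np.signbit(log[1:, :])
    edges[:, :-1] |= np.signbit(log[:, :-1]) != np.signbit(log[:, 1:])
    return edges

# Synthetic test image: a bright square on a dark background.
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
print(marr_hildreth_edges(img).sum(), "zero-crossing pixels found")
```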
LBNL Computational Research and Theory Facility Groundbreaking. February 1st, 2012
Yelick, Kathy
2017-12-09
Energy Secretary Steven Chu, along with Berkeley Lab and UC leaders, broke ground on the Lab's Computational Research and Theory (CRT) facility yesterday. The CRT will be at the forefront of high-performance supercomputing research and be DOE's most efficient facility of its kind. Joining Secretary Chu as speakers were Lab Director Paul Alivisatos, UC President Mark Yudof, Office of Science Director Bill Brinkman, and UC Berkeley Chancellor Robert Birgeneau. The festivities were emceed by Associate Lab Director for Computing Sciences, Kathy Yelick, and Berkeley Mayor Tom Bates joined in the shovel ceremony.
From Requirements to Code: Issues and Learning in IS Students' Systems Development Projects
ERIC Educational Resources Information Center
Scott, Elsje
2008-01-01
The Computing Curricula (2005) place Information Systems (IS) at the intersection of exact sciences (e.g. General Systems Theory), technology (e.g. Computer Science), and behavioral sciences (e.g. Sociology). This presents particular challenges for teaching and learning, as future IS professionals need to be equipped with a wide range of…
Increasing the Interest of Elementary Age Students in Computer Science though Day Camps
ERIC Educational Resources Information Center
Cliburn, Dan; Weisheit, Tracey; Griffith, Jason; Jones, Matt; Rackley, Hunter; Richey, Eric; Stormer, Kevin
2004-01-01
Computer Science and related majors have seen a decrease in enrollment across the country in recent years. While there are several theories behind why this may be the case, as educators in many areas of computing and information technology, this is a trend we should attempt to reverse. While it is true that many children are "computer literate",…
A Survey of Computer Science Capstone Course Literature
ERIC Educational Resources Information Center
Dugan, Robert F., Jr.
2011-01-01
In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software…
Mastering cognitive development theory in computer science education
NASA Astrophysics Data System (ADS)
Gluga, Richard; Kay, Judy; Lister, Raymond; Simon; Kleitman, Sabina
2013-03-01
To design an effective computer science curriculum, educators require a systematic method of classifying the difficulty level of learning activities and assessment tasks. This is important for curriculum design and implementation and for communication between educators. Different educators must be able to use the method consistently, so that classified activities and assessments are comparable across the subjects of a degree, and, ideally, comparable across institutions. One widespread approach to supporting this is to write learning objects in terms of Bloom's Taxonomy. This, or other such classifications, is likely to be more effective if educators can use them consistently, in the way experts would use them. To this end, we present the design and evaluation of our online interactive web-based tutorial system, which can be configured and used to offer training in different classification schemes. We report on results from three evaluations. First, 17 computer science educators completed a tutorial on using Bloom's Taxonomy to classify programming examination questions. Second, 20 computer science educators completed a Neo-Piagetian tutorial. Third, we compared the inter-rater reliability scores of computer science educators classifying programming questions using Bloom's Taxonomy before and after taking our tutorial. Based on the results from these evaluations, we discuss the effectiveness of our tutorial system design for teaching computer science educators how to systematically and consistently classify programming examination questions. We also discuss the suitability of Bloom's Taxonomy and Neo-Piagetian theory for achieving this goal. The Bloom's and Neo-Piagetian tutorials are made available as a community resource. The contributions of this paper are the following: the tutorial system for learning classification schemes for the purpose of coding the difficulty of computing learning materials; its evaluation; new insights into the consistency that computing educators can achieve using Bloom; and first insights into the use of Neo-Piagetian theory by a group of classifiers.
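For readers unfamiliar with the inter-rater reliability scores compared in the third evaluation, Cohen's kappa is one common statistic for two raters. The sketch below is illustrative only, with invented ratings, and is not the instrument used in the study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels
    (e.g. Bloom's Taxonomy levels) to the same set of exam questions."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters pick the same label at random.
    expected = sum(counts_a[c] * counts_b[c] for c in set(rater_a) | set(rater_b)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical classifications of eight programming exam questions by two educators.
rater_1 = ["Apply", "Analyse", "Apply", "Understand", "Apply", "Evaluate", "Analyse", "Apply"]
rater_2 = ["Apply", "Apply",   "Apply", "Understand", "Apply", "Evaluate", "Analyse", "Analyse"]
print(round(cohens_kappa(rater_1, rater_2), 3))
```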
ERIC Educational Resources Information Center
Wheeler, David L.
1988-01-01
Scientists feel that progress in artificial intelligence and the availability of thousands of experimental results make this the right time to build and test theories on how people think and learn, using the computer to model minds. (MSE)
What Does CALL Have to Offer Computer Science and What Does Computer Science Have to Offer CALL?
ERIC Educational Resources Information Center
Cushion, Steve
2006-01-01
We will argue that CALL can usefully be viewed as a subset of computer software engineering and can profit from adopting some of the recent progress in software development theory. The unified modelling language has become the industry standard modelling technique and the accompanying unified process is rapidly gaining acceptance. The manner in…
Academic computer science and gender: A naturalistic study investigating the causes of attrition
NASA Astrophysics Data System (ADS)
Declue, Timothy Hall
Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before entering college, but it is at the college level that the "brain drain" is the most evident numerically, especially in the first class taken by most computer science majors called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture which is hostile to females. Two unanticipated themes related to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects the females' ability to succeed in CS-I.
ASCR Workshop on Quantum Computing for Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aspuru-Guzik, Alan; Van Dam, Wim; Farhi, Edward
This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE's science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE's mission as well as the additional research required to bring quantum computing to the point where it can have such impact.
Mastering Cognitive Development Theory in Computer Science Education
ERIC Educational Resources Information Center
Gluga, Richard; Kay, Judy; Lister, Raymond; Kleitman, Simon; Kleitman, Sabina
2013-01-01
To design an effective computer science curriculum, educators require a systematic method of classifying the difficulty level of learning activities and assessment tasks. This is important for curriculum design and implementation and for communication between educators. Different educators must be able to use the method consistently, so that…
Applying IRSS Theory: The Clark Atlanta University Exemplar
ERIC Educational Resources Information Center
Payton, Fay Cobb; Suarez-Brown, Tiki L.; Smith Lamar, Courtney
2012-01-01
The percentage of underrepresented minorities (African-American, Hispanic, Native Americans) that have obtained graduate level degrees within computing disciplines (computer science, computer information systems, computer engineering, and information technology) is dismal at best. Despite the fact that academia, the computing workforce,…
The journey from forensic to predictive materials science using density functional theory
Schultz, Peter A.
2017-09-12
Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.
The journey from forensic to predictive materials science using density functional theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, Peter A.
Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.
ERIC Educational Resources Information Center
Thomas, Lewis
1981-01-01
Presents a viewpoint concerning the impact of recent scientific advances on society. Discusses biological discoveries, space exploration, computer technology, development of new astronomical theories, the behavioral sciences, and basic research. Challenges to keeping science current with technological advancement are also discussed. (DS)
NASA Astrophysics Data System (ADS)
Marzari, Nicola
The last 30 years have seen the steady and exhilarating development of powerful quantum-simulation engines for extended systems, dedicated to the solution of the Kohn-Sham equations of density-functional theory, often augmented by density-functional perturbation theory, many-body perturbation theory, time-dependent density-functional theory, dynamical mean-field theory, and quantum Monte Carlo. Their implementation on massively parallel architectures, now leveraging also GPUs and accelerators, has started a massive effort in the prediction from first principles of many or of complex materials properties, leading the way to the exascale through the combination of HPC (high-performance computing) and HTC (high-throughput computing). Challenges and opportunities abound: complementing hardware and software investments and design; developing the materials' informatics infrastructure needed to encode knowledge into complex protocols and workflows of calculations; managing and curating data; resisting the complacency that we have already reached the predictive accuracy needed for materials design, or a robust level of verification of the different quantum engines. In this talk I will provide an overview of these challenges, with the ultimate prize being the computational understanding, prediction, and design of properties and performance for novel or complex materials and devices.
Information processing, computation, and cognition
Scarantino, Andrea
2010-01-01
Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both – although others disagree vehemently. Yet different cognitive scientists use ‘computation’ and ‘information processing’ to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates’ empirical aspects. PMID:22210958
NASA Astrophysics Data System (ADS)
Ryoo, Jean; Goode, Joanna; Margolis, Jane
2015-10-01
This article describes the importance that high school computer science teachers place on a teachers' professional learning community designed around an inquiry- and equity-oriented approach for broadening participation in computing. Using grounded theory to analyze four years of teacher surveys and interviews from the Exploring Computer Science (ECS) program in the Los Angeles Unified School District, this article describes how participating in professional development activities purposefully aimed at fostering a teachers' professional learning community helps ECS teachers make the transition to an inquiry-based classroom culture and break professional isolation. This professional learning community also provides experiences that challenge prevalent deficit notions and stereotypes about which students can or cannot excel in computer science.
Parameter Networks: Towards a Theory of Low-level Vision,
1981-04-01
...such as those shown in Figure 7 to reorganize origami world figures. Figure 7. To show an example in detail, Kender's technique for...Computer Science Dept., Carnegie-Mellon U., October 1979. Kanade, T., "A theory of Origami world," CMU-CS-78-144, Computer Science Dept., Carnegie
Concepts as Semantic Pointers: A Framework and Computational Model
ERIC Educational Resources Information Center
Blouw, Peter; Solodkin, Eugene; Thagard, Paul; Eliasmith, Chris
2016-01-01
The reconciliation of theories of concepts based on prototypes, exemplars, and theory-like structures is a longstanding problem in cognitive science. In response to this problem, researchers have recently tended to adopt either hybrid theories that combine various kinds of representational structure, or eliminative theories that replace concepts…
Fields, Chris
2015-12-01
Does perception hide the truth? Information theory, computer science, and quantum theory all suggest that the answer is "yes." They suggest, indeed, that useful perception is only feasible because the truth can be hidden.
Simulating Serious Games: A Discrete-Time Computational Model Based on Cognitive Flow Theory
ERIC Educational Resources Information Center
Westera, Wim
2018-01-01
This paper presents a computational model for simulating how people learn from serious games. While avoiding the combinatorial explosion of a game's micro-states, the model offers a meso-level pathfinding approach, which is guided by cognitive flow theory and various concepts from learning sciences. It extends a basic, existing model by exposing…
The Concept of Energy in Psychological Theory. Cognitive Science Program, Technical Report No. 86-2.
ERIC Educational Resources Information Center
Posner, Michael I.; Rothbart, Mary Klevjord
This paper describes a basic framework for integration of computational and energetic concepts in psychological theory. The framework is adapted from a general effort to understand the neural systems underlying cognition. The element of the cognitive system that provides the best basis for attempting to relate energetic and computational ideas is…
The tractable cognition thesis.
Van Rooij, Iris
2008-09-01
The recognition that human minds/brains are finite systems with limited resources for computation has led some researchers to advance the Tractable Cognition thesis: Human cognitive capacities are constrained by computational tractability. This thesis, if true, serves cognitive psychology by constraining the space of computational-level theories of cognition. To utilize this constraint, a precise and workable definition of "computational tractability" is needed. Following computer science tradition, many cognitive scientists and psychologists define computational tractability as polynomial-time computability, leading to the P-Cognition thesis. This article explains how and why the P-Cognition thesis may be overly restrictive, risking the exclusion of veridical computational-level theories from scientific investigation. An argument is made to replace the P-Cognition thesis by the FPT-Cognition thesis as an alternative formalization of the Tractable Cognition thesis (here, FPT stands for fixed-parameter tractable). Possible objections to the Tractable Cognition thesis, and its proposed formalization, are discussed, and existing misconceptions are clarified. 2008 Cognitive Science Society, Inc.
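As a gloss on the distinction the abstract draws (standard complexity-theory notation, added here for context and not taken from the article), the two theses place different bounds on the worst-case running time $T(n)$ of a computational-level theory, where $n$ is the input size and $k$ is a problem parameter assumed to be small in practice:

\[
\textbf{P-Cognition:}\quad T(n) \le c_1\, n^{c_2}
\qquad\qquad
\textbf{FPT-Cognition:}\quad T(n) \le f(k)\cdot n^{c}
\]

Here $f$ may grow arbitrarily fast (for example, exponentially) in $k$, but it does not depend on $n$; an algorithm that is exponential only in the small parameter can therefore still count as fixed-parameter tractable.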
Research on application of intelligent computation based LUCC model in urbanization process
NASA Astrophysics Data System (ADS)
Chen, Zemin
2007-06-01
Global change study is an interdisciplinary and comprehensive research activity with international cooperation that arose in the 1980s and has one of the largest scopes. The interaction between land use and cover change, as a research field at the crossing of natural science and social science, has become one of the core subjects of global change study as well as its front edge and hot point. It is necessary to develop research on land use and cover change in the urbanization process and to build an analog model of urbanization in order to describe, simulate and analyze the dynamic behaviors of urban development change as well as to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategy. The effect of urbanization on land use and cover change is mainly embodied in the change of the quantity structure and space structure of urban space, and the LUCC model in the urbanization process has been an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the writer systematically analyzes the research on land use/cover change in the urbanization process with the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automaton model of complexity science and multi-agent theory; expands the Markov model, the traditional CA model and the Agent model; introduces complexity science theory and intelligent computation theory into the LUCC research model to build an intelligent computation-based LUCC model for analog research on land use and cover change in urbanization; and performs case research. The concrete contents are as follows: 1. Complexity of LUCC research in the urbanization process. The urbanization process is analyzed in combination with the contents of complexity science and the conception of complexity features to reveal the complexity of LUCC research in the urbanization process. The urban space system is a complex economic and cultural phenomenon as well as a social process; it is the comprehensive characterization of urban society, economy and culture, and a complex space system formed by society, economy and nature. It has dissipative structure characteristics such as openness, dynamics, self-organization and non-equilibrium. Traditional models cannot simulate these social, economic and natural driving forces of LUCC, including the main feedback relation from LUCC to the driving forces. 2. Establishment of an extended Markov model for LUCC analog research in the urbanization process. Firstly, the traditional LUCC research model is used to compute the change speed of regional land use by calculating the dynamic degree, exploitation degree and consumption degree of land use; the theory of fuzzy sets is used to rewrite the traditional Markov model, establish a structure transfer matrix of land use, forecast and analyze the dynamic change and development trend of land use, and present noticeable problems and corresponding measures in the urbanization process according to the research results. 3. Application of intelligent computation and complexity science research methods in the LUCC analog model in the urbanization process.
On the basis of detailed elaboration of the theory and the model of LUCC research in urbanization process, analyze the problems of existing model used in LUCC research (namely, difficult to resolve many complexity phenomena in complex urban space system), discuss possible structure realization forms of LUCC analog research in combination with the theories of intelligent computation and complexity science research. Perform application analysis on BP artificial neural network and genetic algorithms of intelligent computation and CA model and MAS technology of complexity science research, discuss their theoretical origins and their own characteristics in detail, elaborate the feasibility of them in LUCC analog research, and bring forward improvement methods and measures on existing problems of this kind of model. 4. Establishment of LUCC analog model in urbanization process based on theories of intelligent computation and complexity science. Based on the research on abovementioned BP artificial neural network, genetic algorithms, CA model and multi-agent technology, put forward improvement methods and application assumption towards their expansion on geography, build LUCC analog model in urbanization process based on CA model and Agent model, realize the combination of learning mechanism of BP artificial neural network and fuzzy logic reasoning, express the regulation with explicit formula, and amend the initial regulation through self study; optimize network structure of LUCC analog model and methods and procedures of model parameters with genetic algorithms. In this paper, I introduce research theory and methods of complexity science into LUCC analog research and presents LUCC analog model based upon CA model and MAS theory. Meanwhile, I carry out corresponding expansion on traditional Markov model and introduce the theory of fuzzy set into data screening and parameter amendment of improved model to improve the accuracy and feasibility of Markov model in the research on land use/cover change.
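A toy illustration of the Markov structure-transfer idea described in point 2 above follows. The land-use classes, transition matrix, and area shares are invented for demonstration; this is not the paper's calibrated model.

```python
import numpy as np

# Hypothetical land-use classes and a structure-transfer (transition) matrix:
# entry [i, j] is the fraction of class i that becomes class j per time step.
classes = ["cropland", "built-up", "forest"]
P = np.array([[0.85, 0.12, 0.03],
              [0.01, 0.98, 0.01],
              [0.05, 0.05, 0.90]])
assert np.allclose(P.sum(axis=1), 1.0)   # each row must sum to 1

# Current land-use structure as area shares (hypothetical).
x = np.array([0.50, 0.20, 0.30])

# Markov forecast: repeatedly apply the transition matrix to project the
# land-use structure over future periods.
for step in range(1, 4):
    x = x @ P
    print(step, dict(zip(classes, np.round(x, 3))))
```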
A Dynamic Intranet-Based Online-Portal Support for Computer Science Teaching
ERIC Educational Resources Information Center
Iyer, Viswanathan K.
2017-01-01
This paper addresses the issue of effective content-delivery of Computer Science subjects taking advantage of a university intranet. The proposal described herein for teaching a subject like Combinatorics and Graph Theory (CGT) is to supplement lectures with a moderated online forum against an associated intranet portal, which is referred to as a…
How robotics programs influence young women's career choices : a grounded theory model
NASA Astrophysics Data System (ADS)
Craig, Cecilia Dosh-Bluhm
The fields of engineering, computer science, and physics have a paucity of women despite decades of intervention by universities and organizations. Women's graduation rates in these fields continue to stagnate, posing a critical problem for society. This qualitative grounded theory (GT) study sought to understand how robotics programs influenced young women's career decisions and the program's effect on engineering, physics, and computer science career interests. To test this, a study was mounted to explore how the FIRST (For Inspiration and Recognition of Science and Technology) Robotics Competition (FRC) program influenced young women's college major and career choices. Career theories suggested that experiential programs coupled with supportive relationships strongly influence career decisions, especially for science, technology, engineering, and mathematics careers. The study explored how and when young women made career decisions and how the experiential program and its mentors and role models influenced career choice. Online focus groups and interviews (online and face-to-face) with 10 female FRC alumnae and GT processes (inductive analysis, open coding, categorizations using mind maps and content clouds) were used to generate a general systems theory style model of the career decision process for these young women. The study identified gender stereotypes and other career obstacles for women. The study's conclusions include recommendations to foster connections to real-world challenges, to develop training programs for mentors, and to nurture social cohesion, a mostly untapped area. Implementing these recommendations could help grow a critical mass of women in engineering, physics, and computer science careers, a social change worth pursuing.
Computational Science: A Research Methodology for the 21st Century
NASA Astrophysics Data System (ADS)
Orbach, Raymond L.
2004-03-01
Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100-1000 opens up new vistas for scientific discovery: long term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, more fuel-efficient, cleaner cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.
Research 1970/1971: Annual Progress Report.
ERIC Educational Resources Information Center
Georgia Inst. of Tech., Atlanta. Science Information Research Center.
The report presents a summary of science information research activities of the School of Information and Computer Science, Georgia Institute of Technology. Included are project reports on interrelated studies in science information, information processing and systems design, automata and systems theories, and semiotics and linguistics. Also…
Theoretical computer science and the natural sciences
NASA Astrophysics Data System (ADS)
Marchal, Bruno
2005-12-01
I present some fundamental theorems in computer science and illustrate their relevance in Biology and Physics. I do not assume prerequisites in mathematics or computer science beyond the set N of natural numbers, functions from N to N, the use of some notational conveniences to describe functions, and at some point, a minimal amount of linear algebra and logic. I start with Cantor's transcendental proof by diagonalization of the non-enumerability of the collection of functions from natural numbers to the natural numbers. I explain why this proof is not entirely convincing and show how, by restricting the notion of function in terms of discrete well defined processes, we are led to the non-algorithmic enumerability of the computable functions, but also-through Church's thesis-to the algorithmic enumerability of partial computable functions. Such a notion of function constitutes, with respect to our purpose, a crucial generalization of that concept. This will make it easy to justify deep and astonishing (counter-intuitive) incompleteness results about computers and similar machines. The modified Cantor diagonalization will provide a theory of concrete self-reference and I illustrate it by pointing toward an elementary theory of self-reproduction-in the Amoeba's way-and cellular self-regeneration-in the flatworm Planaria's way. To make it easier, I introduce a very simple and powerful formal system known as the Schoenfinkel-Curry combinators. I will use the combinators to illustrate in a more concrete way the notion introduced above. The combinators, thanks to their low-level, fine-grained design, will also make it possible to give a rough but hopefully illuminating description of the main lessons gained by the careful observation of nature, and to describe some new relations which should exist between computer science, the science of life and the science of inert matter, once some philosophical, if not theological, hypotheses are made in the cognitive sciences. In the last section, I come back to self-reference and I give an exposition of its modal logics. This is used to show that theoretical computer science makes those "philosophical hypotheses" in theoretical cognitive science experimentally and mathematically testable.
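For concreteness, the Schoenfinkel-Curry combinators mentioned above can be written as curried functions. The small sketch below, in Python purely for illustration, is not from the paper.

```python
# Schoenfinkel-Curry combinators as curried Python functions.
S = lambda x: lambda y: lambda z: x(z)(y(z))   # S x y z = x z (y z)
K = lambda x: lambda y: x                      # K x y = x

# The identity combinator is definable as I = S K K.
I = S(K)(K)
assert I(42) == 42

# Self-application is also expressible: M = S I I satisfies M f = f f,
# the basic ingredient of fixed-point and self-reproduction constructions.
M = S(I)(I)
assert M(I)(7) == 7    # M I = I I = I, so (M I)(7) = 7
```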
ERIC Educational Resources Information Center
Shor, Mikhael
2003-01-01
States making game theory relevant and accessible to students is challenging. Describes the primary goal of GameTheory.net is to provide interactive teaching tools. Indicates the site strives to unite educators from economics, political and computer science, and ecology by providing a repository of lecture notes and tests for courses using…
Challenges in Computational Social Modeling and Simulation for National Security Decision Making
2011-06-01
This study is grounded within a system-activity theory, a logico-philosophical model of interdisciplinary research [13, 14], the concepts of social...often a difficult challenge. Ironically, social science research methods, such as ethnography, may be tremendously helpful in designing these...social sciences. Moreover, CSS projects draw on knowledge and methods from other fields of study, including graph theory, information visualization
Collective Computation of Neural Network
1990-03-15
Sciences, Beijing. ABSTRACT: Computational neuroscience is a new branch of neuroscience originating from current research on the theory of computer...scientists working in artificial intelligence engineering and neuroscience. The paper introduces the collective computational properties of model neural...vision research. On this basis, the authors analyzed the significance of the Hopfield model. Key phrases: Computational Neuroscience, Neural Network, Model
ERIC Educational Resources Information Center
Rias, Riaza Mohd; Zaman, Halimah Badioze
2011-01-01
Instruction in higher education may be primarily concerned, in most cases, with the content of academic lessons and not very much with instructional delivery. However, the effective application of learning theories and technology in higher education has an impact on student performance. With the rapid progress in the computer and…
Informatics with Systems Science and Cybernetics--Concepts and Definitions.
ERIC Educational Resources Information Center
Samuelson, Kjell
This dictionary defines information science, computer science, systems theory, and cybernetic terms in English and provides the Swedish translation of each term. An index of Swedish terms refers the user to the page where the English equivalent and definition appear. Most of the 38 references listed are in English. (RAA)
NASA Astrophysics Data System (ADS)
Govoni, Marco; Galli, Giulia
Green's function based many-body perturbation theory (MBPT) methods are well established approaches to compute quasiparticle energies and electronic lifetimes. However, their application to large systems - for instance to heterogeneous systems, nanostructured, disordered, and defective materials - has been hindered by high computational costs. We will discuss recent MBPT methodological developments leading to an efficient formulation of electron-electron and electron-phonon interactions, and that can be applied to systems with thousands of electrons. Results using a formulation that does not require the explicit calculation of virtual states, nor the storage and inversion of large dielectric matrices will be presented. We will discuss data collections obtained using the WEST code, the advantages of the algorithms used in WEST over standard techniques, and the parallel performance. Work done in collaboration with I. Hamada, R. McAvoy, P. Scherpelz, and H. Zheng. This work was supported by MICCoM, as part of the Computational Materials Sciences Program funded by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division and by ANL.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roy, Santanu; Dang, Liem X.
In this paper, we present the first computer simulation of methanol exchange dynamics between the first and second solvation shells around different cations and anions. After water, methanol is the most frequently used solvent for ions. Methanol has different structural and dynamical properties than water, so its ion solvation process is different. To this end, we performed molecular dynamics simulations using polarizable potential models to describe methanol-methanol and ion-methanol interactions. In particular, we computed methanol exchange rates by employing transition state theory, the Impey-Madden-McDonald method, the reactive flux approach, and Grote-Hynes theory. We observed that methanol exchange occurs at a nanosecond time scale for Na+ and at a picosecond time scale for other ions. We also observed a trend in which, for like charges, the exchange rate is slower for smaller ions because they are more strongly bound to methanol. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
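A simplified sketch of how a survival-function (Impey-Madden-McDonald-style) residence time might be estimated from simulation frames is shown below. It is illustrative only: it uses a single time origin, sets the tolerance t* to zero, and runs on invented occupancy data, so it is not the authors' implementation.

```python
import numpy as np

def residence_time(in_shell, dt):
    """Simplified survival-function estimate of a solvent residence time.

    in_shell : boolean array (n_frames, n_molecules); True when a molecule is
               inside the first solvation shell in that frame.
    dt       : time between frames.
    """
    n_frames, _ = in_shell.shape
    survival = np.zeros(n_frames)
    for lag in range(n_frames):
        # A molecule "survives" to lag t only if it stayed in the shell
        # continuously from frame 0 through frame t (no excursion tolerance).
        stayed = in_shell[: lag + 1].all(axis=0)
        survival[lag] = stayed.sum()
    survival /= max(survival[0], 1)
    # The time integral of the normalized survival function approximates tau.
    return np.trapz(survival, dx=dt)

# Hypothetical 5-frame, 3-molecule occupancy record sampled every 0.1 ps.
occupancy = np.array([[1, 1, 1],
                      [1, 1, 0],
                      [1, 0, 0],
                      [1, 0, 0],
                      [0, 0, 0]], dtype=bool)
print(residence_time(occupancy, dt=0.1), "ps")
```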
ERIC Educational Resources Information Center
Longenecker, Herbert E., Jr.; Babb, Jeffry; Waguespack, Leslie J.; Janicki, Thomas N.; Feinstein, David
2015-01-01
The evolution of computing education spans a spectrum from "computer science" ("CS") grounded in the theory of computing, to "information systems" ("IS"), grounded in the organizational application of data processing. This paper reports on a project focusing on a particular slice of that spectrum commonly…
Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo
2018-01-01
It is likely that computer simulations will assume a greater role in the near future to investigate and understand reality (Rand & Rust, 2011). In particular, agent-based models (ABMs) represent a method of investigation of social phenomena that blends the knowledge of the social sciences with the advantages of virtual simulations. Within this context, the development of algorithms able to recreate the reasoning engine of autonomous virtual agents represents one of the most fragile aspects, and it is indeed crucial to base such models on well-supported psychological theoretical frameworks. For this reason, the present work discusses the application of the theory of planned behavior (TPB; Ajzen, 1991) in the context of agent-based modeling: it is argued that this framework might be more helpful than others for developing a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to the application of the model proposed by the TPB inside computer simulations and suggests potential solutions, with the hope of helping to shorten the distance between the fields of psychology and computer science.
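One way such a TPB reasoning engine might look inside an agent-based model is sketched below. The weights, threshold, and gating rule are hypothetical choices made for illustration, not a specification taken from the paper.

```python
from dataclasses import dataclass
import random

@dataclass
class TPBAgent:
    """Toy agent following the theory of planned behaviour: intention is a
    weighted sum of attitude, subjective norm, and perceived behavioural
    control, and the behaviour is enacted when intention (gated by control)
    exceeds a threshold. All numeric choices here are hypothetical."""
    attitude: float            # evaluation of the behaviour, in [0, 1]
    subjective_norm: float     # perceived social pressure, in [0, 1]
    perceived_control: float   # perceived ease of performing it, in [0, 1]
    w_att: float = 0.4
    w_norm: float = 0.3
    w_pbc: float = 0.3
    threshold: float = 0.5

    def intention(self) -> float:
        return (self.w_att * self.attitude
                + self.w_norm * self.subjective_norm
                + self.w_pbc * self.perceived_control)

    def acts(self) -> bool:
        # Perceived behavioural control also gates actual performance.
        return self.intention() * self.perceived_control > self.threshold

# A small population of agents with randomly drawn beliefs.
random.seed(1)
agents = [TPBAgent(random.random(), random.random(), random.random()) for _ in range(100)]
print(sum(a.acts() for a in agents), "of 100 agents perform the behaviour")
```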
ERIC Educational Resources Information Center
Navarro, Aaron B.
1981-01-01
Presents a program in Level II BASIC for a TRS-80 computer that simulates a Turing machine and discusses the nature of the device. The program is run interactively and is designed to be used as an educational tool by computer science or mathematics students studying computational or automata theory. (MP)
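The original TRS-80 BASIC listing is not reproduced in this record; the following Python sketch shows the same idea, a table-driven Turing machine simulator, together with a toy unary-successor machine invented here for illustration.

```python
# Illustrative Python analogue of the BASIC program described above (the
# original TRS-80 listing is not reproduced here); a table-driven Turing
# machine simulator with a toy unary-successor machine.

def run_tm(transitions, tape, state="q0", accept="halt", blank="_", max_steps=1000):
    """transitions maps (state, symbol) -> (new_state, write_symbol, move), move in {'L', 'R'}."""
    cells = dict(enumerate(tape))          # sparse tape, indexed by integer position
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return state, "".join(cells[i] for i in sorted(cells))

# Example machine: append a 1 to a unary numeral (the successor function).
succ = {
    ("q0", "1"): ("q0", "1", "R"),    # scan right across the 1s
    ("q0", "_"): ("halt", "1", "R"),  # write a 1 on the first blank, then halt
}
print(run_tm(succ, "111"))            # -> ('halt', '1111')
```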
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.
2005-12-27
Graph theory is a branch of discrete combinatorial mathematics that studies the properties of graphs. The theory was pioneered by the Swiss mathematician Leonhard Euler in the 18th century, commenced its formal development during the second half of the 19th century, and has witnessed substantial growth during the last seventy years, with applications in areas as diverse as engineering, computer science, physics, sociology, chemistry and biology. Graph theory has also had a strong impact in computational linguistics by providing the foundations for the theory of feature structures that has emerged as one of the most widely used frameworks for the representation of grammar formalisms.
Leveraging e-Science infrastructure for electrochemical research.
Peachey, Tom; Mashkina, Elena; Lee, Chong-Yong; Enticott, Colin; Abramson, David; Bond, Alan M; Elton, Darrell; Gavaghan, David J; Stevenson, Gareth P; Kennedy, Gareth F
2011-08-28
As in many scientific disciplines, modern chemistry involves a mix of experimentation and computer-supported theory. Historically, these skills have been provided by different groups, and range from traditional 'wet' laboratory science to advanced numerical simulation. Increasingly, progress is made by global collaborations, in which new theory may be developed in one part of the world and applied and tested in the laboratory elsewhere. e-Science, or cyber-infrastructure, underpins such collaborations by providing a unified platform for accessing scientific instruments, computers and data archives, and collaboration tools. In this paper we discuss the application of advanced e-Science software tools to electrochemistry research performed in three different laboratories--two at Monash University in Australia and one at the University of Oxford in the UK. We show that software tools that were originally developed for a range of application domains can be applied to electrochemical problems, in particular Fourier voltammetry. Moreover, we show that, by replacing ad-hoc manual processes with e-Science tools, we obtain more accurate solutions automatically.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Tsun-Mei; Dang, Liem X.
Using our polarizable force-field models and employing classical rate theories of chemical reactions, we examine the ethylene carbonate (EC) exchange process between the first and second solvation shells around Li+ and the dissociation kinetics of the ion pairs Li+-[BF4] and Li+-[PF6] in this solvent. We calculate the exchange rates using transition state theory and correct them with transmission coefficients computed by the reactive flux; Impey, Madden, and McDonald approaches; and Grote-Hynes theory. We found the residence times of EC around Li+ ions varied from 70 to 450 ps, depending on the correction method used. We found the relaxation times changed significantly from Li+-[BF4] to Li+-[PF6] ion pairs in EC. Our results also show that, in addition to affecting the free energy of dissociation in EC, the anion type also significantly influences the dissociation kinetics of ion pairing. This work was supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
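The residence times quoted above come from tracking how long a solvent molecule stays inside the first shell. The sketch below estimates a mean residence time from a synthetic shell-occupancy time series; the trajectory, frame spacing, and hop probabilities are invented, and the Impey-Madden-McDonald procedure additionally tolerates brief excursions (a t* parameter), which this simplified version omits.

```python
import numpy as np

# Synthetic illustration of a residence-time estimate from shell occupancy.
# h[t] = 1 when the solvent molecule is inside the first shell at frame t.
rng = np.random.default_rng(0)
dt, n = 0.1, 20000                     # ps per frame (assumed), number of frames

h = np.empty(n)
h[0] = 1.0
for t in range(1, n):                  # toy two-state (in/out of shell) trajectory
    stay = 0.995 if h[t - 1] else 0.98
    h[t] = h[t - 1] if rng.random() < stay else 1.0 - h[t - 1]

# Mean length of uninterrupted in-shell intervals = mean residence time.
in_shell = h.astype(bool)
changes = np.diff(in_shell.astype(int))
starts = np.flatnonzero(changes == 1) + 1
ends = np.flatnonzero(changes == -1) + 1
if in_shell[0]:
    starts = np.r_[0, starts]
if in_shell[-1]:
    ends = np.r_[ends, n]
tau = np.mean(ends - starts) * dt
print(f"mean residence time ~ {tau:.1f} ps")   # ~20 ps for this toy trajectory
```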
Intention, emotion, and action: a neural theory based on semantic pointers.
Schröder, Tobias; Stewart, Terrence C; Thagard, Paul
2014-06-01
We propose a unified theory of intentions as neural processes that integrate representations of states of affairs, actions, and emotional evaluation. We show how this theory provides answers to philosophical questions about the concept of intention, psychological questions about human behavior, computational questions about the relations between belief and action, and neuroscientific questions about how the brain produces actions. Our theory of intention ties together biologically plausible mechanisms for belief, planning, and motor control. The computational feasibility of these mechanisms is shown by a model that simulates psychologically important cases of intention. © 2013 Cognitive Science Society, Inc.
NASA Astrophysics Data System (ADS)
2011-07-01
WE RECOMMEND: Fun Fly Stick Science Kit (fun fly stick introduces electrostatics to youngsters); Special Relativity (text makes a useful addition to the study of relativity as an undergraduate); LabVIEW 2009 Education Edition (LabVIEW sets the industry standard for gathering and analysing data, signal processing, instrumentation design and control, and automation and robotics); Edison and Ford Winter Estates (Thomas Edison's home is open to the public); The Computer History Museum (take a walk through technology history at this computer museum). WORTH A LOOK: Fast Car Physics (book races through physics); Beautiful Invisible (the main subject of this book is theoretical physics); Quantum Theory Cannot Hurt You (a guide to physics on the large and small scale); Chaos: The Science of Predictable Random Motion (book explores the mathematics behind chaotic behaviour); Seven Wonders of the Universe (a textual trip through the wonderful universe). HANDLE WITH CARE: Marie Curie: A Biography (book fails to capture Curie's science). WEB WATCH: web clips to liven up science lessons.
Crutchfield, James P; Ditto, William L; Sinha, Sudeshna
2010-09-01
How dynamical systems store and process information is a fundamental question that touches a remarkably wide set of contemporary issues: from the breakdown of Moore's scaling laws--that predicted the inexorable improvement in digital circuitry--to basic philosophical problems of pattern in the natural world. It is a question that also returns one to the earliest days of the foundations of dynamical systems theory, probability theory, mathematical logic, communication theory, and theoretical computer science. We introduce the broad and rather eclectic set of articles in this Focus Issue that highlights a range of current challenges in computing and dynamical systems.
Using a Card Trick to Illustrate Fixed Points and Stability
ERIC Educational Resources Information Center
Champanerkar, Jyoti; Jani, Mahendra
2015-01-01
Mathematical ideas from number theory, group theory, dynamical systems, and computer science have often been used to explain card tricks. Conversely, playing cards have been often used to illustrate the mathematical concepts of probability distributions and group theory. In this paper, we describe how the 21-card trick may be used to illustrate…
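A quick way to make the fixed-point claim concrete for the 21-card trick is to model one round as a map on card positions and iterate it. The map below assumes the common dealing convention (three columns of seven, with the indicated column gathered in the middle); other conventions change the constants but not the fixed-point behaviour.

```python
import math

# The 21-card trick as a map on positions 1..21: deal into three columns of
# seven, gather with the indicated column in the middle, and the card at
# position p moves to f(p) = 7 + ceil(p/3).
def f(p):
    return 7 + math.ceil(p / 3)

for p in range(1, 22):
    assert f(f(f(p))) == 11            # three rounds send every position to 11
print("f(11) =", f(11), "- position 11 is the unique fixed point")
```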
Identification and Addressing Reduction-Related Misconceptions
ERIC Educational Resources Information Center
Gal-Ezer, Judith; Trakhtenbrot, Mark
2016-01-01
Reduction is one of the key techniques used for problem-solving in computer science. In particular, in the theory of computation and complexity (TCC), mapping and polynomial reductions are used for analysis of decidability and computational complexity of problems, including the core concept of NP-completeness. Reduction is a highly abstract…
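As a small illustration of the mapping reductions mentioned above (not an example taken from the article), INDEPENDENT-SET reduces to CLIQUE by taking the complement graph; the sketch below spells out the transformation and checks one tiny instance with a brute-force CLIQUE decider.

```python
from itertools import combinations

# Toy mapping (polynomial-time) reduction, for illustration only:
# INDEPENDENT-SET reduces to CLIQUE by complementing the graph, since a vertex
# set is independent in G exactly when it is a clique in the complement of G.

def complement(n, edges):
    """Edges of the complement of an n-vertex graph (vertices 0..n-1)."""
    return set(combinations(range(n), 2)) - {tuple(sorted(e)) for e in edges}

def has_clique(n, edges, k):
    """Brute-force CLIQUE decision (exponential; fine for tiny examples)."""
    es = {tuple(sorted(e)) for e in edges}
    return any(all(tuple(sorted(p)) in es for p in combinations(c, 2))
               for c in combinations(range(n), k))

# (G, k) is the INDEPENDENT-SET instance; (complement of G, k) is the CLIQUE instance.
n, edges, k = 4, [(0, 1), (1, 2), (2, 3)], 2
print(has_clique(n, complement(n, edges), k))   # True, e.g. {0, 2} is independent in G
```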
Assessment of Situated Learning Using Computer Environments.
ERIC Educational Resources Information Center
Young, Michael
1995-01-01
Suggests that, based on a theory of situated learning, assessment must emphasize process as much as product. Several assessment examples are given, including a computer-based planning assistant for a mathematics and science video, suggestions for computer-based portfolio assessment, and speculations about embedded assessment of virtual situations.…
Introduction to the theory of machines and languages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weidhaas, P. P.
1976-04-01
This text is intended to be an elementary "guided tour" through some basic concepts of modern computer science. Various models of computing machines and formal languages are studied in detail. Discussions center around questions such as, "What is the scope of problems that can or cannot be solved by computers?"
Metacognitive Support Accelerates Computer Assisted Learning for Novice Programmers
ERIC Educational Resources Information Center
Rum, Siti Nurulain Mohd; Ismail, Maizatul Akmar
2017-01-01
Computer programming is a part of the curriculum in computer science education, and high drop rates for this subject are a universal problem. Development of metacognitive skills, including the conceptual framework provided by socio-cognitive theories that afford reflective thinking, such as actively monitoring, evaluating, and modifying one's…
Montague, P. Read; Dolan, Raymond J.; Friston, Karl J.; Dayan, Peter
2013-01-01
Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects. PMID:22177032
Retrospective Evaluation of a Collaborative mLearning Science Module: The Users' Perspective
ERIC Educational Resources Information Center
DeWitt, Dorothy; Siraj, Saedah; Alias, Norlidah; Leng, Chin Hai
2013-01-01
This study focuses on the retrospective evaluation of collaborative mLearning (CmL) Science module for teaching secondary school science which was designed based on social constructivist learning theories and Merrill's First Principle of Instruction. This study is part of a developmental research in which computer-mediated communication (CMC)…
Kawano, Tomonori; Bouteau, François; Mancuso, Stefano
2012-11-01
Automata theory is the mathematical study of abstract machines, commonly studied in theoretical computer science and in highly interdisciplinary fields that combine the natural sciences with theoretical computer science. In the present review article, as the chemical and biological basis for natural computing or informatics, some plants, plant cells or plant-derived molecules involved in signaling are listed and classified as natural sequential machines (namely, Mealy machines or Moore machines) or finite state automata. By defining the actions (states and transition functions) of these natural automata, the similarity between computational data processing and plant decision-making processes became obvious. Finally, their putative roles as parts for plant-based computing or robotic systems are discussed.
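To make the sequential-machine formalism concrete, here is a minimal Moore machine (output determined by the current state alone) in Python. The states, inputs, and the stomatal guard-cell interpretation are invented for illustration and are not taken from the review.

```python
# A minimal Moore machine: output depends only on the current state.

class MooreMachine:
    def __init__(self, transitions, outputs, start):
        self.transitions = transitions   # (state, input symbol) -> next state
        self.outputs = outputs           # state -> output symbol
        self.state = start

    def step(self, symbol):
        self.state = self.transitions[(self.state, symbol)]
        return self.outputs[self.state]

# Toy "guard cell" automaton: light opens the stoma, drought stress closes it.
m = MooreMachine(
    transitions={("closed", "light"): "open",  ("closed", "drought"): "closed",
                 ("open",   "light"): "open",  ("open",   "drought"): "closed"},
    outputs={"closed": "stoma closed", "open": "stoma open"},
    start="closed",
)
print([m.step(s) for s in ["light", "light", "drought"]])
# -> ['stoma open', 'stoma open', 'stoma closed']
```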
How the Theory of Computing Can Help in Space Exploration
NASA Technical Reports Server (NTRS)
Kreinovich, Vladik; Longpre, Luc
1997-01-01
The opening of the NASA Pan American Center for Environmental and Earth Sciences (PACES) at the University of Texas at El Paso made it possible to organize the student Center for Theoretical Research and its Applications in Computer Science (TRACS). In this abstract, we briefly describe the main NASA-related research directions of the TRACS center, and give an overview of the preliminary results of student research.
Machine learning: Trends, perspectives, and prospects.
Jordan, M I; Mitchell, T M
2015-07-17
Machine learning addresses the question of how to build computers that improve automatically through experience. It is one of today's most rapidly growing technical fields, lying at the intersection of computer science and statistics, and at the core of artificial intelligence and data science. Recent progress in machine learning has been driven both by the development of new learning algorithms and theory and by the ongoing explosion in the availability of online data and low-cost computation. The adoption of data-intensive machine-learning methods can be found throughout science, technology and commerce, leading to more evidence-based decision-making across many walks of life, including health care, manufacturing, education, financial modeling, policing, and marketing. Copyright © 2015, American Association for the Advancement of Science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dang, Liem X.; Schenter, Gregory K.
To enhance our understanding of the solvent exchange mechanism in liquid methanol, we report a systematic study of this process using molecular dynamics simulations. We use transition state theory, the Impey-Madden-McDonald method, the reactive flux method, and Grote-Hynes theory to compute the rate constants for this process. Solvent coupling was found to dominate, resulting in a significantly reduced transmission coefficient. We predict a positive activation volume for the methanol exchange process. The essential features of the dynamics of the system as well as the pressure dependence are recovered from a Generalized Langevin Equation description of the dynamics. We find that the dynamics and response to anharmonicity can be decomposed into two time regimes, one corresponding to a short-time response (< 0.1 ps) and one to a long-time response (> 5 ps). An effective characterization of the process results from launching dynamics from the planar hypersurface corresponding to Grote-Hynes theory. This results in improved numerical convergence of correlation functions. This work was supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
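The transmission-coefficient machinery referred to here can be sketched generically: launch trajectories from the dividing surface of a barrier with Langevin (solvent-like) friction and evaluate the reactive-flux estimator. The potential, friction, masses, and units below are toy reduced-unit choices for illustration, not the methanol system, and the integrator is a plain Euler-Maruyama scheme.

```python
import numpy as np

# Generic reactive-flux estimate of a transmission coefficient:
# kappa = <v(0) * theta(q(t))> / <v(0) * theta(v(0))>, trajectories launched
# from the barrier top of a toy 1D double well with Langevin friction.
rng = np.random.default_rng(1)
kT, m_red, gamma, h = 1.0, 1.0, 5.0, 5.0       # friction gamma mimics solvent coupling
dt, n_steps, n_traj = 1e-3, 4000, 10000

dV = lambda q: 4.0 * h * q * (q**2 - 1.0)      # V(q) = h (q^2 - 1)^2, barrier at q = 0

q = np.zeros(n_traj)                           # all trajectories start on the dividing surface
v = rng.normal(0.0, np.sqrt(kT / m_red), n_traj)
v0 = v.copy()
for _ in range(n_steps):                       # simple Euler-Maruyama Langevin integrator
    v += (-dV(q) / m_red - gamma * v) * dt \
         + np.sqrt(2.0 * gamma * kT / m_red * dt) * rng.normal(size=n_traj)
    q += v * dt

kappa = np.mean(v0 * (q > 0.0)) / np.mean(v0 * (v0 > 0.0))
print(f"plateau transmission coefficient kappa ~ {kappa:.2f}")   # below 1 because of recrossings
```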
Computational Science in Armenia (Invited Talk)
NASA Astrophysics Data System (ADS)
Marandjian, H.; Shoukourian, Yu.
This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general form recursive equations, the methods of coding theory, pattern recognition and image processing) constitute the theoretical basis for developing problem-solving-oriented environments. As examples can be mentioned: a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, and a grid-aware web interface with advanced service trading for linear algebra calculations. In the direction of solving scientific problems that require high-performance computing resources, examples of completed projects include the fields of physics (parallel computing of complex quantum systems), astrophysics (Armenian virtual laboratory), biology (molecular dynamics study of the human red blood cell membrane), and meteorology (implementing and evaluating the Weather Research and Forecast Model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure, uniting computing clusters of scientific and educational institutions of the country, and provides the scientific community with access to local and international computational resources, which is strong support for computational science in Armenia.
Critical Thinking Traits of Top-Tier Experts and Implications for Computer Science Education
2007-08-01
Fragmentary excerpts from the report note that Papert [1999] drew on this work in the field of cognitive theory while developing the Logo programming language, that research on computer expert systems influenced the development of current theories dealing with cognitive abilities, and that he also builds on the cognitive development work of Piaget and is not ready to abandon the generalist approach.
A Mathematical Theory of System Information Flow
2016-06-27
AFRL-AFOSR-VA-TR-2016-0232. Final report by Michael Mislove, Administrators of the Tulane Educational Fund, covering 27 March 2013 - 31 March 2016. The project studied information flow in systems using techniques from information theory, domain theory and other areas of mathematics and computer science; over time, the focus shifted…
NASA Astrophysics Data System (ADS)
Gobithaasan, R. U.; Miura, Kenjiro T.; Hassan, Mohamad Nor
2014-07-01
Computer Aided Geometric Design (CAGD), which surpasses the underlying theories of Computer Aided Design (CAD) and Computer Graphics (CG), has been taught in a number of Malaysian universities under the umbrella of the Mathematical Sciences faculty/department. On the other hand, CAD/CG is taught under either the Engineering or the Computer Science faculty. Even though CAGD researchers/educators/students (denoted as contributors) have been enriching this field of study by means of article/journal publication, many fail to convert the idea into constructive innovation due to the gap that occurs between CAGD contributors and practitioners (engineers/product designers/architects/artists). This paper addresses this issue by advocating a number of technologies that can be used to transform CAGD contributors into innovators where immediate impact in terms of practical application can be experienced by CAD/CG practitioners. The underlying principle of solving this issue is twofold. The first is to expose CAGD contributors to ways of turning mathematical ideas into plug-ins, and the second is to impart relevant CAGD theories to CAD/CG practitioners. Both cases are discussed in detail and the final section shows examples to illustrate the importance of turning mathematical knowledge into innovations.
2002-01-01
behaviors are influenced by social interactions, and to how modern IT systems should be designed to support these group technical activities. The...engineering disciplines to behavior, decision, psychology, organization, and the social sciences. "Conflict management activity in collaborative...Researchers instead began to search for an entirely new paradigm, starting from a theory in social science, to construct a conceptual framework to describe
Perspective: Ring-polymer instanton theory
NASA Astrophysics Data System (ADS)
Richardson, Jeremy O.
2018-05-01
Since the earliest explorations of quantum mechanics, it has been a topic of great interest that quantum tunneling allows particles to penetrate classically insurmountable barriers. Instanton theory provides a simple description of these processes in terms of dominant tunneling pathways. Using a ring-polymer discretization, an efficient computational method is obtained for applying this theory to compute reaction rates and tunneling splittings in molecular systems. Unlike other quantum-dynamics approaches, the method scales well with the number of degrees of freedom, and for many polyatomic systems, the method may provide the most accurate predictions which can be practically computed. Instanton theory thus has the capability to produce useful data for many fields of low-temperature chemistry including spectroscopy, atmospheric and astrochemistry, as well as surface science. There is however still room for improvement in the efficiency of the numerical algorithms, and new theories are under development for describing tunneling in nonadiabatic transitions.
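For reference, a standard form of the ring-polymer discretization the abstract alludes to, written here for a single particle in one dimension (notation and prefactor conventions may differ from the paper):

```latex
U_N(x_1,\dots,x_N) \;=\; \sum_{i=1}^{N}\left[\frac{m}{2\beta_N^{2}\hbar^{2}}\,(x_{i}-x_{i+1})^{2} + V(x_i)\right],
\qquad \beta_N=\frac{\beta}{N}, \qquad x_{N+1}\equiv x_1 .
```

The instanton is then located as a first-order saddle point of U_N, and the rate or tunneling splitting follows from the value of U_N at the saddle together with the eigenvalues of its Hessian (a steepest-descent evaluation of the path integral).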
Closing the race and gender gaps in computer science education
NASA Astrophysics Data System (ADS)
Robinson, John Henry
Life in a technological society brings new paradigms and pressures to bear on education. These pressures are magnified for underrepresented students and must be addressed if they are to play a vital part in society. Educational pipelines need to be established to provide at-risk students with the means and opportunity to succeed in science, technology, engineering, and mathematics (STEM) majors. STEM educational pipelines are programs consisting of components that seek to facilitate students' completion of a college degree by providing access to higher education, intervention, mentoring, support infrastructure, and programs that encourage academic success. Successes in the STEM professions mean that more educators, scientists, engineers, and researchers will be available to add diversity to the professions and to provide role models for future generations. The issues that the educational pipelines must address are improving at-risk groups' perceptions and awareness of the math, science, and engineering professions. Additionally, the educational pipelines must provide intervention in math preparation, overcome gender and race socialization, and provide mentors and counseling to help students achieve better self-perceptions and provide positive role models. This study was designed to explore the underrepresentation of minorities and women in the computer science major at Rowan University through a multilayered action research methodology. The purpose of this research study was to define and understand the needs of underrepresented students in computer science, to examine current policies and enrollment data for Rowan University, to develop a historical profile of the Computer Science program from the standpoint of ethnicity and gender enrollment to ascertain trends in students' choice of computer science as a major, and to determine whether raising awareness about computer science for incoming freshmen and providing an alternate route into the computer science major will entice more women and minorities to pursue a degree in computer science at Rowan University. Finally, this study examined my espoused leadership theories and my leadership theories in use through reflective practices as I progressed through the cycles of this project. The outcomes of this study indicated a large downward trend in women's enrollment in computer science and a relatively flat trend in minority enrollment. The enrollment data at Rowan University was found to follow a nationwide trend for underrepresented students' enrollment in STEM majors. The study also indicated that students' mental models are based upon their race and gender socialization and their understanding of the world and society. The mental models were shown to play a large role in the students' choice of major. Finally, a computer science pipeline was designed and piloted as part of this study in an attempt to entice more students into the major and facilitate their success. Additionally, the mental models of the participants were challenged through interactions to make them aware of what possibilities are available with a degree in computer science. The entire study was wrapped in my leadership, which was practiced and studied over the course of this work.
Perceptions of teaching and learning automata theory in a college-level computer science course
NASA Astrophysics Data System (ADS)
Weidmann, Phoebe Kay
This dissertation identifies and describes student and instructor perceptions that contribute to effective teaching and learning of Automata Theory in a competitive college-level Computer Science program. Effective teaching is the ability to create an appropriate learning environment in order to provide effective learning. We define effective learning as the ability of a student to meet instructor set learning objectives, demonstrating this by passing the course, while reporting a good learning experience. We conducted our investigation through a detailed qualitative case study of two sections (118 students) of Automata Theory (CS 341) at The University of Texas at Austin taught by Dr. Lily Quilt. Because Automata Theory has a fixed curriculum in the sense that many curricula and textbooks agree on what Automata Theory contains, differences being depth and amount of material to cover in a single course, a case study would allow for generalizable findings. Automata Theory is especially problematic in a Computer Science curriculum since students are not experienced in abstract thinking before taking this course, fail to understand the relevance of the theory, and prefer classes with more concrete activities such as programming. This creates a special challenge for any instructor of Automata Theory as motivation becomes critical for student learning. Through the use of student surveys, instructor interviews, classroom observation, material and course grade analysis we sought to understand what students perceived, what instructors expected of students, and how those perceptions played out in the classroom in terms of structure and instruction. Our goal was to create suggestions that would lead to a better designed course and thus a higher student success rate in Automata Theory. We created a unique theoretical basis, pedagogical positivism, on which to study college-level courses. Pedagogical positivism states that through examining instructor and student perceptions of teaching and learning, improvements to a course are possible. These improvements can eventually develop a "best practice" instructional environment. This view is not possible under a strictly constructivist learning theory as there is no way to teach a group of individuals in a "best" way. Using this theoretical basis, we examined the gathered data from CS 341. (Abstract shortened by UMI.)
ERIC Educational Resources Information Center
Velez-Rubio, Miguel
2013-01-01
Teaching computer programming to freshmen students in Computer Sciences and other Information Technology areas has been identified as a complex activity. Different approaches have been studied looking for the best one that could help to improve this teaching process. A proposed approach was implemented which is based in the language immersion…
Library Theory and Research Section. Education and Research Division. Papers.
ERIC Educational Resources Information Center
International Federation of Library Associations, The Hague (Netherlands).
Papers on library/information science theory and research, which were presented at the 1983 International Federation of Library Associations (IFLA) conference, include: (1) "The Role of the Library in Computer-Aided Information and Documentation Systems," in which Wolf D. Rauch (West Germany) asserts that libraries must adapt to the…
Semantics vs. World Knowledge in Prefrontal Cortex
ERIC Educational Resources Information Center
Pylkkanen, Liina; Oliveri, Bridget; Smart, Andrew J.
2009-01-01
Humans have knowledge about the properties of their native language at various levels of representation; sound, structure, and meaning computation constitute the core components of any linguistic theory. Although the brain sciences have engaged with representational theories of sound and syntactic structure, the study of the neural bases of…
ERIC Educational Resources Information Center
Bergerson, Peter J., Ed.
The 16 chapters of this book offer innovative instructional techniques used to train public managers. It presents public management concepts along with such subtopics as organizational theory and ethics, research skills, program evaluation, financial management, computers and communication skills in public administration, comparative public…
Using Ontologies for Knowledge Management: An Information Systems Perspective.
ERIC Educational Resources Information Center
Jurisica, Igor; Mylopoulos, John; Yu, Eric
1999-01-01
Surveys some of the basic concepts that have been used in computer science for the representation of knowledge and summarizes some of their advantages and drawbacks. Relates these techniques to information sciences theory and practice. Concepts are classified in four broad ontological categories: static ontology, dynamic ontology, intentional…
Materials Science Research (NREL)
Structure Theory: high-performance computing is used to design and discover materials for energy and to study the structure of surfaces and critical interfaces. Materials Discovery: …
NASA Astrophysics Data System (ADS)
Lin, Feng; Chan, Carol K. K.
2018-04-01
This study examined the role of computer-supported knowledge-building discourse and epistemic reflection in promoting elementary-school students' scientific epistemology and science learning. The participants were 39 Grade 5 students who were collectively pursuing ideas and inquiry for knowledge advance using Knowledge Forum (KF) while studying a unit on electricity; they also reflected on the epistemic nature of their discourse. A comparison class of 22 students, taught by the same teacher, studied the same unit using the school's established scientific investigation method. We hypothesised that engaging students in idea-driven and theory-building discourse, as well as scaffolding them to reflect on the epistemic nature of their discourse, would help them understand their own scientific collaborative discourse as a theory-building process, and therefore understand scientific inquiry as an idea-driven and theory-building process. As hypothesised, we found that students engaged in knowledge-building discourse and reflection outperformed comparison students in scientific epistemology and science learning, and that students' understanding of collaborative discourse predicted their post-test scientific epistemology and science learning. To further understand the epistemic change process among knowledge-building students, we analysed their KF discourse to understand whether and how their epistemic practice had changed after epistemic reflection. The implications on ways of promoting epistemic change are discussed.
Sign use and cognition in automated scientific discovery: are computers only special kinds of signs?
NASA Astrophysics Data System (ADS)
Giza, Piotr
2018-04-01
James Fetzer criticizes the computational paradigm prevailing in cognitive science by questioning what he takes to be its most elementary ingredient: that cognition is computation across representations. He argues that if cognition is taken to be a purposive, meaningful, algorithmic problem solving activity, then computers are incapable of cognition. Instead, they appear to be signs of a special kind that can facilitate computation. He proposes the conception of minds as semiotic systems as an alternative paradigm for understanding mental phenomena, one that seems to overcome the difficulties of computationalism. Now, I argue that with computer systems dealing with scientific discovery, the matter is not so simple as that. The alleged superiority of humans using signs to stand for something other over computers being merely "physical symbol systems" or "automatic formal systems" is only easy to establish in everyday life, but becomes far from obvious when scientific discovery is at stake. In science, as opposed to everyday life, the meaning of symbols is, apart from very low-level experimental investigations, defined implicitly by the way the symbols are used in explanatory theories or experimental laws relevant to the field, and in consequence, human and machine discoverers are much more on a par. Moreover, the great practical success of the genetic programming method and recent attempts to apply it to automatic generation of cognitive theories seem to show that computer systems are capable of very efficient problem solving activity in science, which is neither purposive nor meaningful, nor algorithmic. This, I think, undermines Fetzer's argument that computer systems are incapable of cognition because computation across representations is bound to be a purposive, meaningful, algorithmic problem solving activity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dress, W.B.
Rosen's modeling relation is embedded in Popper's three worlds to provide a heuristic tool for model building and a guide for thinking about complex systems. The utility of this construct is demonstrated by suggesting a solution to the problem of pseudoscience and a resolution of the famous Bohr-Einstein debates. A theory of bizarre systems is presented by an analogy with entangled particles of quantum mechanics. This theory underscores the poverty of present-day computational systems (e.g., computers) for creating complex and bizarre entities by distinguishing between mechanism and organism.
Evaluation of an Educational Computer Programme as a Change Agent in Science Classrooms
NASA Astrophysics Data System (ADS)
Muwanga-Zake, Johnnie Wycliffe Frank
2007-12-01
I report on benefits from 26 teacher-participant evaluators of a computer game designed to motivate learning and to ease conceptual understanding of biology in South Africa. Using a developmental, social constructivist and interpretative model, the recommendation is to include the value systems and needs of end-users (through social dialogue); curriculum issues (learning theories in the ECP and those the education authorities recommend, as well as ECP-curriculum integration); the nature of the subject the ECP presents (e.g., Nature of Science); and the compatibility of the ECP with school computers.
The Concept of Nondeterminism: Its Development and Implications for Teaching
ERIC Educational Resources Information Center
Armoni, Michal; Ben-Ari, Mordechai
2009-01-01
Nondeterminism is a fundamental concept in computer science that appears in various contexts such as automata theory, algorithms and concurrent computation. We present a taxonomy of the different ways that nondeterminism can be defined and used; the categories of the taxonomy are domain, nature, implementation, consistency, execution and…
Elementary Teachers' Simulation Adoption and Inquiry-Based Use Following Professional Development
ERIC Educational Resources Information Center
Gonczi, Amanda; Maeng, Jennifer; Bell, Randy
2017-01-01
The purpose of this study was to characterize and compare 64 elementary science teachers' computer simulation use prior to and following professional development (PD) aligned with Innovation Adoption Theory. The PD highlighted computer simulation affordances that elementary teachers might find particularly useful. Qualitative and quantitative…
The Computational and Neural Basis of Cognitive Control: Charted Territory and New Frontiers
ERIC Educational Resources Information Center
Botvinick, Matthew M.; Cohen, Jonathan D.
2014-01-01
Cognitive control has long been one of the most active areas of computational modeling work in cognitive science. The focus on computational models as a medium for specifying and developing theory predates the PDP books, and cognitive control was not one of the areas on which they focused. However, the framework they provided has injected work on…
How Robotics Programs Influence Young Women's Career Choices: A Grounded Theory Model
ERIC Educational Resources Information Center
Craig, Cecilia Dosh-Bluhm
2014-01-01
The fields of engineering, computer science, and physics have a paucity of women despite decades of intervention by universities and organizations. Women's graduation rates in these fields continue to stagnate, posing a critical problem for society. This qualitative grounded theory (GT) study sought to understand how robotics programs influenced…
Conceptual strategies and inter-theory relations: The case of nanoscale cracks
NASA Astrophysics Data System (ADS)
Bursten, Julia R.
2018-05-01
This paper introduces a new account of inter-theory relations in physics, which I call the conceptual strategies account. Using the example of a multiscale computer simulation model of nanoscale crack propagation in silicon, I illustrate this account and contrast it with existing reductive, emergent, and handshaking approaches. The conceptual strategies account develops the notion that relations among physical theories, and among their models, are constrained but not dictated by limitations from physics, mathematics, and computation, and that conceptual reasoning within those limits is required both to generate and to understand the relations between theories. Conceptual strategies result in a variety of types of relations between theories and models. These relations are themselves epistemic objects, like theories and models, and as such are an under-recognized part of the epistemic landscape of science.
Parallel Distributed Processing Theory in the Age of Deep Networks.
Bowers, Jeffrey S
2017-12-01
Parallel distributed processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely that all knowledge is coded in a distributed format and cognition is mediated by non-symbolic computations. These claims have long been debated in cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks learn units that respond selectively to meaningful categories, and researchers are finding that deep networks need to be supplemented with symbolic systems to perform some tasks. Given the close links between PDP and deep networks, it is surprising that research with deep networks is challenging PDP theory. Copyright © 2017. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Pedersen, Morten Gram
2018-03-01
Methods from network theory are increasingly used in research spanning from engineering and computer science to psychology and the social sciences. In this issue, Gosak et al. [1] provide a thorough review of network science applications to biological systems ranging from the subcellular world via neuroscience to ecosystems, with special attention to the insulin-secreting beta-cells in pancreatic islets.
NASA Astrophysics Data System (ADS)
Kumar, Manoj; Srivastava, Akanksha
2013-01-01
This paper presents a survey of innovative approaches and of the most effective computational techniques for solving singularly perturbed partial differential equations, which are useful because of their numerical and computer realizations. Many applied problems appearing in semiconductor theory, biochemistry, kinetics, the theory of electrical chains, economics, solid mechanics, fluid dynamics, quantum mechanics, and many other fields can be modelled as singularly perturbed systems. Here, we summarize a wide range of research articles published by numerous researchers during the last ten years to get a better view of the present scenario in this area of research.
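As a concrete representative of the problem class surveyed (an illustrative model problem, not one taken from the paper), consider the one-dimensional convection-diffusion equation:

```latex
-\varepsilon\,u''(x) + b(x)\,u'(x) + c(x)\,u(x) = f(x), \qquad x\in(0,1),
\qquad u(0)=u(1)=0, \qquad 0<\varepsilon\ll 1 .
```

For b(x) bounded below by a positive constant, the solution develops a boundary layer of width O(ε) at x = 1, which is why uniform meshes perform poorly and layer-adapted meshes (e.g., Shishkin meshes) or fitted-operator schemes are typically used.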
A Comprehensive Theory of Algorithms for Wireless Networks and Mobile Systems
2016-06-08
Scientific Visualization and Computational Science: Natural Partners
NASA Technical Reports Server (NTRS)
Uselton, Samuel P.; Lasinski, T. A. (Technical Monitor)
1995-01-01
Scientific visualization is developing rapidly, stimulated by computational science, which is gaining acceptance as a third alternative to theory and experiment. Computational science is based on numerical simulations of mathematical models derived from theory. But each individual simulation is like a hypothetical experiment; initial conditions are specified, and the result is a record of the observed conditions. Experiments can be simulated for situations that cannot really be created or controlled. Results impossible to measure can be computed. Even for observable values, computed samples are typically much denser. Numerical simulations also extend scientific exploration where the mathematics is analytically intractable. Numerical simulations are used to study phenomena from subatomic to intergalactic scales and from abstract mathematical structures to pragmatic engineering of everyday objects. But computational science methods would be almost useless without visualization. The obvious reason is that the huge amounts of data produced require the high bandwidth of the human visual system, and interactivity adds to the power. Visualization systems also provide a single context for all the activities involved from debugging the simulations, to exploring the data, to communicating the results. Most of the presentations today have their roots in image processing, where the fundamental task is: Given an image, extract information about the scene. Visualization has developed from computer graphics, and the inverse task: Given a scene description, make an image. Visualization extends the graphics paradigm by expanding the possible input. The goal is still to produce images; the difficulty is that the input is not a scene description displayable by standard graphics methods. Visualization techniques must either transform the data into a scene description or extend graphics techniques to display this odd input. Computational science is a fertile field for visualization research because the results vary so widely and include things that have no known appearance. The amount of data creates additional challenges for both hardware and software systems. Evaluations of visualization should ultimately reflect the insight gained into the scientific phenomena. So making good visualizations requires consideration of characteristics of the user and the purpose of the visualization. Knowledge about human perception and graphic design is also relevant. It is this breadth of knowledge that stimulates proposals for multidisciplinary visualization teams and intelligent visualization assistant software. Visualization is an immature field, but computational science is stimulating research on a broad front.
NASA Astrophysics Data System (ADS)
Doerr, Martin; Freitas, Fred; Guizzardi, Giancarlo; Han, Hyoil
Ontology is a cross-disciplinary field concerned with the study of concepts and theories that can be used for representing shared conceptualizations of specific domains. Ontological Engineering is a discipline in computer and information science concerned with the development of techniques, methods, languages and tools for the systematic construction of concrete artifacts capturing these representations, i.e., models (e.g., domain ontologies) and metamodels (e.g., upper-level ontologies). In recent years, there has been a growing interest in the application of formal ontology and ontological engineering to solve modeling problems in diverse areas in computer science such as software and data engineering, knowledge representation, natural language processing, information science, among many others.
Reaction Rate Theory in Coordination Number Space: An Application to Ion Solvation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roy, Santanu; Baer, Marcel D.; Mundy, Christopher J.
2016-04-14
Understanding reaction mechanisms in many chemical and biological processes requires the application of rare event theories. In these theories, an effective choice of a reaction coordinate to describe a reaction pathway is essential. To this end, we study ion solvation in water using molecular dynamics simulations and explore the utility of the coordination number (n = number of water molecules in the first solvation shell) as the reaction coordinate. Here we compute the potential of mean force (W(n)) using umbrella sampling, predicting multiple metastable n-states for both cations and anions. We find that with increasing ionic size, these states become more stable and more structured for cations than for anions. We have extended transition state theory (TST) to calculate transition rates between n-states. TST overestimates the rate constant due to solvent-induced barrier recrossings that are not accounted for. We correct the TST rates by calculating transmission coefficients using the reactive flux method. This approach enables a new way of understanding rare events involving coordination complexes. We gratefully acknowledge Liem Dang and Panos Stinis for useful discussion. This research used resources of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. SR, CJM, and GKS were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. MDB was supported by the MS3 (Materials Synthesis and Simulation Across Scales) Initiative, a Laboratory Directed Research and Development Program at Pacific Northwest National Laboratory (PNNL). PNNL is a multiprogram national laboratory operated by Battelle for the U.S. Department of Energy.
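The quantity W(n) above is just the free energy associated with the probability of observing a given coordination number. Below is a minimal sketch of that bookkeeping using a synthetic coordination-number time series rather than umbrella-sampled data; the bias-and-reweight step of the actual study is omitted.

```python
import numpy as np

# W(n) = -kT ln P(n) from a coordination-number time series (synthetic data).
rng = np.random.default_rng(2)
kT = 0.593                                   # kcal/mol near 298 K

n_samples = 200000                           # toy trajectory hopping between n ~ 5 and n ~ 6
low_state = rng.random(n_samples) < 0.7
n = np.where(low_state,
             rng.normal(5.0, 0.30, n_samples),
             rng.normal(6.0, 0.35, n_samples))

hist, edges = np.histogram(n, bins=60, range=(4.0, 7.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = hist > 0
W = -kT * np.log(hist[mask])
W -= W.min()                                 # put the global minimum at zero

for c, w in list(zip(centers[mask], W))[::10]:
    print(f"n = {c:4.2f}   W(n) = {w:5.2f} kcal/mol")
```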
ERIC Educational Resources Information Center
Orey, Michael A.; Nelson, Wayne A.
Arguing that the evolution of intelligent tutoring systems better reflects the recent theoretical developments of cognitive science than traditional computer-based instruction (CBI), this paper describes a general model for an intelligent tutoring system and suggests ways to improve CBI using design principles derived from research in cognitive…
Opportunities for Research on the Organizational Impact of School Computers. Technical-Report-No. 7.
ERIC Educational Resources Information Center
Newman, Denis
As computers are acquired in greater numbers in schools, their impact on the social organization of instruction increasingly becomes an issue for research. Developments in the cognitive science of instruction, drawing on sociohistorical theory, provide researchers with an appropriate theoretical approach to cultural tools and cognitive change,…
Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level
ERIC Educational Resources Information Center
Christiansen, Henning
2004-01-01
Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without this. An effective methodology is described which has been developed for a target group of university students with different backgrounds such as natural…
ERIC Educational Resources Information Center
Wiske, Martha Stone; And Others
Twin aims--to advance theory and to improve practice in science, mathematics, and computing education--guided the Educational Technology Center's (ETC) research from its inception in 1983. These aims led ETC to establish collaborative research groups in which people whose primary interest was classroom teaching and learning, and researchers…
Investigating the Effectiveness of Computer Simulations for Chemistry Learning
ERIC Educational Resources Information Center
Plass, Jan L.; Milne, Catherine; Homer, Bruce D.; Schwartz, Ruth N.; Hayward, Elizabeth O.; Jordan, Trace; Verkuilen, Jay; Ng, Florrie; Wang, Yan; Barrientos, Juan
2012-01-01
Are well-designed computer simulations an effective tool to support student understanding of complex concepts in chemistry when integrated into high school science classrooms? We investigated scaling up the use of a sequence of simulations of kinetic molecular theory and associated topics of diffusion, gas laws, and phase change, which we designed…
Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments
NASA Astrophysics Data System (ADS)
Lane, Peter C. R.; Gobet, Fernand
2013-03-01
Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the 'speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
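One building block of any non-dominated sorting GA is the Pareto filter itself: with one fit-error objective per dataset, a candidate parameter set survives only if no other candidate is at least as good on every dataset and strictly better on at least one. The sketch below implements just that filter on random placeholder scores; the crossover, mutation, speciation, and repeated front-ranking of the authors' full algorithm are not shown.

```python
import numpy as np

# Pareto (non-dominated) filter: one fit-error objective per dataset, lower is better.

def pareto_front(errors):
    """errors: (n_candidates, n_objectives). Returns indices of non-dominated rows."""
    front = []
    for i in range(errors.shape[0]):
        # Candidate i is dominated if some row is <= everywhere and < somewhere.
        dominated = np.any(np.all(errors <= errors[i], axis=1)
                           & np.any(errors < errors[i], axis=1))
        if not dominated:
            front.append(i)
    return front

rng = np.random.default_rng(3)
scores = rng.random((50, 3))              # 50 candidate parameter sets, 3 datasets
print("non-dominated candidates:", pareto_front(scores))
```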
NASA Astrophysics Data System (ADS)
Binti Shamsuddin, Norsila
Technology advancement and development in a higher learning institution give students a chance to be motivated to learn the information technology areas in depth. Students should take hold of the opportunity to build their skills in these technologies as preparation for graduation. The curriculum itself can raise students' interest and persuade them to be directly involved in the evolution of the technology. The aim of this study is to see how deep the students' involvement is, as well as their acceptance of the adoption of the technology used in Computer Graphics and Image Processing subjects. The study covers Bachelor students in the Faculty of Industrial Information Technology (FIIT), Universiti Industri Selangor (UNISEL): Bac. In Multimedia Industry, BSc. Computer Science and BSc. Computer Science (Software Engineering). This study utilizes the new Unified Theory of Acceptance and Use of Technology (UTAUT) to further validate the model and enhance our understanding of the adoption of Computer Graphics and Image Processing Technologies. Four (4) out of eight (8) independent factors in UTAUT will be studied in relation to the dependent factor.
Evangelopoulos, Nicholas E
2013-11-01
This article reviews latent semantic analysis (LSA), a theory of meaning as well as a method for extracting that meaning from passages of text, based on statistical computations over a collection of documents. LSA as a theory of meaning defines a latent semantic space where documents and individual words are represented as vectors. LSA as a computational technique uses linear algebra to extract dimensions that represent that space. This representation enables the computation of similarity among terms and documents, categorization of terms and documents, and summarization of large collections of documents using automated procedures that mimic the way humans perform similar cognitive tasks. We present some technical details, various illustrative examples, and discuss a number of applications from linguistics, psychology, cognitive science, education, information science, and analysis of textual data in general. WIREs Cogn Sci 2013, 4:683-692. doi: 10.1002/wcs.1254 © 2013 John Wiley & Sons, Ltd.
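The "linear algebra" step is essentially a truncated singular value decomposition of a weighted term-document matrix. A toy sketch follows, using a tiny invented corpus and raw counts instead of the log-entropy or tf-idf weighting usually applied before the SVD.

```python
import numpy as np

# Toy latent semantic analysis: term-document counts, truncated SVD, and
# document similarity in the reduced space.
docs = ["the cat sat on the mat",
        "a cat chased a mouse",
        "stock markets fell sharply",
        "investors sold stock"]
vocab = sorted({w for d in docs for w in d.split()})
X = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                                         # latent dimensions retained
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T        # one k-dimensional vector per document

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cos(doc_vecs[0], doc_vecs[1]))   # the two "cat" documents end up close
print(cos(doc_vecs[0], doc_vecs[2]))   # a "cat" vs. a finance document: much lower
```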
NASA Astrophysics Data System (ADS)
Fitch, W. Tecumseh
2014-09-01
Progress in understanding cognition requires a quantitative, theoretical framework, grounded in the other natural sciences and able to bridge between implementational, algorithmic and computational levels of explanation. I review recent results in neuroscience and cognitive biology that, when combined, provide key components of such an improved conceptual framework for contemporary cognitive science. Starting at the neuronal level, I first discuss the contemporary realization that single neurons are powerful tree-shaped computers, which implies a reorientation of computational models of learning and plasticity to a lower, cellular, level. I then turn to predictive systems theory (predictive coding and prediction-based learning) which provides a powerful formal framework for understanding brain function at a more global level. Although most formal models concerning predictive coding are framed in associationist terms, I argue that modern data necessitate a reinterpretation of such models in cognitive terms: as model-based predictive systems. Finally, I review the role of the theory of computation and formal language theory in the recent explosion of comparative biological research attempting to isolate and explore how different species differ in their cognitive capacities. Experiments to date strongly suggest that there is an important difference between humans and most other species, best characterized cognitively as a propensity by our species to infer tree structures from sequential data. Computationally, this capacity entails generative capacities above the regular (finite-state) level; implementationally, it requires some neural equivalent of a push-down stack. I dub this unusual human propensity "dendrophilia", and make a number of concrete suggestions about how such a system may be implemented in the human brain, about how and why it evolved, and what this implies for models of language acquisition. I conclude that, although much remains to be done, a neurally-grounded framework for theoretical cognitive science is within reach that can move beyond polarized debates and provide a more adequate theoretical future for cognitive biology.
Unperturbed Schelling Segregation in Two or Three Dimensions
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2016-09-01
Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969), are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation. But his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young, Individual strategy and social structure: an evolutionary theory of institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al., Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012; Barmpalias et al., 55th annual IEEE symposium on foundations of computer science, Philadelphia, 2014; J Stat Phys 158:806-852, 2015) has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. (Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012).
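The dynamics studied in this abstract can be made concrete with a small simulation. The following is a minimal sketch of an unperturbed (noise-free) two-type Schelling model on a 2-D torus, written purely for illustration: it is not the authors' model, the update rule (randomly pairing unhappy agents of opposite type and swapping them) is a simplified variant, and all parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
N, tau = 50, 0.5                         # grid side length and similarity threshold (hypothetical)
grid = rng.integers(0, 2, size=(N, N))   # two agent types, no vacancies

def unhappy_mask(g):
    """Agents whose fraction of like-typed neighbours (8-neighbourhood, torus) is below tau."""
    same = np.zeros_like(g, dtype=float)
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            nb = np.roll(np.roll(g, dx, axis=0), dy, axis=1)
            same += (nb == g)
    return (same / 8.0) < tau

for _ in range(10000):                   # unperturbed dynamics: no noise in the decision rule
    unhappy = np.argwhere(unhappy_mask(grid))
    if len(unhappy) < 2:
        break                            # everyone is content; the configuration is frozen
    a, b = unhappy[rng.choice(len(unhappy), size=2, replace=False)]
    if grid[tuple(a)] != grid[tuple(b)]:
        grid[tuple(a)], grid[tuple(b)] = grid[tuple(b)], grid[tuple(a)]

print("fraction of unhappy agents:", unhappy_mask(grid).mean())
```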
NASA Astrophysics Data System (ADS)
Frenkel, Daan
2007-03-01
During the past decade there has been a unique synergy between theory, experiment and simulation in Soft Matter Physics. In colloid science, computer simulations that started out as studies of highly simplified model systems, have acquired direct experimental relevance because experimental realizations of these simple models can now be synthesized. Whilst many numerical predictions concerning the phase behavior of colloidal systems have been vindicated by experiments, the jury is still out on others. In my talk I will discuss some of the recent technical developments, new findings and open questions in computational soft-matter science.
Leon Cooper, Cooper Pairs, and the BCS Theory
… psychology, mathematics, engineering, physics, linguistics and computer science. An Institute objective is to pave the way for the next generation of cognitive pharmaceuticals and intelligent systems for use in …
A Decomposition Theorem for Finite Automata.
ERIC Educational Resources Information Center
Santa Coloma, Teresa L.; Tucci, Ralph P.
1990-01-01
Described is automata theory which is a branch of theoretical computer science. A decomposition theorem is presented that is easier than the Krohn-Rhodes theorem. Included are the definitions, the theorem, and a proof. (KR)
Modernization (Selected Articles),
1986-09-18
newly developed science such as control theory, artificial intelligence, model identification, computer and microelectronics technology, graphic … five "top guns" from around the country specializing in intelligence, mechanics, software and hardware as our technical advisors. In addition …
2004-06-01
Department of EE and Computer Science, University of Michigan, Ann Arbor, MI 48109, USA (pollackm@eecs.umich.edu); Sujata Banerjee, Information Science & Telecommunications Department, University of Pittsburgh, Pittsburgh, PA 15260, USA (sujata@tele.pitt.edu). Abstract: An important aspect of Business to Business E-Commerce is the agile …
Higher Inductive Types as Homotopy-Initial Algebras
2016-08-01
Kristina Sojakova. Higher Inductive Types as Homotopy-Initial Algebras. CMU-CS-16-125, August 2016. School of Computer Science, Carnegie Mellon University.
ERIC Educational Resources Information Center
Suppes, P.; And Others
From some simple and schematic assumptions about information processing, a stochastic differential equation is derived for the motion of a student through a computer-assisted elementary mathematics curriculum. The mathematics strands curriculum of the Institute for Mathematical Studies in the Social Sciences is used to test: (1) the theory and (2)…
ERIC Educational Resources Information Center
Carrejo, David; Robertson, William H.
2011-01-01
Computer-based mathematical modeling in physics is a process of constructing models of concepts and of the relationships between them, as is characteristic of scientific work. In this manner, computer-based modeling integrates the interactions of natural phenomena through the use of models, which provide structure for theories and a base for…
ERIC Educational Resources Information Center
Celedón-Pattichis, Sylvia; LópezLeiva, Carlos Alfonso; Pattichis, Marios S.; Llamocca, Daniel
2013-01-01
There is a strong need in the United States to increase the number of students from underrepresented groups who pursue careers in Science, Technology, Engineering, and Mathematics. Drawing from sociocultural theory, we present approaches to establishing collaborations between computer engineering and mathematics/bilingual education faculty to…
ERIC Educational Resources Information Center
Turcotte, Sandrine
2012-01-01
This article describes in detail a conversation analysis of conceptual change in a computer-supported collaborative learning environment. Conceptual change is an essential learning process in science education that has yet to be fully understood. While many models and theories have been developed over the last three decades, empirical data to…
Materials science: Chemistry and physics happily wed
NASA Astrophysics Data System (ADS)
Fiete, Gregory A.
2017-07-01
A major advance in the quantum theory of solids allows materials to be identified whose electronic states have a non-trivial topology. Such materials could have many computing and electronics applications. See Article p.298
Project: semi-autonomous parking for enhanced safety and efficiency.
DOT National Transportation Integrated Search
2016-04-01
Index coding, a coding formulation traditionally analyzed in the theoretical computer science and : information theory communities, has received considerable attention in recent years due to its value in : wireless communications and networking probl...
A Revision of Learning and Teaching = Revision del aprender y del ensenar.
ERIC Educational Resources Information Center
Reggini, Horace C.
1983-01-01
This review of the findings of recent cognitive science research pertaining to learning and teaching focuses on how science and mathematics are being taught, analyzes how the presence of the computer demonstrates a need for radical rethinking of both the theory and the practice of learning, and points out that if educators fail to consider the…
ERIC Educational Resources Information Center
Yoon, Susan A.; Anderson, Emma; Koehler-Yom, Jessica; Evans, Chad; Park, Miyoung; Sheldon, Josh; Schoenfeld, Ilana; Wendel, Daniel; Scheintaub, Hal; Klopfer, Eric
2017-01-01
The recent next generation science standards in the United States have emphasized learning about complex systems as a core feature of science learning. Over the past 15 years, a number of educational tools and theories have been investigated to help students learn about complex systems; but surprisingly, little research has been devoted to…
Quantifying uncertainty in climate change science through empirical information theory.
Majda, Andrew J; Gershgorin, Boris
2010-08-24
Quantifying the uncertainty for the present climate and the predictions of climate change in the suite of imperfect Atmosphere Ocean Science (AOS) computer models is a central issue in climate change science. Here, a systematic approach to these issues with firm mathematical underpinning is developed through empirical information theory. An information metric to quantify AOS model errors in the climate is proposed here which incorporates both coarse-grained mean model errors as well as covariance ratios in a transformation invariant fashion. The subtle behavior of model errors with this information metric is quantified in an instructive statistically exactly solvable test model with direct relevance to climate change science including the prototype behavior of tracer gases such as CO(2). Formulas for identifying the most sensitive climate change directions using statistics of the present climate or an AOS model approximation are developed here; these formulas just involve finding the eigenvector associated with the largest eigenvalue of a quadratic form computed through suitable unperturbed climate statistics. These climate change concepts are illustrated on a statistically exactly solvable one-dimensional stochastic model with relevance for low frequency variability of the atmosphere. Viable algorithms for implementation of these concepts are discussed throughout the paper.
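The computational core of the sensitivity formulas described above reduces to a standard linear-algebra step. The sketch below (my own illustration, not the authors' code) finds the eigenvector belonging to the largest eigenvalue of a symmetric quadratic form; the matrix is a stand-in built from hypothetical "unperturbed climate" samples.

```python
import numpy as np

rng = np.random.default_rng(1)
samples = rng.standard_normal((1000, 4))   # hypothetical unperturbed climate-state samples
Q = np.cov(samples, rowvar=False)          # symmetric quadratic form (stand-in for the paper's metric)

eigvals, eigvecs = np.linalg.eigh(Q)       # eigh is appropriate for symmetric matrices
most_sensitive_direction = eigvecs[:, np.argmax(eigvals)]
print("leading eigenvalue:", eigvals.max())
print("most sensitive direction:", most_sensitive_direction)
```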
Moreno-Díaz, Roberto; Moreno-Díaz, Arminda
2013-06-01
This paper explores the origins and content of neurocybernetics and its links to artificial intelligence, computer science and knowledge engineering. Starting with three remarkable pieces of work, we center attention on a number of events that initiated and developed basic topics that are still nowadays a matter of research and inquiry, from goal-directed activity theories to circular causality and to reverberations and learning. Within this context, we pay tribute to the memory of Prof. Ricciardi, documenting the importance of his contributions to the mathematics of the brain, neural nets and neurophysiological models, computational simulations and techniques. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Why formal learning theory matters for cognitive science.
Fulop, Sean; Chater, Nick
2013-01-01
This article reviews a number of different areas in the foundations of formal learning theory. After outlining the general framework for formal models of learning, the Bayesian approach to learning is summarized. This leads to a discussion of Solomonoff's Universal Prior Distribution for Bayesian learning. Gold's model of identification in the limit is also outlined. We next discuss a number of aspects of learning theory raised in contributed papers, related to both computational and representational complexity. The article concludes with a description of how semi-supervised learning can be applied to the study of cognitive learning models. Throughout this overview, the specific points raised by our contributing authors are connected to the models and methods under review. Copyright © 2013 Cognitive Science Society, Inc.
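As a concrete reminder of what the Bayesian approach summarized here amounts to in its simplest form, the toy sketch below (not taken from any of the contributed papers; the hypothesis space and data are hypothetical) updates a prior over a small set of candidate hypotheses by Bayes' rule as observations arrive.

```python
import numpy as np

hypotheses = np.array([0.2, 0.5, 0.8])    # candidate coin biases (hypothetical hypothesis space)
posterior = np.array([1/3, 1/3, 1/3])     # uniform prior over the hypotheses
data = [1, 1, 0, 1, 1]                    # observed flips, 1 = heads (hypothetical data)

for x in data:
    likelihood = hypotheses if x == 1 else 1 - hypotheses
    posterior = posterior * likelihood    # Bayes' rule: posterior proportional to prior times likelihood
    posterior = posterior / posterior.sum()

print({h: round(p, 3) for h, p in zip(hypotheses.tolist(), posterior.tolist())})
```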
Can computational goals inform theories of vision?
Anderson, Barton L
2015-04-01
One of the most lasting contributions of Marr's posthumous book is his articulation of the different "levels of analysis" that are needed to understand vision. Although a variety of work has examined how these different levels are related, there is comparatively little examination of the assumptions on which his proposed levels rest, or the plausibility of the approach Marr articulated given those assumptions. Marr placed particular significance on computational level theory, which specifies the "goal" of a computation, its appropriateness for solving a particular problem, and the logic by which it can be carried out. The structure of computational level theory is inherently teleological: What the brain does is described in terms of its purpose. I argue that computational level theory, and the reverse-engineering approach it inspires, requires understanding the historical trajectory that gave rise to functional capacities that can be meaningfully attributed with some sense of purpose or goal, that is, a reconstruction of the fitness function on which natural selection acted in shaping our visual abilities. I argue that this reconstruction is required to distinguish abilities shaped by natural selection-"natural tasks" -from evolutionary "by-products" (spandrels, co-optations, and exaptations), rather than merely demonstrating that computational goals can be embedded in a Bayesian model that renders a particular behavior or process rational. Copyright © 2015 Cognitive Science Society, Inc.
Brier, Søren
2017-12-01
Charles S. Peirce developed a process philosophy featuring a non-theistic agapistic evolution from nothingness. It is an Eastern-inspired alternative to the Western mechanical ontology of classical science, also inspired by the American transcendentalists. Advaitism and Buddhism are the two most important Eastern philosophical traditions that encompass scientific knowledge and the idea of spontaneous evolutionary development. This article attempts to show how Peirce's non-mechanistic triadic semiotic process theory is better suited to embrace the quantum field view than mechanistic and information-based views are with regard to a theory of the emergence of consciousness. Peirce views the universe as a reasoning process developing from pure potentiality to the fully ordered rational Summum Bonum. The paper compares this with John Archibald Wheeler's "It from bit" cosmogony based on quantum information science, which leads to the info-computational view of nature, mind and culture. However, this theory lacks a phenomenological foundation. David Chalmers' double aspect interpretation of information attempts to overcome the limitations of the info-computational view. Chalmers supplements Batesonian and Wheelerian info-computationalism (both of which lack a phenomenological aspect) with a dimension that corresponds to the phenomenological aspect of reality. However, he does not manage to produce an integrated theory of the development of meaning and rationality. Alex Hankey's further work goes some way towards establishing a theory that can satisfy Husserl's criteria for consciousness, such as a sense of being and time, but Hankey's dependence on Chalmers' theory is still not able to account for the connection between core consciousness and the physical world. Copyright © 2017 Elsevier Ltd. All rights reserved.
Quantum field theory and coalgebraic logic in theoretical computer science.
Basti, Gianfranco; Capolupo, Antonio; Vitiello, Giuseppe
2017-11-01
We suggest that in the framework of Category Theory it is possible to demonstrate the mathematical and logical dual equivalence between the category of q-deformed Hopf Coalgebras and the category of q-deformed Hopf Algebras in quantum field theory (QFT), interpreted as a thermal field theory. Each algebra-coalgebra pair characterizes a QFT system and its mirroring thermal bath, respectively, so as to model dissipative quantum systems in far-from-equilibrium conditions, with an evident significance also for the biological sciences. Our study is in fact inspired by applications to neuroscience, where the brain memory capacity, for instance, has been modeled using the unitarily inequivalent representations of QFT. The q-deformed Hopf Coalgebras and the q-deformed Hopf Algebras constitute two dual categories because they are characterized by the same functor T, related to the Bogoliubov transform, and by its contravariant application T^op, respectively. The q-deformation parameter is related to the Bogoliubov angle, and it is effectively a thermal parameter. Therefore, the different values of q univocally identify and label the vacua appearing in the foliation process of the quantum vacuum. This means that, in the framework of Universal Coalgebra as a general theory of dynamic and computing systems ("labelled state-transition systems"), the so-labelled infinitely many quantum vacua can be interpreted as the Final Coalgebra of an "Infinite State Black-Box Machine". All this opens the way to the possibility of designing a new class of universal quantum computing architectures based on this coalgebraic QFT formulation, as its ability to naturally generate a Fibonacci progression demonstrates. Copyright © 2017 Elsevier Ltd. All rights reserved.
Rate Theory of Ion Pairing at the Water Liquid–Vapor Interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dang, Liem X.; Schenter, Gregory K.; Wick, Collin D.
There is overwhelming evidence that certain ions are present near the vapor–liquid interface of aqueous salt solutions. Despite their importance in many chemical reactive phenomena, how ion–ion interactions are affected by interfaces and their influence on kinetic processes is not well understood. Molecular simulations were carried out to examine the thermodynamics and kinetics of small alkali halide ions in the bulk and near the water vapor–liquid interface. We calculated dissociation rates using classical transition state theory, and corrected them with transmission coefficients determined by the reactive flux method and Grote-Hynes theory. Our results show that, in addition to affecting the free energy of ions in solution, the interfacial environments significantly influence the kinetics of ion pairing. The results obtained from the reactive flux method and Grote-Hynes theory on the relaxation time present an unequivocal picture of the interface suppressing ion dissociation. This work was supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
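For orientation, the correction described here is conventionally written as a transmission coefficient multiplying the transition-state-theory rate; the relation below is the standard textbook form (not quoted from the paper), with the coefficient obtained from the reactive-flux method or Grote-Hynes theory.

```latex
% Standard dynamical correction to transition state theory (textbook form):
% k is the corrected rate constant, k_TST the transition-state-theory estimate,
% and \kappa the transmission coefficient (reactive flux / Grote--Hynes).
k \;=\; \kappa \, k_{\mathrm{TST}}, \qquad 0 < \kappa \le 1 .
```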
Emotion-affected decision making in human simulation.
Zhao, Y; Kang, J; Wright, D K
2006-01-01
Human modelling is an interdisciplinary research field. The topic, emotion-affected decision making, was originally a cognitive psychology issue, but is now recognized as an important research direction for both computer science and biomedical modelling. The main aim of this paper is to attempt to bridge the gap between psychology and bioengineering in emotion-affected decision making. The work is based on Ortony's theory of emotions and bounded rationality theory, and attempts to connect the emotion process with decision making. A computational emotion model is proposed, and the initial framework of this model in virtual human simulation within the platform of Virtools is presented.
Information Processing Research
1979-06-01
quantitative shape recovery. For the qualitative shape recovery we use a model of the Origami world (Kanade, 1978), together with edge profiles of …
The fuzzy cube and causal efficacy: representation of concomitant mechanisms in stroke.
Jobe, Thomas H.; Helgason, Cathy M.
1998-04-01
Twentieth century medical science has embraced nineteenth century Boolean probability theory based upon two-valued Aristotelian logic. With the later addition of bit-based, von Neumann structured computational architectures, an epistemology based on randomness has led to a bivalent epidemiological methodology that dominates medical decision making. In contrast, fuzzy logic, based on twentieth century multi-valued logic, and computational structures that are content addressed and adaptively modified, has advanced a new scientific paradigm for the twenty-first century. Diseases such as stroke involve multiple concomitant causal factors that are difficult to represent using conventional statistical methods. We tested which paradigm best represented this complex multi-causal clinical phenomenon-stroke. We show that the fuzzy logic paradigm better represented clinical complexity in cerebrovascular disease than current probability theory based methodology. We believe this finding is generalizable to all of clinical science since multiple concomitant causal factors are involved in nearly all known pathological processes.
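To make the contrast with two-valued logic concrete, here is a tiny illustration of fuzzy (multi-valued) conjunction and negation operators; this is my own example rather than anything from the paper, and the membership values are hypothetical.

```python
# Fuzzy truth values lie in [0, 1]; conjunction is commonly taken as the minimum
# and negation as the complement, in contrast with Boolean {0, 1} logic.
hypertension = 0.7      # degree to which hypertension is present (hypothetical)
diabetes = 0.4          # degree to which diabetes is present (hypothetical)

both_factors = min(hypertension, diabetes)   # fuzzy AND -> 0.4
no_diabetes = 1 - diabetes                   # fuzzy NOT -> 0.6
print(both_factors, no_diabetes)
```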
Mathematical and Computational Challenges in Population Biology and Ecosystems Science
NASA Technical Reports Server (NTRS)
Levin, Simon A.; Grenfell, Bryan; Hastings, Alan; Perelson, Alan S.
1997-01-01
Mathematical and computational approaches provide powerful tools in the study of problems in population biology and ecosystems science. The subject has a rich history intertwined with the development of statistics and dynamical systems theory, but recent analytical advances, coupled with the enhanced potential of high-speed computation, have opened up new vistas and presented new challenges. Key challenges involve ways to deal with the collective dynamics of heterogeneous ensembles of individuals, and to scale from small spatial regions to large ones. The central issues-understanding how detail at one scale makes its signature felt at other scales, and how to relate phenomena across scales-cut across scientific disciplines and go to the heart of algorithmic development of approaches to high-speed computation. Examples are given from ecology, genetics, epidemiology, and immunology.
The Use of Fuzzy Theory in Grading of Students in Math
ERIC Educational Resources Information Center
Bjelica, Momcilo; Rankovic, Dragica
2010-01-01
The development of computer science, statistics and other technological fields gives us more opportunities to improve the process of evaluating the degree of knowledge and achievement in the learning process of our students. More and more we are relying on computer software to guide us in the grading process. An improved way of grading can help…
ERIC Educational Resources Information Center
Mallios, Nikolaos; Vassilakopoulos, Michael Gr.
2015-01-01
One of the most intriguing objectives when teaching computer science in mid-adolescence high school students is attracting and mainly maintaining their concentration within the limits of the class. A number of theories have been proposed and numerous methodologies have been applied, aiming to assist in the implementation of a personalized learning…
JPRS Report, Science & Technology, USSR: Computers, Control Systems and Machines
1989-03-14
…optimizatsii slozhnykh sistem (Coding Theory and Complex System Optimization), Alma-Ata, Nauka Press, 1977, pp. 8-16 … Interpreter Specifics [O. I. Amvrosova] … Creation of Modern Computer Systems for Complex Ecological … processor can be designed to decrease degradation upon failure and assure more reliable processor operation, without requiring more complex software or …
Logic as Marr's Computational Level: Four Case Studies.
Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter
2015-04-01
We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. Copyright © 2014 Cognitive Science Society, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Weitao
This Special Topic Issue on the Advances in Density Functional Theory, published as a celebration of the fifty years of density functional theory, contains a retrospective article, a perspective article, and a collection of original research articles that showcase recent theoretical advances in the field. It provides a timely discussion reflecting a cross section of our understanding, and the theoretical and computational developments, which have significant implications in broad areas of sciences and engineering.
Density functional theory in materials science.
Neugebauer, Jörg; Hickel, Tilmann
2013-09-01
Materials science is a highly interdisciplinary field. It is devoted to the understanding of the relationship between (a) fundamental physical and chemical properties governing processes at the atomistic scale with (b) typically macroscopic properties required of materials in engineering applications. For many materials, this relationship is not only determined by chemical composition, but strongly governed by microstructure. The latter is a consequence of carefully selected process conditions (e.g., mechanical forming and annealing in metallurgy or epitaxial growth in semiconductor technology). A key task of computational materials science is to unravel the often hidden composition-structure-property relationships using computational techniques. The present paper does not aim to give a complete review of all aspects of materials science. Rather, we will present the key concepts underlying the computation of selected material properties and discuss the major classes of materials to which they are applied. Specifically, our focus will be on methods used to describe single or polycrystalline bulk materials of semiconductor, metal or ceramic form.
NASA Astrophysics Data System (ADS)
Wang, Jianxiong
2014-06-01
This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. Eighteen invited speakers presented key topics on the universe in a computer, computing in Earth sciences, multivariate data analysis, and automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source, knowledge sharing and scientific collaboration stimulated us to think over these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF.
CSM research: Methods and application studies
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.
1989-01-01
Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.
Towards systemic theories in biological psychiatry.
Bender, W; Albus, M; Möller, H-J; Tretter, F
2006-02-01
Although still rather controversial, empirical data on the neurobiology of schizophrenia have reached a degree of complexity that makes it hard to obtain a coherent picture of the malfunctions of the brain in schizophrenia. Theoretical neuropsychiatry should therefore use the tools of theoretical sciences like cybernetics, informatics, computational neuroscience or systems science. The methodology of systems science permits the modeling of complex dynamic nonlinear systems. Such procedures might help us to understand brain functions and the disorders and actions of psychiatric drugs better.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
The Second SIAM Conference on Computational Science and Engineering was held in San Diego from February 10-12, 2003. Total conference attendance was 553. This is a 23% increase in attendance over the first conference. The focus of this conference was to draw attention to the tremendous range of major computational efforts on large problems in science and engineering, to promote the interdisciplinary culture required to meet these large-scale challenges, and to encourage the training of the next generation of computational scientists. Computational Science & Engineering (CS&E) is now widely accepted, along with theory and experiment, as a crucial third mode of scientific investigation and engineering design. Aerospace, automotive, biological, chemical, semiconductor, and other industrial sectors now rely on simulation for technical decision support. For federal agencies also, CS&E has become an essential support for decisions on resources, transportation, and defense. CS&E is, by nature, interdisciplinary. It grows out of physical applications and it depends on computer architecture, but at its heart are powerful numerical algorithms and sophisticated computer science techniques. From an applied mathematics perspective, much of CS&E has involved analysis, but the future surely includes optimization and design, especially in the presence of uncertainty. Another mathematical frontier is the assimilation of very large data sets through such techniques as adaptive multi-resolution, automated feature search, and low-dimensional parameterization. The themes of the 2003 conference included, but were not limited to: Advanced Discretization Methods; Computational Biology and Bioinformatics; Computational Chemistry and Chemical Engineering; Computational Earth and Atmospheric Sciences; Computational Electromagnetics; Computational Fluid Dynamics; Computational Medicine and Bioengineering; Computational Physics and Astrophysics; Computational Solid Mechanics and Materials; CS&E Education; Meshing and Adaptivity; Multiscale and Multiphysics Problems; Numerical Algorithms for CS&E; Discrete and Combinatorial Algorithms for CS&E; Inverse Problems; Optimal Design, Optimal Control, and Inverse Problems; Parallel and Distributed Computing; Problem-Solving Environments; Software and Middleware Systems; Uncertainty Estimation and Sensitivity Analysis; and Visualization and Computer Graphics.
Applications of large-scale density functional theory in biology
NASA Astrophysics Data System (ADS)
Cole, Daniel J.; Hine, Nicholas D. M.
2016-10-01
Density functional theory (DFT) has become a routine tool for the computation of electronic structure in the physics, materials and chemistry fields. Yet the application of traditional DFT to problems in the biological sciences is hindered, to a large extent, by the unfavourable scaling of the computational effort with system size. Here, we review some of the major software and functionality advances that enable insightful electronic structure calculations to be performed on systems comprising many thousands of atoms. We describe some of the early applications of large-scale DFT to the computation of the electronic properties and structure of biomolecules, as well as to paradigmatic problems in enzymology, metalloproteins, photosynthesis and computer-aided drug design. With this review, we hope to demonstrate that first principles modelling of biological structure-function relationships are approaching a reality.
Designing, programming, and optimizing a (small) quantum computer
NASA Astrophysics Data System (ADS)
Svore, Krysta
In 1982, Richard Feynman proposed to use a computer founded on the laws of quantum physics to simulate physical systems. In the more than thirty years since, quantum computers have shown promise to solve problems in number theory, chemistry, and materials science that would otherwise take longer than the lifetime of the universe to solve on an exascale classical machine. The practical realization of a quantum computer requires understanding and manipulating subtle quantum states while experimentally controlling quantum interference. It also requires an end-to-end software architecture for programming, optimizing, and implementing a quantum algorithm on the quantum device hardware. In this talk, we will introduce recent advances in connecting abstract theory to present-day real-world applications through software. We will highlight recent advancement of quantum algorithms and the challenges in ultimately performing a scalable solution on a quantum device.
2011-04-08
…into how economics, information theory and computer science, psychology, sociology, evolutionary biology, physics (quantum mechanics) and cosmology … include knowledge and definition of "self" (as "self" is part of the environment) and the shared experience and perspective of others … including information, entropy, quantum behavior, and cosmological progress. In short I assume the above and therefore my recommendations could be …
NASA Astrophysics Data System (ADS)
Jalili, Mahdi
2018-03-01
I enjoyed reading Gosak et al.'s review on analysing biological systems from a network science perspective [1]. Network science, which first started within the physics community, is now a mature multidisciplinary field of science with many applications ranging from ecology to biology, medicine, social sciences, engineering and computer science. Gosak et al. discussed how biological systems can be modelled and described by complex network theory, which is an important application of network science. Although there has been considerable progress in network biology over the past two decades, this is just the beginning and network science has a great deal to offer to biology and medical sciences.
Quantum Approach to Informatics
NASA Astrophysics Data System (ADS)
Stenholm, Stig; Suominen, Kalle-Antti
2005-08-01
An essential overview of quantum information. Information, whether inscribed as a mark on a stone tablet or encoded as a magnetic domain on a hard drive, must be stored in a physical object and thus made subject to the laws of physics. Traditionally, information processing such as computation occurred in a framework governed by laws of classical physics. However, information can also be stored and processed using the states of matter described by non-classical quantum theory. Understanding this quantum information, a fundamentally different type of information, has been a major project of physicists and information theorists in recent years, and recent experimental research has started to yield promising results. Quantum Approach to Informatics fills the need for a concise introduction to this burgeoning new field, offering an intuitive approach for readers in both the physics and information science communities, as well as in related fields. Only a basic background in quantum theory is required, and the text keeps the focus on bringing this theory to bear on contemporary informatics. Instead of proofs and other highly formal structures, detailed examples present the material, making this a uniquely accessible introduction to quantum informatics. Topics covered include: an introduction to quantum information and the qubit; concepts and methods of quantum theory important for informatics; the application of information concepts to quantum physics; quantum information processing and computing; quantum gates; error correction using quantum-based methods; and physical realizations of quantum computing circuits. A helpful and economical resource for understanding this exciting new application of quantum theory to informatics, Quantum Approach to Informatics provides students and researchers in physics and information science, as well as other interested readers with some scientific background, with an essential overview of the field.
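Two of the topics listed above, the qubit and quantum gates, can be illustrated with a few lines of linear algebra. The sketch below follows the usual textbook conventions and is my own illustration, not an excerpt from the book.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # basis state |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate (a unitary matrix)

psi = H @ ket0                       # qubit in the superposition (|0> + |1>) / sqrt(2)
probs = np.abs(psi) ** 2             # Born rule: measurement probabilities
print("state:", psi, "measurement probabilities:", probs)     # probs is approximately [0.5, 0.5]
```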
Computational chemistry and cheminformatics: an essay on the future.
Glen, Robert Charles
2012-01-01
Computers have changed the way we do science. Surrounded by a sea of data and with phenomenal computing capacity, the methodology and approach to scientific problems is evolving into a partnership between experiment, theory and data analysis. Given the pace of change of the last twenty-five years, it seems folly to speculate on the future, but along with unpredictable leaps of progress there will be a continuous evolution of capability, which points to opportunities and improvements that will certainly appear as our discipline matures.
European Science Notes Information Bulletin Reports on Current European/ Middle Eastern Science
1989-03-01
Paleo-Oceanography, Marine Geophysics, Marine Environmental Geology, and Petrology of the Oceanic Crust. The specific concerns of each of these … To compute numerically the expected value of an operator, the integration over the fermion fields is carried out, leaving an integral over the gauge fields in configuration space … through the machine (one space point per processor). In the gauge field theories of elementary particles, this is appropriate for generating gauge field …
NASA Astrophysics Data System (ADS)
Andonov, Zdravko
This R&D represents an innovative multidimensional 6D-N(6n)D Space-Time (S-T) Methodology, 6D-6nD Coordinate Systems, 6D Equations, and a new 6D strategy and technology for the development of Planetary Space Sciences, S-T Data Management and S-T Computational Tomography... The Methodology is relevant for brand new RS Microwaves' Satellites and Computational Tomography Systems development, aimed at defending sustainable Earth, Moon, & Sun System evolution. Especially important are innovations for monitoring and protection of the strategic trilateral system H-OH-H2O (Hydrogen, Hydroxyl and Water), corresponding to RS VHRS (Very High Resolution Systems) of 1.420-1.657-22.089 GHz microwaves... One of the Greatest Paradoxes and Challenges of World Science is the "transformation" of the J. L. Lagrange 4D Space-Time (S-T) System to the H. Minkowski 4D S-T System (O-X,Y,Z,icT) for Einstein's "Theory of Relativity". As a global result: in contemporary Advanced Space Sciences there is no really adequate 4D-6D Space-Time Coordinate System and 6D Advanced Cosmos Strategy & Methodology for Multidimensional and Multitemporal Space-Time Data Management and Tomography... That is one of the top actual S-T Problems. Discovery of a simple and optimal nD S-T Methodology is extremely important for all Universities' Space Sciences' Education Programs, for advances in space research and especially for all young Space Scientists' R&D!... The top ten 21st-Century Challenges ahead of Planetary and Space Sciences, Space Data Management and Computational Space Tomography, important for the successful development of Young Scientist Generations, are the following: 1. R&D of W. R. Hamilton's General Idea for transformation of all Space Sciences to Time Sciences, beginning with the 6D Eikonal for 6D anisotropic mediums & velocities. Development of IERS Earth & Space Systems (VLBI, LLR, GPS, SLR, DORIS, etc.) for Planetary-Space Data Management & Computational Planetary & Space Tomography. 2. R&D of the S. W. Hawking Paradigm for 2D Complex Time and the Quantum Wave Cosmology Paradigm for Decision of the Main Problem of Contemporary Physics. 3. R&D of the Einstein-Minkowski Geodesics' Paradigm in the 4D Space-Time Continuum to 6D-6nD Space-Time Continuum Paradigms and 6D S-T Equations... 4. R&D of the Erwin Schrödinger 4D S-T Universe Evolutional Equation; its David Bohm 4D generalization for anisotropic mediums and an innovative 6D version for instantaneous quantum measurement, the Bohm-Schrödinger 6D S-T Universe Evolutional Equation. 5. R&D of brand new 6D Planning of S-T Experiments, brand new 6D Space Techniques and Space Technology Generalizations, especially for 6D RS VHRS Research, Monitoring and 6D Computational Tomography. 6. R&D of "6D Euler-Poisson Equations" and "6D Kolmogorov Turbulence Theory" for GeoDynamics and for Space Dynamics as evolution of Gauss-Riemann Paradigms. 7. R&D of N. Boneff's NASA RD for Asteroid "Eros" & Space Science Laws' Evolution. 8. R&D of the H. Poincare Paradigm for Nature and Cosmos as a 6D Group of Transferences. 9. R&D of K. Popoff's N-Body General Problem & General Thermodynamic S-T Theory as Einstein-Prigogine-Landau Paradigms' Development. 10. R&D of the 1st GUT since 1958 by N. S. Kalitzin (Kalitzin N. S., 1958: Über eine einheitliche Feldtheorie. ZA Heidelberg-ARI, 7 (2), 207-215) and the "Multitemporal Theory of Relativity", with special applications to Photon Rockets and all Space-Time R&D.
GENERAL CONCLUSION: The Multidimensional Space-Time Methodology is an advance in space research, corresponding to the IAF-IAA-COSPAR Innovative Strategy and R&D Programs (UNEP, UNDP, GEOSS, GMES, etc.).
Sociocultural Influences On Undergraduate Women's Entry into a Computer Science Major
NASA Astrophysics Data System (ADS)
Lyon, Louise Ann
Computer science not only displays the pattern of underrepresentation of many other science, technology, engineering, and math (STEM) fields, but has actually experienced a decline in the number of women choosing the field over the past two decades. Broken out by gender and race, the picture becomes more nuanced, with the ratio of females to males receiving bachelor's degrees in computer science higher for non-White ethnic groups than for Whites. This dissertation explores the experiences of university women differing along the axis of race, class, and culture who are considering majoring in computer science in order to highlight how well-prepared women are persuaded that they belong (or not) in the field and how the confluence of social categories plays out in their decision. This study focuses on a university seminar entitled "Women in Computer Science and Engineering" open to women concurrently enrolled in introductory programming and uses an ethnographic approach including classroom participant observation, interviews with seminar students and instructors, observations of students in other classes, and interviews with parents of students. Three stand-alone but related articles explore various aspects of the experiences of women who participated in the study using Rom Harre's positioning theory as a theoretical framework. The first article uses data from twenty-two interviews to uncover how interactions with others and patterns in society position women in relation to a computer science major, and how these women have arrived at the point of considering the major despite messages that they do not belong. The second article more deeply explores the cases of three women who vary greatly along the axes of race, class, and culture in order to uncover pattern and interaction differences for women based on their ethnic background. The final article focuses on the attitudes and expectations of the mothers of three students of contrasting ethnicities and how reported interactions between mothers and daughters either constrain or afford opportunities for the daughters to choose a computer science major.
NASA Astrophysics Data System (ADS)
Eisenbach, Markus
The Locally Self-consistent Multiple Scattering (LSMS) code solves the first principles Density Functional theory Kohn-Sham equation for a wide range of materials with a special focus on metals, alloys and metallic nano-structures. It has traditionally exhibited near perfect scalability on massively parallel high performance computer architectures. We present our efforts to exploit GPUs to accelerate the LSMS code to enable first principles calculations of O(100,000) atoms and statistical physics sampling of finite temperature properties. Using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility we achieve a sustained performance of 14.5PFlop/s and a speedup of 8.6 compared to the CPU only code. This work has been sponsored by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Material Sciences and Engineering Division and by the Office of Advanced Scientific Computing. This work used resources of the Oak Ridge Leadership Computing Facility, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.
Global Optimal Trajectory in Chaos and NP-Hardness
NASA Astrophysics Data System (ADS)
Latorre, Vittorio; Gao, David Yang
This paper presents an unconventional theory and method for solving general nonlinear dynamical systems. Instead of the direct iterative methods, the discretized nonlinear system is first formulated as a global optimization problem via the least squares method. A newly developed canonical duality theory shows that this nonconvex minimization problem can be solved deterministically in polynomial time if a global optimality condition is satisfied. The so-called pseudo-chaos produced by linear iterative methods is mainly due to the intrinsic numerical error accumulations. Otherwise, the global optimization problem could be NP-hard and the nonlinear system can be really chaotic. A conjecture is proposed, which reveals the connection between chaos in nonlinear dynamics and NP-hardness in computer science. The methodology and the conjecture are verified by applications to the well-known logistic equation, a forced memristive circuit and the Lorenz system. Computational results show that the canonical duality theory can be used to identify chaotic systems and to obtain realistic global optimal solutions in nonlinear dynamical systems. The method and results presented in this paper should bring some new insights into nonlinear dynamical systems and NP-hardness in computational complexity theory.
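The first step described above, recasting a discretized nonlinear system as a least-squares problem over the whole trajectory, can be sketched for the logistic map (one of the paper's examples). This is my own illustration: the canonical duality solver is not reproduced, and scipy's generic least_squares routine is used only as a stand-in, so it returns a local rather than a certified global minimizer.

```python
import numpy as np
from scipy.optimize import least_squares

r, T, x0 = 3.8, 20, 0.3                       # logistic parameter, horizon, initial state (hypothetical)

def residuals(x):
    # Residuals vanish exactly when the unknown trajectory x[0..T-1] satisfies
    # x[k+1] = r * x[k] * (1 - x[k]) starting from the fixed initial state x0.
    traj = np.concatenate(([x0], x))
    return traj[1:] - r * traj[:-1] * (1 - traj[:-1])

guess = np.full(T, 0.5)                       # crude initial guess for the whole trajectory
sol = least_squares(residuals, guess)
print("first trajectory points:", np.round(sol.x[:5], 4))
print("sum of squared residuals:", sol.cost)
```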
Shen, Weifeng; Jiang, Libing; Zhang, Mao; Ma, Yuefeng; Jiang, Guanyu; He, Xiaojun
2014-01-01
To review the research methods for mass casualty incidents (MCI) systematically and to introduce the concept and characteristics of complexity science and the artificial systems, computational experiments and parallel execution (ACP) method. We searched PubMed, Web of Knowledge, China Wanfang and China Biology Medicine (CBM) databases for relevant studies. Searches were performed without year or language restrictions and used combinations of the following key words: "mass casualty incident", "MCI", "research method", "complexity science", "ACP", "approach", "science", "model", "system" and "response". Articles were searched using the above keywords and only those involving the research methods of MCI were enrolled. Research methods for MCI have increased markedly over the past few decades. At present, the dominant research methods for MCI are the theory-based approach, the empirical approach, evidence-based science, mathematical modeling and computer simulation, simulation experiments, experimental methods, the scenario approach and complexity science. This article provides an overview of the development of research methodology for MCI. The progress of routine research approaches and of complexity science is briefly presented in this paper. Furthermore, the authors conclude that the reductionism underlying the exact sciences is not suitable for MCI complex systems, and the only feasible alternative is complexity science. Finally, this summary is followed by a review showing that the ACP method, combining artificial systems, computational experiments and parallel execution, provides a new idea for addressing research on complex MCI.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roy, Santanu; Baer, Marcel D.; Mundy, Christopher J.
We present a theory for ion pair dissociation and association, motivated by the concepts of the Marcus theory of electron transfer. Despite the extensive research on ion-pairing in many chemical and biological processes, much can be learned from the exploration of collective reaction coordinates. To this end, we explore two reaction coordinates, ion pair distance and coordination number. The study of the correlation between these reaction coordinates provides a new insight into the mechanism and kinetics of ion pair dissociation and association in water. The potential of mean force on these 2D-surfaces computed from molecular dynamics simulations of different monovalent ion pairs reveals a Marcus-like mechanism for ion-pairing: Water molecules rearrange forming an activated coordination state prior to ion pair dissociation or association, followed by relaxation of the coordination state due to further water rearrangement. Like Marcus theory, we find the existence of an inverted region where the transition rates are slower with increasing exergonicity. This study provides a new perspective for the future investigations of ion-pairing and transport. SR, CJM, and GKS were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. MDB was supported by the MS3 (Materials Synthesis and Simulation Across Scales) Initiative, a Laboratory Directed Research and Development Program at Pacific Northwest National Laboratory (PNNL). The research was performed using PNNL Institutional Computing. PNNL is a multi-program national laboratory operated by Battelle for the U.S. Department of Energy.
The physics of teams: Interdependence, measurable entropy and computational emotion
NASA Astrophysics Data System (ADS)
Lawless, William F.
2017-08-01
Most of the social sciences, including psychology, economics and subjective social network theory, are modeled on the individual, leaving the field not only a-theoretical, but also inapplicable to a physics of hybrid teams, where hybrid refers to arbitrarily combining humans, machines and robots into a team to perform a dedicated mission (e.g., military, business, entertainment) or to solve a targeted problem (e.g., with scientists, engineers, entrepreneurs). As a common social science practice, the ingredient at the heart of the social interaction, interdependence, is statistically removed prior to the replication of social experiments; but, as an analogy, statistically removing social interdependence to better study the individual is like statistically removing quantum effects as a complication to the study of the atom. Further, in applications of Shannon’s information theory to teams, the effects of interdependence are minimized, but even there, interdependence is how classical information is transmitted. Consequently, numerous mistakes are made when applying non-interdependent models to policies, the law and regulations, impeding social welfare by failing to exploit the power of social interdependence. For example, adding redundancy to human teams is thought by subjective social network theorists to improve the efficiency of a network, easily contradicted by our finding that redundancy is strongly associated with corruption in non-free markets. Thus, built atop the individual, most of the social sciences, economics and social network theory have little if anything to contribute to the engineering of hybrid teams. In defense of the social sciences, the mathematical physics of interdependence is elusive, non-intuitive and non-rational. However, by replacing determinism with bistable states, interdependence at the social level mirrors entanglement at the quantum level, suggesting the applicability of quantum tools for social science. We report how our quantum-like models capture some of the essential aspects of interdependence, a tool for the metrics of hybrid teams; as an example, we find additional support for our model of the solution to the open problem of team size. We also report on progress with the theory of computational emotion for hybrid teams, linking it qualitatively to the second law of thermodynamics. We conclude that the science of interdependence
NASA Astrophysics Data System (ADS)
Chang, Li-Na; Luo, Shun-Long; Sun, Yuan
2017-11-01
The principle of superposition is universal and lies at the heart of quantum theory. Although ever since the inception of quantum mechanics a century ago, superposition has occupied a central and pivotal place, rigorous and systematic studies of the quantification issue have attracted significant interests only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by Science Challenge Project under Grant No. TZ2016002, Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, Grant under No. 2008DP173182
1989-11-01
9) Are you interested in dreams in art or poetry, or psychoanalysis or dreams, or what? -o- (10) U: Give me theories about dreams (i.e. where do they … dream? Give a multidisciplinary historical account. You might want to include theories and explanations from religion, psychology, neuroscience, and … computer science. Your paper should be approximately thirty pages. O: May I help you? -o- (1) U: Yes, what do you know about dreams? -o- (2) O: Well, do you
Understanding Islamist political violence through computational social simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G
Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
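Of the two techniques mentioned, agent-based modeling is the easier to illustrate compactly. The sketch below is a generic threshold-adoption model written purely to show the technique; it is not the authors' model, and every parameter is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n_agents = 200
neighbours = [rng.choice(n_agents, size=8, replace=False) for _ in range(n_agents)]
thresholds = rng.uniform(0.1, 0.9, size=n_agents)      # heterogeneous adoption thresholds
state = (rng.random(n_agents) < 0.05).astype(int)       # a few initial adopters

for step in range(50):                                   # synchronous updating
    frac = np.array([state[nb].mean() for nb in neighbours])
    state = np.where(frac >= thresholds, 1, state)       # adoption is absorbing in this toy model

print("adopters after 50 steps:", int(state.sum()), "of", n_agents)
```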
Stretching the Traditional Notion of Experiment in Computing: Explorative Experiments.
Schiaffonati, Viola
2016-06-01
Experimentation is today a 'hot' topic in computing. While experiments made with the support of computers, such as computer simulations, have received increasing attention from philosophers of science and technology, questions such as "what does it mean to do experiments in computer science and engineering and what are their benefits?" have emerged only recently as central to the debate over the status of the discipline. In this work we aim to show, also by means of paradigmatic examples, how the traditional notion of controlled experiment should be revised to take into account a part of the experimental practice in computing along the lines of experimentation as exploration. Taking inspiration from the discussion on exploratory experimentation in the philosophy of science (experimentation that is not theory-driven), we advance the idea of explorative experiments that, although not new, can help enlarge the debate about the nature and role of experimental methods in computing. In order to further refine this concept we recast explorative experiments as socio-technical experiments that test new technologies in their socio-technical contexts. We suggest that, when experiments are explorative, control should be understood in an a posteriori form, in opposition to the a priori form that usually takes place in traditional experimental contexts.
NASA Technical Reports Server (NTRS)
Wu, Chung-Hua
1993-01-01
This report represents a general theory applicable to axial, radial, and mixed flow turbomachines operating at subsonic and supersonic speeds with a finite number of blades of finite thickness. References reflect the evolution of computational methods used, from the inception of the theory in the 50's to the high-speed computer era of the 90's. Two kinds of relative stream surfaces, S(sub 1) and S(sub 2), are introduced for the purpose of obtaining a three-dimensional flow solution through the combination of two-dimensional flow solutions. Nonorthogonal curvilinear coordinates are used for the governing equations. Methods of computing transonic flow along S(sub 1) and S(sub 2) stream surfaces are given for special cases as well as for fully three-dimensional transonic flows. Procedures pertaining to the direct solutions and inverse solutions are presented. Information on shock wave locations and shapes needed for computations are discussed. Experimental data from a Deutsche Forschungs- und Versuchsanstalt fur Luft- und Raumfahrt e.V. (DFVLR) rotor and from a Chinese Academy of Sciences (CAS) transonic compressor rotor are compared with the computed flow properties.
3D animation model with augmented reality for natural science learning in elementary school
NASA Astrophysics Data System (ADS)
Hendajani, F.; Hakim, A.; Lusita, M. D.; Saputra, G. E.; Ramadhana, A. P.
2018-05-01
Many primary school students consider Natural Science a difficult lesson. Many subjects are not easily understood by students, especially materials that teach theories about natural processes, such as the rain cycle, condensation and many other processes. The difficulty students experience is that they cannot imagine the things that have been taught in the material. Material for practising some of these theories exists but is quite limited, as are video or simulation materials in the form of 2D animated images. Concepts in natural science lessons are also poorly understood by students. This Natural Science learning medium uses 3-dimensional (3D) animation models with augmented reality technology, which offers visualization of science lessons. The application was created to visualize processes in Natural Science subject matter, with the aim of improving students' conceptual understanding. The app is made to run on a personal computer equipped with a webcam; it displays a 3D animation when the camera recognizes the marker.
Identifying economics' place amongst academic disciplines: a science or a social science?
Hudson, John
2017-01-01
Different academic disciplines exhibit different styles, including styles in journal titles. Using data from the 2014 Research Excellence Framework (REF) in the UK we are able to identify the stylistic trends of different disciplines using 155,552 journal titles across all disciplines. Cluster analysis is then used to group the different disciplines together. The resulting identification fits the social sciences, the sciences and the arts and humanities reasonably well. Economics overall, fits best with philosophy, but the linkage is weak. When we divided economics into papers published in theory, econometrics and the remaining journals, the first two link with mathematics and computer science, particularly econometrics, and thence the sciences. The rest of economics then links with business and thence the social sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crabtree, George; Glotzer, Sharon; McCurdy, Bill
This report is based on a SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness.
The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mattsson, Ann E.
Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia's capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data is hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudo-potential computational codes. Using the tools discussed in SAND2012-7389 we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for Molybdenum.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard; Hack, James; Riley, Katherine
The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today's world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR's mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today's tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.
Brønsted acidity of protic ionic liquids: a modern ab initio valence bond theory perspective.
Patil, Amol Baliram; Mahadeo Bhanage, Bhalchandra
2016-09-21
Room temperature ionic liquids (ILs), especially protic ionic liquids (PILs), are used in many areas of the chemical sciences. Ionicity, the extent of proton transfer, is a key parameter which determines many physicochemical properties and in turn the suitability of PILs for various applications. The spectrum of computational chemistry techniques applied to investigate ionic liquids includes classical molecular dynamics, Monte Carlo simulations, ab initio molecular dynamics, Density Functional Theory (DFT), CCSD(t) etc. At the other end of the spectrum is another computational approach: modern ab initio Valence Bond Theory (VBT). VBT differs from molecular orbital theory based methods in the expression of the molecular wave function. The molecular wave function in the valence bond ansatz is expressed as a linear combination of valence bond structures. These structures include covalent and ionic structures explicitly. Modern ab initio valence bond theory calculations of representative primary and tertiary ammonium protic ionic liquids indicate that modern ab initio valence bond theory can be employed to assess the acidity and ionicity of protic ionic liquids a priori.
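As a schematic reminder of the ansatz described above (not the specific calculations reported in the paper), the valence bond wave function is expanded over explicitly covalent and ionic structures, and structure weights such as the Chirgwin-Coulson weights are what make the ionicity of a protic ionic liquid accessible to a priori analysis:

\[
\Psi_{\mathrm{VB}} \;=\; \sum_{K} c_{K}\,\Phi_{K},
\qquad
W_{K} \;=\; \sum_{L} c_{K}\, S_{KL}\, c_{L},
\]

where the \Phi_K are covalent and ionic valence bond structures, S_{KL} is their overlap matrix, and W_K is the weight attributed to structure K.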
Density functional theory in the solid state
Hasnip, Philip J.; Refson, Keith; Probert, Matt I. J.; Yates, Jonathan R.; Clark, Stewart J.; Pickard, Chris J.
2014-01-01
Density functional theory (DFT) has been used in many fields of the physical sciences, but none so successfully as in the solid state. From its origins in condensed matter physics, it has expanded into materials science, high-pressure physics and mineralogy, solid-state chemistry and more, powering entire computational subdisciplines. Modern DFT simulation codes can calculate a vast range of structural, chemical, optical, spectroscopic, elastic, vibrational and thermodynamic phenomena. The ability to predict structure–property relationships has revolutionized experimental fields, such as vibrational and solid-state NMR spectroscopy, where it is the primary method to analyse and interpret experimental spectra. In semiconductor physics, great progress has been made in the electronic structure of bulk and defect states despite the severe challenges presented by the description of excited states. Studies are no longer restricted to known crystallographic structures. DFT is increasingly used as an exploratory tool for materials discovery and computational experiments, culminating in ex nihilo crystal structure prediction, which addresses the long-standing difficult problem of how to predict crystal structure polymorphs from nothing but a specified chemical composition. We present an overview of the capabilities of solid-state DFT simulations in all of these topics, illustrated with recent examples using the CASTEP computer program. PMID:24516184
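For readers outside the field, what such DFT simulation codes solve at their core are the Kohn-Sham equations, a set of effective single-particle equations whose self-consistent density reproduces that of the interacting electrons (shown schematically here, not in any code-specific form):

\[
\Big[ -\tfrac{\hbar^{2}}{2m}\nabla^{2} + v_{\mathrm{ext}}(\mathbf{r}) + v_{\mathrm{H}}[n](\mathbf{r}) + v_{\mathrm{xc}}[n](\mathbf{r}) \Big]\,\psi_{i}(\mathbf{r}) = \varepsilon_{i}\,\psi_{i}(\mathbf{r}),
\qquad
n(\mathbf{r}) = \sum_{i}^{\mathrm{occ}} \lvert \psi_{i}(\mathbf{r}) \rvert^{2}.
\]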
Expertise transfer for expert system design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boose, J.H.
This book is about the Expertise Transfer System, a computer program which interviews experts and helps them build expert systems, i.e. computer programs that use knowledge from experts to make decisions and judgements under conditions of uncertainty. The techniques are useful to anyone who uses decision-making information based on the expertise of others. The methods can also be applied to personal decision-making. The interviewing methodology is borrowed from a branch of psychology called Personal Construct Theory. It is not necessary to use a computer to take advantage of the techniques from Personal Construct Theory; the fundamental procedures used by the Expertise Transfer System can be performed using paper and pencil. It is not necessary that the reader understand very much about computers to understand the ideas in this book. The few relevant concepts from computer science and expert systems that are needed are explained in a straightforward manner. Ideas from Personal Construct Psychology are also introduced as needed.
Gordon, M. J. C.
2015-01-01
Robin Milner's paper, ‘The use of machines to assist in rigorous proof’, introduces methods for automating mathematical reasoning that are a milestone in the development of computer-assisted theorem proving. His ideas, particularly his theory of tactics, revolutionized the architecture of proof assistants. His methodology for automating rigorous proof soundly, particularly his theory of type polymorphism in programming, led to major contributions to the theory and design of programming languages. His citation for the 1991 ACM A.M. Turing award, the most prestigious award in computer science, credits him with, among other achievements, ‘probably the first theoretically based yet practical tool for machine assisted proof construction’. This commentary was written to celebrate the 350th anniversary of the journal Philosophical Transactions of the Royal Society. PMID:25750147
Knowledge Discovery from Climate Data using Graph-Based Methods
NASA Astrophysics Data System (ADS)
Steinhaeuser, K.
2012-04-01
Climate and Earth sciences have recently experienced a rapid transformation from a historically data-poor to a data-rich environment, thus bringing them into the realm of the Fourth Paradigm of scientific discovery - a term coined by the late Jim Gray (Hey et al. 2009), the other three being theory, experimentation and computer simulation. In particular, climate-related observations from remote sensors on satellites and weather radars, in situ sensors and sensor networks, as well as outputs of climate or Earth system models from large-scale simulations, provide terabytes of spatio-temporal data. These massive and information-rich datasets offer a significant opportunity for advancing climate science and our understanding of the global climate system, yet current analysis techniques are not able to fully realize their potential benefits. We describe a class of computational approaches, specifically from the data mining and machine learning domains, which may be novel to the climate science domain and can assist in the analysis process. Computer scientists have developed spatial and spatio-temporal analysis techniques for a number of years now, and many of them may be applicable and/or adaptable to problems in climate science. We describe a large-scale, NSF-funded project aimed at addressing climate science question using computational analysis methods; team members include computer scientists, statisticians, and climate scientists from various backgrounds. One of the major thrusts is in the development of graph-based methods, and several illustrative examples of recent work in this area will be presented.
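The abstract does not specify how the graphs are built; a minimal, hypothetical sketch of one common construction (nodes are grid points, edges connect strongly correlated time series) is shown below, assuming NumPy and NetworkX are available. The threshold and the synthetic data are illustrative, not values from the project.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_nodes, n_steps = 20, 500
# Hypothetical data: each row is a climate time series at one grid point.
series = rng.standard_normal((n_nodes, n_steps))

corr = np.corrcoef(series)          # pairwise Pearson correlations
threshold = 0.5                     # assumed cutoff, not from the paper

G = nx.Graph()
G.add_nodes_from(range(n_nodes))
for i in range(n_nodes):
    for j in range(i + 1, n_nodes):
        if abs(corr[i, j]) >= threshold:
            G.add_edge(i, j, weight=float(corr[i, j]))

print(G.number_of_nodes(), G.number_of_edges())
```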
Center for Interface Science and Catalysis | Theory
Stanford School of Engineering. … to overcome challenges associated with the atomic-scale design of catalysts for chemical … computational methods we are developing a quantitative description of chemical processes at the solid-gas and …
Culture and Cognition in Information Technology Education
ERIC Educational Resources Information Center
Holvikivi, Jaana
2007-01-01
This paper aims at explaining the outcomes of information technology education for international students using anthropological theories of cultural schemas. Even though computer science and engineering are usually assumed to be culture-independent, the practice in classrooms seems to indicate that learning patterns depend on culture. The…
Lattice QCD Application Development within the US DOE Exascale Computing Project
NASA Astrophysics Data System (ADS)
Brower, Richard; Christ, Norman; DeTar, Carleton; Edwards, Robert; Mackenzie, Paul
2018-03-01
In October, 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020's. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.
Legendre modified moments for Euler's constant
NASA Astrophysics Data System (ADS)
Prévost, Marc
2008-10-01
Polynomial moments are often used for the computation of Gauss quadrature to stabilize the numerical calculation of the orthogonal polynomials, see [W. Gautschi, Computational aspects of orthogonal polynomials, in: P. Nevai (Ed.), Orthogonal Polynomials-Theory and Practice, NATO ASI Series, Series C: Mathematical and Physical Sciences, vol. 294. Kluwer, Dordrecht, 1990, pp. 181-216 [6]; W. Gautschi, On the sensitivity of orthogonal polynomials to perturbations in the moments, Numer. Math. 48(4) (1986) 369-382 [5]; W. Gautschi, On generating orthogonal polynomials, SIAM J. Sci. Statist. Comput. 3(3) (1982) 289-317 [4
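As background on the machinery this abstract references, Gauss-Legendre quadrature uses nodes and weights derived from orthogonal polynomials to approximate integrals; a small NumPy check of a 10-point rule follows. This is only an illustration of ordinary Gauss quadrature, not of the modified-moment algorithm of the paper.

```python
import numpy as np

# 10-point Gauss-Legendre rule on [-1, 1].
nodes, weights = np.polynomial.legendre.leggauss(10)

# Approximate the integral of exp(x) over [-1, 1]; the exact value is e - 1/e.
approx = np.sum(weights * np.exp(nodes))
exact = np.exp(1) - np.exp(-1)
print(approx, exact)   # agree to near machine precision
```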
Lattice QCD Application Development within the US DOE Exascale Computing Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brower, Richard; Christ, Norman; DeTar, Carleton
In October, 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020's. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.
Materials inspired by mathematics.
Kotani, Motoko; Ikeda, Susumu
2016-01-01
Our world is transforming into an interacting system of the physical world and the digital world. What will be the materials science in the new era? With the rising expectations of the rapid development of computers, information science and mathematical science including statistics and probability theory, 'data-driven materials design' has become a common term. There is knowledge and experience gained in the physical world in the form of know-how and recipes for the creation of material. An important key is how we establish vocabulary and grammar to translate them into the language of the digital world. In this article, we outline how materials science develops when it encounters mathematics, showing some emerging directions.
Data Understanding Applied to Optimization
NASA Technical Reports Server (NTRS)
Buntine, Wray; Shilman, Michael
1998-01-01
The goal of this research is to explore and develop software for supporting visualization and data analysis of search and optimization. Optimization is an ever-present problem in science. The theory of NP-completeness implies that the problems can only be resolved by increasingly smarter problem specific knowledge, possibly for use in some general purpose algorithms. Visualization and data analysis offers an opportunity to accelerate our understanding of key computational bottlenecks in optimization and to automatically tune aspects of the computation for specific problems. We will prototype systems to demonstrate how data understanding can be successfully applied to problems characteristic of NASA's key science optimization tasks, such as central tasks for parallel processing, spacecraft scheduling, and data transmission from a remote satellite.
The Complexity of Primary Care Psychology: Theoretical Foundations.
Smit, E H; Derksen, J J L
2015-07-01
How does primary care psychology deal with organized complexity? Has it escaped Newtonian science? Has it, as Weaver (1991) suggests, found a way to 'manage problems with many interrelated factors that cannot be dealt by statistical techniques'? Computer simulations and mathematical models in psychology are ongoing positive developments in the study of complex systems. However, the theoretical development of complex systems in psychology lags behind these advances. In this article we use complexity science to develop a theory on experienced complexity in the daily practice of primary care psychologists. We briefly answer the ontological question of what we see (from the perspective of primary care psychology) as reality, the epistemological question of what we can know, the methodological question of how to act, and the ethical question of what is good care. Following our empirical study, we conclude that complexity science can describe the experienced complexity of the psychologist and offer room for personalized client-centered care. Complexity science is slowly filling the gap between the dominant reductionist theory and complex daily practice.
Role of theory in space science
NASA Technical Reports Server (NTRS)
1983-01-01
The goal of theory is to understand how the fundamental laws of physics and chemistry give rise to the features of the universe. It is recommended that NASA establish independent theoretical research programs in planetary sciences and in astrophysics similar to the solar-system plasma-physics theory program, which is characterized by stable, long-term support for theorists in university departments, NASA centers, and other organizations engaged in research in topics relevant to present and future space-derived data. It is recommended that NASA keep these programs under review to gain full benefit from the resulting research and to assure opportunities for the inflow of new ideas and investigators. Also, provisions should be made by NASA for the computing needs of the theorists in the programs. Finally, it is recommended that NASA involve knowledgeable theorists in mission planning activities at all levels, from the formulation of long-term scientific strategies through the planning and operation of specific missions.
NASA Astrophysics Data System (ADS)
Gunceler, Deniz
Solvents are of great importance in many technological applications, but are difficult to study using standard, off-the-shelf ab initio electronic structure methods. This is because a single configuration of molecular positions in the solvent (a "snapshot" of the fluid) is not necessarily representative of the thermodynamic average. To obtain any thermodynamic averages (e.g. free energies), the phase space of the solvent must be sampled, typically using molecular dynamics. This greatly increases the computational cost involved in studying solvated systems. Joint density-functional theory has made its mark by being a computationally efficient yet rigorous theory by which to study solvation. It replaces the need for thermodynamic sampling with an effective continuum description of the solvent environment that is in-principle exact, computationally efficient and intuitive (easier to interpret). It has been very successful in aqueous systems, with potential applications in (among others) energy materials discovery, catalysis and surface science. In this dissertation, we develop accurate and fast joint density functional theories for complex, non-aqueous solvent environments, including organic solvents and room temperature ionic liquids, as well as new methods for calculating electron excitation spectra in such systems. These theories are then applied to a range of physical problems, from dendrite formation in lithium-metal batteries to the optical spectra of solvated ions.
First-principles data-driven discovery of transition metal oxides for artificial photosynthesis
NASA Astrophysics Data System (ADS)
Yan, Qimin
We develop a first-principles data-driven approach for rapid identification of transition metal oxide (TMO) light absorbers and photocatalysts for artificial photosynthesis using the Materials Project. Initially focusing on Cr, V, and Mn-based ternary TMOs in the database, we design a broadly-applicable multiple-layer screening workflow automating density functional theory (DFT) and hybrid functional calculations of bulk and surface electronic and magnetic structures. We further assess the electrochemical stability of TMOs in aqueous environments from computed Pourbaix diagrams. Several promising earth-abundant low band-gap TMO compounds with desirable band edge energies and electrochemical stability are identified by our computational efforts and then synergistically evaluated using high-throughput synthesis and photoelectrochemical screening techniques by our experimental collaborators at Caltech. Our joint theory-experiment effort has successfully identified new earth-abundant copper and manganese vanadate complex oxides that meet highly demanding requirements for photoanodes, substantially expanding the known space of such materials. By integrating theory and experiment, we validate our approach and develop important new insights into structure-property relationships for TMOs for oxygen evolution photocatalysts, paving the way for use of first-principles data-driven techniques in future applications. This work is supported by the Materials Project Predictive Modeling Center and the Joint Center for Artificial Photosynthesis through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231. Computational resources also provided by the Department of Energy through the National Energy Supercomputing Center.
Maximum Likelihood Estimation of Nonlinear Structural Equation Models with Ignorable Missing Data
ERIC Educational Resources Information Center
Lee, Sik-Yum; Song, Xin-Yuan; Lee, John C. K.
2003-01-01
The existing maximum likelihood theory and its computer software in structural equation modeling are established on the basis of linear relationships among latent variables with fully observed data. However, in social and behavioral sciences, nonlinear relationships among the latent variables are important for establishing more meaningful models…
Electronic Structure Theory | Materials Science | NREL
… design and discover materials for energy applications. This includes detailed studies of the physical … computing. Key Research Areas: Materials by Design. NREL leads the U.S. Department of Energy's Center for Next Generation of Materials by Design, which incorporates metastability and synthesizability.
Factor Structure and Reliability of Test Items for Saudi Teacher Licence Assessment
ERIC Educational Resources Information Center
Alsadaawi, Abdullah Saleh
2017-01-01
The Saudi National Assessment Centre administers the Computer Science Teacher Test for teacher certification. The aim of this study is to explore gender differences in candidates' scores, and investigate dimensionality, reliability, and differential item functioning using confirmatory factor analysis and item response theory. The confirmatory…
A Wittgenstein Approach to the Learning of OO-Modeling
ERIC Educational Resources Information Center
Holmboe, Christian
2004-01-01
The paper uses Ludwig Wittgenstein's theories about the relationship between thought, language, and objects of the world to explore the assumption that OO-thinking resembles natural thinking. The paper imports from research in linguistic philosophy to computer science education research. I show how UML class diagrams (i.e., an artificial…
Constructive Models of Discrete and Continuous Physical Phenomena
2014-02-08
BOURKE, T., CAILLAUD, B., AND POUZET, M. The fundamentals of hybrid systems modelers. Journal of Computer and System Sciences 78, 3 (2012), 877–910... 8. BENVENISTE, A., BOURKE, T., CAILLAUD, B., AND POUZET, M. Index theory for hybrid DAE systems (abstract and slides). In Synchronous Programming
An Interactive Learning Environment for Information and Communication Theory
ERIC Educational Resources Information Center
Hamada, Mohamed; Hassan, Mohammed
2017-01-01
Interactive learning tools are emerging as effective educational materials in the area of computer science and engineering. It is a research domain that is rapidly expanding because of its positive impacts on motivating and improving students' performance during the learning process. This paper introduces an interactive learning environment for…
The Application of Peer Teaching in Digital Forensics Education
ERIC Educational Resources Information Center
Govan, Michelle
2016-01-01
The field of digital forensics requires a multidisciplinary understanding of a range of diverse subjects, but is interdisciplinary (in using principles, techniques and theories from other disciplines) encompassing both computer and forensic science. This requires that practitioners have a deep technical knowledge and understanding, but that they…
Concept Learning and Heuristic Classification in Weak-Theory Domains
1990-03-01
age and noise-induced cochlear, age .gt. 60, noise-induced cochlear, air(mild), age-induced cochlear, history(noise), normal_ear, speech(poor), acoustic_neuroma... Annual review of computer science. Machine Learning, 4, 1990. (to appear). [18] R.T. Duran. Concept learning with incomplete data sets. Master's thesis
Rethinking Technology-Enhanced Physics Teacher Education: From Theory to Practice
ERIC Educational Resources Information Center
Milner-Bolotin, Marina
2016-01-01
This article discusses how modern technology, such as electronic response systems, PeerWise system, data collection and analysis tools, computer simulations, and modeling software can be used in physics methods courses to promote teacher-candidates' professional competencies and their positive attitudes about mathematics and science education. We…
Fermilab | Science at Fermilab | Experiments & Projects | Cosmic Frontier
Theoretical Branches in Teaching Computer Science
ERIC Educational Resources Information Center
Habiballa, Hashim; Kmet, Tibor
2004-01-01
The present paper describes an educational experiment dealing with teaching the theory of formal languages and automata as well as their application concepts. It presents a practical application of an educational experiment and initial results based on comparative instruction of two samples of students (n = 56). The application concept should…
Fourier Transforms Simplified: Computing an Infrared Spectrum from an Interferogram
ERIC Educational Resources Information Center
Hanley, Quentin S.
2012-01-01
Fourier transforms are used widely in chemistry and allied sciences. Examples include infrared, nuclear magnetic resonance, and mass spectroscopies. A thorough understanding of Fourier methods assists the understanding of microscopy, X-ray diffraction, and diffraction gratings. The theory of Fourier transforms has been presented in this "Journal",…
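To make the core step concrete, the sketch below transforms a synthetic two-line interferogram into a spectrum with NumPy's FFT. Real FT-IR processing additionally involves apodization, zero filling and phase correction, so this is only a minimal illustration, not the article's full procedure.

```python
import numpy as np

n = 2048
x = np.arange(n) / n                 # optical path difference (one unit total)
# Synthetic interferogram: two cosines correspond to two spectral "lines".
interferogram = np.cos(2 * np.pi * 5 * x) + 0.5 * np.cos(2 * np.pi * 12 * x)

spectrum = np.abs(np.fft.rfft(interferogram))
freq = np.fft.rfftfreq(n, d=1 / n)   # frequency axis (cycles per unit path)

top_two = freq[np.argsort(spectrum)[-2:]]
print(sorted(top_two))               # [5.0, 12.0]
```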
The Role of Working Memory in Metaphor Production and Comprehension
ERIC Educational Resources Information Center
Chiappe, Dan L.; Chiappe, Penny
2007-01-01
The following tested Kintsch's [Kintsch, W. (2000). "Metaphor comprehension: a computational theory." "Psychonomic Bulletin & Review," 7, 257-266 and Kintsch, W. (2001). "Predication." "Cognitive Science," 25, 173-202] Predication Model, which predicts that working memory capacity is an important factor in metaphor processing. In support of his…
To the Cloud! A Grassroots Proposal to Accelerate Brain Science Discovery
Vogelstein, Joshua T.; Mensh, Brett; Hausser, Michael; Spruston, Nelson; Evans, Alan; Kording, Konrad; Amunts, Katrin; Ebell, Christoph; Muller, Jeff; Telefont, Martin; Hill, Sean; Koushika, Sandhya P.; Cali, Corrado; Valdés-Sosa, Pedro Antonio; Littlewood, Peter; Koch, Christof; Saalfeld, Stephan; Kepecs, Adam; Peng, Hanchuan; Halchenko, Yaroslav O.; Kiar, Gregory; Poo, Mu-Ming; Poline, Jean-Baptiste; Milham, Michael P.; Schaffer, Alyssa Picchini; Gidron, Rafi; Okano, Hideyuki; Calhoun, Vince D; Chun, Miyoung; Kleissas, Dean M.; Vogelstein, R. Jacob; Perlman, Eric; Burns, Randal; Huganir, Richard; Miller, Michael I.
2018-01-01
The revolution in neuroscientific data acquisition is creating an analysis challenge. We propose leveraging cloud-computing technologies to enable large-scale neurodata storing, exploring, analyzing, and modeling. This utility will empower scientists globally to generate and test theories of brain function and dysfunction. PMID:27810005
Effects of Using Historical Microworlds on Conceptual Change: A P-Prim Analysis
ERIC Educational Resources Information Center
Masson, Steve; Legendre, Marie-Francoise
2008-01-01
This study examines the effects of using historical microworlds on conceptual change in mechanics. Historical microworlds combine history of science and microworld through a computer based interactive learning environment that respects and represents historic conceptions or theories. Six grade 5 elementary students participated individually to…
Design and Performance Frameworks for Constructing Problem-Solving Simulations
ERIC Educational Resources Information Center
Stevens, Rons; Palacio-Cayetano, Joycelin
2003-01-01
Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks…
Measuring Graph Comprehension, Critique, and Construction in Science
NASA Astrophysics Data System (ADS)
Lai, Kevin; Cabrera, Julio; Vitale, Jonathan M.; Madhok, Jacquie; Tinker, Robert; Linn, Marcia C.
2016-08-01
Interpreting and creating graphs plays a critical role in scientific practice. The K-12 Next Generation Science Standards call for students to use graphs for scientific modeling, reasoning, and communication. To measure progress on this dimension, we need valid and reliable measures of graph understanding in science. In this research, we designed items to measure graph comprehension, critique, and construction and developed scoring rubrics based on the knowledge integration (KI) framework. We administered the items to over 460 middle school students. We found that the items formed a coherent scale and had good reliability using both item response theory and classical test theory. The KI scoring rubric showed that most students had difficulty linking graphs features to science concepts, especially when asked to critique or construct graphs. In addition, students with limited access to computers as well as those who speak a language other than English at home have less integrated understanding than others. These findings point to the need to increase the integration of graphing into science instruction. The results suggest directions for further research leading to comprehensive assessments of graph understanding.
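The study reports reliability from both item response theory and classical test theory; as a small illustration of the latter, Cronbach's alpha can be computed directly from a student-by-item score matrix. The data below are simulated, not the study's.

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = students, columns = items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variance_sum = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variance_sum / total_variance)

rng = np.random.default_rng(1)
ability = rng.normal(size=(200, 1))
# Eight dichotomous items driven by a common ability plus noise.
items = (ability + rng.normal(scale=1.0, size=(200, 8)) > 0).astype(int)
print(round(cronbach_alpha(items), 2))
```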
A new perspective on the perceptual selectivity of attention under load.
Giesbrecht, Barry; Sy, Jocelyn; Bundesen, Claus; Kyllingsbaek, Søren
2014-05-01
The human attention system helps us cope with a complex environment by supporting the selective processing of information relevant to our current goals. Understanding the perceptual, cognitive, and neural mechanisms that mediate selective attention is a core issue in cognitive neuroscience. One prominent model of selective attention, known as load theory, offers an account of how task demands determine when information is selected and an account of the efficiency of the selection process. However, load theory has several critical weaknesses that suggest that it is time for a new perspective. Here we review the strengths and weaknesses of load theory and offer an alternative biologically plausible computational account that is based on the neural theory of visual attention. We argue that this new perspective provides a detailed computational account of how bottom-up and top-down information is integrated to provide efficient attentional selection and allocation of perceptual processing resources. © 2014 New York Academy of Sciences.
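The neural theory of visual attention invoked here builds on Bundesen's TVA, whose central rate equation (quoted as general background rather than as a restatement of the authors' model) states that the rate at which object x is categorized as i depends on sensory evidence, a perceptual bias, and a normalized attentional weight:

\[
v(x, i) \;=\; \eta(x, i)\,\beta_{i}\,\frac{w_{x}}{\sum_{z \in S} w_{z}},
\]

where \eta(x, i) is the strength of the sensory evidence that x belongs to category i, \beta_i is the bias toward category i, and w_x is the attentional weight of object x among all objects z in the visual field S.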
Why cognitive science needs philosophy and vice versa.
Thagard, Paul
2009-04-01
Contrary to common views that philosophy is extraneous to cognitive science, this paper argues that philosophy has a crucial role to play in cognitive science with respect to generality and normativity. General questions include the nature of theories and explanations, the role of computer simulation in cognitive theorizing, and the relations among the different fields of cognitive science. Normative questions include whether human thinking should be Bayesian, whether decision making should maximize expected utility, and how norms should be established. These kinds of general and normative questions make philosophical reflection an important part of progress in cognitive science. Philosophy operates best, however, not with a priori reasoning or conceptual analysis, but rather with empirically informed reflection on a wide range of findings in cognitive science. Copyright © 2009 Cognitive Science Society, Inc.
Sculpting Computational-Level Models.
Blokpoel, Mark
2017-06-27
In this commentary, I advocate for strict relations between Marr's levels of analysis. Under a strict relationship, each level is exactly implemented by the subordinate level. This yields two benefits. First, it brings consistency for multilevel explanations. Second, similar to how a sculptor chisels away superfluous marble, a modeler can chisel a computational-level model by applying constraints. By sculpting the model, one restricts the (potentially infinitely large) set of possible algorithmic- and implementational-level theories. Copyright © 2017 Cognitive Science Society, Inc.
Statistical mechanics of complex neural systems and high dimensional data
NASA Astrophysics Data System (ADS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-03-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
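As a pedagogical illustration of the message passing mentioned above (generic sum-product on a three-node chain, unrelated to the review's specific derivations), the following NumPy sketch computes exact marginals from forward and backward messages:

```python
import numpy as np

# Chain x0 -- x1 -- x2 of binary variables with a shared pairwise potential.
psi = np.array([[1.0, 0.5],
                [0.5, 1.0]])           # favours equal neighbouring states
unary = np.array([[0.9, 0.1],          # local evidence for x0, x1, x2
                  [0.5, 0.5],
                  [0.2, 0.8]])

# Sum-product messages along the chain.
m01 = psi.T @ unary[0]                 # message x0 -> x1
m12 = psi.T @ (unary[1] * m01)         # message x1 -> x2
m21 = psi @ unary[2]                   # message x2 -> x1
m10 = psi @ (unary[1] * m21)           # message x1 -> x0

def normalize(p):
    return p / p.sum()

print(normalize(unary[0] * m10))        # marginal of x0
print(normalize(unary[1] * m01 * m21))  # marginal of x1
print(normalize(unary[2] * m12))        # marginal of x2
```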
Robust flow stability: Theory, computations and experiments in near wall turbulence
NASA Astrophysics Data System (ADS)
Bobba, Kumar Manoj
Helmholtz established the field of hydrodynamic stability with his pioneering work in 1868. From then on, hydrodynamic stability became an important tool in understanding various fundamental fluid flow phenomena in engineering (mechanical, aeronautics, chemical, materials, civil, etc.) and science (astrophysics, geophysics, biophysics, etc.), and turbulence in particular. However, there are many discrepancies between classical hydrodynamic stability theory and experiments. In this thesis, the limitations of traditional hydrodynamic stability theory are shown and a framework for robust flow stability theory is formulated. A host of new techniques like gramians, singular values, operator norms, etc. are introduced to understand the role of various kinds of uncertainty. An interesting feature of this framework is the close interplay between theory and computations. It is shown that a subset of the Navier-Stokes equations is globally nonlinearly stable for all Reynolds numbers. Yet, invoking this new theory, it is shown that these equations produce structures (vortices and streaks) as seen in the experiments. The experiments are done in a zero-pressure-gradient transitional boundary layer on a flat plate in a free-surface tunnel. Digital particle image velocimetry and MEMS-based laser Doppler velocimetry and shear stress sensors have been used to make quantitative measurements of the flow. Various theoretical and computational predictions are in excellent agreement with the experimental data. A closely related topic of modeling, simulation and complexity reduction of large mechanics problems with multiple spatial and temporal scales is also studied. A method that rigorously quantifies the important scales and automatically gives models of the problem to various levels of accuracy is introduced. Computations done using spectral methods are presented.
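Among the tools listed in the abstract (gramians, singular values, operator norms), the controllability Gramian of a stable linear system is obtained from a Lyapunov equation; the SciPy sketch below uses an arbitrary stable example system, not the boundary-layer model of the thesis.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])        # stable system matrix (eigenvalues -1, -3)
B = np.array([[1.0],
              [1.0]])

# Controllability Gramian W solves  A W + W A^T + B B^T = 0.
W = solve_continuous_lyapunov(A, -B @ B.T)

# Singular values of W rank the directions most excitable by disturbances.
print(np.linalg.svd(W, compute_uv=False))
```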
Wichchukit, Sukanya; O'Mahony, Michael
2010-01-01
This article reviews a beneficial effect of technology transfer from Electrical Engineering to Food Sensory Science. Specifically, it reviews the recent adoption in Food Sensory Science of the receiver operating characteristic (ROC) curve, a tool that is incorporated in the theory of signal detection. Its use allows the information processing that takes place in the brain during sensory difference testing to be studied and understood. The review deals with how Signal Detection Theory, also called Thurstonian modeling, led to the adoption of a more sophisticated way of analyzing the data from sensory difference tests, by introducing the signal-to-noise ratio, d', as a fundamental measure of perceived small sensory differences. Generally, the method of computation of d' is a simple matter for some of the better known difference tests like the triangle, duo-trio and 2-AFC. However, there are occasions when these tests are not appropriate and other tests like the same-different and the A Not-A test are more suitable. Yet, for these, it is necessary to understand how the brain processes information during the test before d' can be computed. It is for this task that the ROC curve has a particular use. © 2010 Institute of Food Technologists®
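For the simplest yes/no case alluded to above, d' is the separation between the signal and noise distributions in standard-deviation units, computed from hit and false-alarm rates. The sketch below is the textbook equal-variance formula with made-up rates; the article's treatment of same-different and A Not-A tests requires more detailed decision models.

```python
from scipy.stats import norm

def d_prime(hit_rate, false_alarm_rate):
    """Equal-variance Gaussian signal detection: d' = z(H) - z(FA)."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

# Hypothetical panel data: 80% hits, 30% false alarms.
print(round(d_prime(0.80, 0.30), 2))   # about 1.37
```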
Marek, A; Blum, V; Johanni, R; Havu, V; Lang, B; Auckenthaler, T; Heinecke, A; Bungartz, H-J; Lederer, H
2014-05-28
Obtaining the eigenvalues and eigenvectors of large matrices is a key problem in electronic structure theory and many other areas of computational science. The computational effort formally scales as O(N^3) with the size of the investigated problem, N (e.g. the electron count in electronic structure theory), and thus often defines the system size limit that practical calculations cannot overcome. In many cases, more than just a small fraction of the possible eigenvalue/eigenvector pairs is needed, so that iterative solution strategies that focus only on a few eigenvalues become ineffective. Likewise, it is not always desirable or practical to circumvent the eigenvalue solution entirely. We here review some current developments regarding dense eigenvalue solvers and then focus on the Eigenvalue soLvers for Petascale Applications (ELPA) library, which facilitates the efficient algebraic solution of symmetric and Hermitian eigenvalue problems for dense matrices that have real-valued and complex-valued matrix entries, respectively, on parallel computer platforms. ELPA addresses standard as well as generalized eigenvalue problems, relying on the well documented matrix layout of the Scalable Linear Algebra PACKage (ScaLAPACK) library but replacing all actual parallel solution steps with subroutines of its own. For these steps, ELPA significantly outperforms the corresponding ScaLAPACK routines and proprietary libraries that implement the ScaLAPACK interface (e.g. Intel's MKL). The most time-critical step is the reduction of the matrix to tridiagonal form and the corresponding backtransformation of the eigenvectors. ELPA offers both a one-step tridiagonalization (successive Householder transformations) and a two-step transformation that is more efficient especially towards larger matrices and larger numbers of CPU cores. ELPA is based on the MPI standard, with an early hybrid MPI-OpenMP implementation available as well. Scalability beyond 10,000 CPU cores for problem sizes arising in the field of electronic structure theory is demonstrated for current high-performance computer architectures such as Cray or Intel/Infiniband. For a matrix of dimension 260,000, scalability up to 295,000 CPU cores has been shown on BlueGene/P.
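On a single workstation the same mathematical task that ELPA parallelizes, a dense symmetric or generalized symmetric eigenproblem, can be sketched with LAPACK through SciPy. This is only a small-scale analogue for orientation, not the ELPA API.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 500
M = rng.standard_normal((n, n))
H = (M + M.T) / 2                        # symmetric "Hamiltonian"
S = np.eye(n) + 0.01 * (M @ M.T) / n     # symmetric positive-definite overlap

# Generalized symmetric eigenproblem  H c = lambda S c  (all pairs, dense).
eigenvalues, eigenvectors = eigh(H, S)
print(eigenvalues[:5])
```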
Computational rationality: linking mechanism and behavior through bounded utility maximization.
Lewis, Richard L; Howes, Andrew; Singh, Satinder
2014-04-01
We propose a framework for including information-processing bounds in rational analyses. It is an application of bounded optimality (Russell & Subramanian, 1995) to the challenges of developing theories of mechanism and behavior. The framework is based on the idea that behaviors are generated by cognitive mechanisms that are adapted to the structure of not only the environment but also the mind and brain itself. We call the framework computational rationality to emphasize the incorporation of computational mechanism into the definition of rational action. Theories are specified as optimal program problems, defined by an adaptation environment, a bounded machine, and a utility function. Such theories yield different classes of explanation, depending on the extent to which they emphasize adaptation to bounds, and adaptation to some ecology that differs from the immediate local environment. We illustrate this variation with examples from three domains: visual attention in a linguistic task, manual response ordering, and reasoning. We explore the relation of this framework to existing "levels" approaches to explanation, and to other optimality-based modeling approaches. Copyright © 2014 Cognitive Science Society, Inc.
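A toy optimal program problem in the spirit described above (entirely hypothetical numbers, chosen only to show the structure of adaptation environment, bounded machine and utility function) can be written in a few lines: the agent chooses how long to deliberate, trading accuracy against time cost.

```python
import numpy as np

def expected_utility(k, reward=10.0, time_cost=0.4):
    """Utility of deliberating for k steps under an assumed accuracy curve."""
    p_correct = 1 - np.exp(-0.5 * k)     # more computation, better answers
    return p_correct * reward - time_cost * k

candidates = range(0, 21)
best_k = max(candidates, key=expected_utility)
print(best_k, round(expected_utility(best_k), 2))   # bounded-optimal choice
```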
[Forensic evidence-based medicine in computer communication networks].
Qiu, Yun-Liang; Peng, Ming-Qi
2013-12-01
As an important component of judicial expertise, forensic science is broad and highly specialized. With the development of network technology, the increase of information resources, and the improvement of people's legal consciousness, forensic scientists encounter many new problems and have been required to meet higher evidentiary standards in litigation. In view of this, an evidence-based concept should be established in forensic medicine. We should find the most suitable methods in the forensic science field and other related areas to solve specific problems in the evidence-based mode. Evidence-based practice can solve problems in the legal medical field, and it will play a great role in promoting the progress and development of forensic science. This article reviews the basic theory of evidence-based medicine and its effect, approach, method, and evaluation in forensic medicine in order to discuss the application value of forensic evidence-based medicine in computer communication networks.
Research briefing on contemporary problems in plasma science
NASA Technical Reports Server (NTRS)
1991-01-01
An overview is presented of the broad perspective of all plasma science. Detailed discussions are given of scientific opportunities in various subdisciplines of plasma science. The first subdiscipline to be discussed is the area where the contemporary applications of plasma science are the most widespread, low temperature plasma science. Opportunities for new research and technology development that have emerged as byproducts of research in magnetic and inertial fusion are then highlighted. Then follows a discussion of new opportunities in ultrafast plasma science opened up by recent developments in laser and particle beam technology. Next, research that uses smaller scale facilities is discussed, covering first non-neutral plasmas and then basic plasma experiments. Discussions of analytic theory and computational plasma physics and of space and astrophysical plasma physics are then presented.
Materials inspired by mathematics
Kotani, Motoko; Ikeda, Susumu
2016-01-01
Our world is transforming into an interacting system of the physical world and the digital world. What will materials science be in the new era? With the rising expectations of the rapid development of computers, information science and mathematical science including statistics and probability theory, ‘data-driven materials design’ has become a common term. There is knowledge and experience gained in the physical world in the form of know-how and recipes for the creation of material. An important key is how we establish vocabulary and grammar to translate them into the language of the digital world. In this article, we outline how materials science develops when it encounters mathematics, showing some emerging directions. PMID:27877877
Coalescent: an open-science framework for importance sampling in coalescent theory.
Tewari, Susanta; Spouge, John L
2015-01-01
Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following the infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only effective sample size. Here, we evaluate proposals in the coalescent literature, to discover that the order of efficiency among the three importance sampling schemes changes when one considers running time as well as effective sample size. We also describe a computational technique called "just-in-time delegation" available to improve the trade-off between running time and precision by constructing improved importance sampling schemes from existing ones. Thus, our systems approach is a potential solution to the "2^8 programs problem" highlighted by Felsenstein, because it provides the flexibility to include or exclude various features of similar coalescent models or importance sampling schemes.
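The framework itself is a Java library; as a minimal sketch of the underlying technique, the following Python snippet estimates a likelihood-type integral by importance sampling and reports the effective sample size, the efficiency measure discussed above. The toy target, proposal, and integrand are assumptions for illustration and are not the coalescent-specific proposals implemented in the framework.

```python
# Minimal importance-sampling estimator of an integral
#   L = E_p[f(X)], estimated as (1/n) * sum f(x_i) * p(x_i)/q(x_i),  x_i ~ q.
# This is only a generic sketch of the technique; the coalescent-specific
# proposal distributions from the literature are not reproduced here.
import numpy as np

rng = np.random.default_rng(1)

def f(x):                      # toy integrand standing in for P(data | genealogy)
    return np.exp(-0.5 * (x - 2.0) ** 2)

def p_pdf(x):                  # "true" sampling density (standard normal)
    return np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)

def q_pdf(x, mu=1.0):          # proposal density shifted toward where f is large
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)

n = 100_000
x = rng.normal(loc=1.0, scale=1.0, size=n)        # draws from the proposal q
weights = p_pdf(x) / q_pdf(x)
estimate = np.mean(f(x) * weights)

# Effective sample size, the efficiency measure discussed in the abstract.
ess = weights.sum() ** 2 / np.sum(weights ** 2)
print(f"IS estimate: {estimate:.4f}, ESS: {ess:.0f} of {n}")
```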
Soil Heat Flow. Physical Processes in Terrestrial and Aquatic Ecosystems, Transport Processes.
ERIC Educational Resources Information Center
Simpson, James R.
These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. Soil heat flow and the resulting soil temperature distributions have ecological consequences…
ERIC Educational Resources Information Center
Charnitski, Christina Wotell; Harvey, Francis A.
This paper presents the theories of L.S. Vygotsky as a conceptual framework for implementing instruction that supports concept development and promotes higher level thinking skills in students. Three major components (i.e., language, scientific and spontaneous concepts, and the zone of proximal development) of Vygotsky's socio-cultural-historical…
Reality Is Our Laboratory: Communities of Practice in Applied Computer Science
ERIC Educational Resources Information Center
Rohde, M.; Klamma, R.; Jarke, M.; Wulf, V.
2007-01-01
The present paper presents a longitudinal study of the course "High-tech Entrepreneurship and New Media." The course design is based on socio-cultural theories of learning and considers the role of social capital in entrepreneurial networks. By integrating student teams into the communities of practice of local start-ups, we offer…
Engineering Education through the Latina Lens
ERIC Educational Resources Information Center
Villa, Elsa Q.; Wandermurem, Luciene; Hampton, Elaine M.; Esquinca, Alberto
2016-01-01
Less than 20% of undergraduates earning a degree in engineering are women, and, even more alarming, minority women earn a mere 3.1% of those degrees. This paper reports on a qualitative study examining Latinas' identity development toward and in undergraduate engineering and computer science studies using a sociocultural theory of learning. Three…
Information Storage and Retrieval Scientific Report No. ISR-22.
ERIC Educational Resources Information Center
Salton, Gerard
The twenty-second in a series, this report describes research in information organization and retrieval conducted by the Department of Computer Science at Cornell University. The report covers work carried out during the period summer 1972 through summer 1974 and is divided into four parts: indexing theory, automatic content analysis, feedback…
ERIC Educational Resources Information Center
Gilstrap, Donald L.
2013-01-01
In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…
Proof Theory for Authorization Logic and Its Application to a Practical File System
2009-12-01
Holland, 1969. [71] Jean-Yves Girard. Linear logic. Theoretical Computer Science, 50:1–102, 1987. [72] Jean-Yves Girard, Paul Taylor, and Yves Lafont...2009. Online at http://ecommons.library.cornell.edu/handle/1813/13679. [133] S. Shepler, B. Callaghan, D. Robinson, R. Thurlow, C. Beame, M. Eisler, and
An Analysis of the Structure and Evolution of Networks
ERIC Educational Resources Information Center
Hua, Guangying
2011-01-01
As network research receives more and more attention from both academic researchers and practitioners, network analysis has become a fast growing field attracting many researchers from diverse fields such as physics, computer science, and sociology. This dissertation provides a review of theory and research on different real data sets from the…
Situational Leadership Theory as a Foundation for a Blended Learning Framework
ERIC Educational Resources Information Center
Meier, David
2016-01-01
With the rise of computer technology, blended learning has found its way into teaching. The technology continues to evolve, challenging teachers and lecturers alike. Most studies on blended learning focus on the practical or applied side and use essentially pedagogical concepts. This study demonstrates that the leadership sciences can…
ERIC Educational Resources Information Center
Walters, R. A.; Carey, G. F.
These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. Primary production in aquatic ecosystems is carried out by phytoplankton, microscopic plants…
A Functional Programming Approach to AI Search Algorithms
ERIC Educational Resources Information Center
Panovics, Janos
2012-01-01
The theory and practice of search algorithms related to state-space represented problems form the major part of the introductory course of Artificial Intelligence at most of the universities and colleges offering a degree in the area of computer science. Students usually meet these algorithms only in some imperative or object-oriented language…
Capturing Problem-Solving Processes Using Critical Rationalism
ERIC Educational Resources Information Center
Chitpin, Stephanie; Simon, Marielle
2012-01-01
The examination of problem-solving processes continues to be a current research topic in education. Knowing how to solve problems is not only a key aspect of learning mathematics but is also at the heart of cognitive theories, linguistics, artificial intelligence, and computer science. Problem solving is a multistep, higher-order cognitive task…
Cognitive biases, linguistic universals, and constraint-based grammar learning.
Culbertson, Jennifer; Smolensky, Paul; Wilson, Colin
2013-07-01
According to classical arguments, language learning is both facilitated and constrained by cognitive biases. These biases are reflected in linguistic typology-the distribution of linguistic patterns across the world's languages-and can be probed with artificial grammar experiments on child and adult learners. Beginning with a widely successful approach to typology (Optimality Theory), and adapting techniques from computational approaches to statistical learning, we develop a Bayesian model of cognitive biases and show that it accounts for the detailed pattern of results of artificial grammar experiments on noun-phrase word order (Culbertson, Smolensky, & Legendre, 2012). Our proposal has several novel properties that distinguish it from prior work in the domains of linguistic theory, computational cognitive science, and machine learning. This study illustrates how ideas from these domains can be synthesized into a model of language learning in which biases range in strength from hard (absolute) to soft (statistical), and in which language-specific and domain-general biases combine to account for data from the macro-level scale of typological distribution to the micro-level scale of learning by individuals. Copyright © 2013 Cognitive Science Society, Inc.
NASA Astrophysics Data System (ADS)
Clay, London; Menger, Karl; Rota, Gian-Carlo; Euclid, Alexandria; Siegel, Edward
P ≠ NP MP proof is by computer-"science"/SEANCE(!!!)(CS) computational-"intelligence" lingo jargonial-obfuscation(JO) NATURAL-Intelligence(NI) DISambiguation! CS P =(?)= NP MEANS (Deterministic)(PC) =(?)= (Non-D)(PC), i.e. D(P) =(?)= N(P). For inclusion(equality) vs. exclusion(inequality) irrelevant (P) simply cancels!!! (Equally any/all other CCs IF both sides identical). Crucial question left: (D) =(?)= (ND), i.e. D =(?)= N. Algorithmics [Sipser, Intro. Thy. Comp. ('97), p.49, Fig.1.15]!!!
NASA Astrophysics Data System (ADS)
Honing, Henkjan; Zuidema, Willem
2014-09-01
The future of cognitive science will be about bridging neuroscience and behavioral studies, with essential roles played by comparative biology, formal modeling, and the theory of computation. Nowhere will this integration be more strongly needed than in understanding the biological basis of language and music. We thus strongly sympathize with the general framework that Fitch [1] proposes, and welcome the remarkably broad and readable review he presents to support it.
The Analysis of Visual Motion: From Computational Theory to Neuronal Mechanisms.
1986-12-01
neurons. Brain Res. 151:599-603. Frost, B. J., Nakayama, K. 1983. Single visual neurons code opposing motion independent of direction. Science 220:744... Biol. Cybern. 42:195-204. Holden, A. L. 1977. Responses of directional ganglion cells in the pigeon retina. J. Physiol. 270:253-269. Horn, B. K. P... R. Soc. Lond. B 223:165-175. Longuet-Higgins, H. C., Prazdny, K. 1981. The interpretation
Dougherty, Stephen
2010-01-01
This essay examines the unconscious as modeled by cognitive science and compares it to the psychoanalytic unconscious. In making this comparison, the author underscores the important but usually overlooked fact that computational psychology and psychoanalytic theory are both varieties of posthumanism. He argues that if posthumanism is to advance a vision for our future that is no longer fixated on a normative image of the human, then its own normative claims about the primacy of Darwinian functioning must be disrupted and undermined through a renewed emphasis on its Freudian heritage.
Visual design for the user interface, Part 1: Design fundamentals.
Lynch, P J
1994-01-01
Digital audiovisual media and computer-based documents will be the dominant forms of professional communication in both clinical medicine and the biomedical sciences. The design of highly interactive multimedia systems will shortly become a major activity for biocommunications professionals. The problems of human-computer interface design are intimately linked with graphic design for multimedia presentations and on-line document systems. This article outlines the history of graphic interface design and the theories that have influenced the development of today's major graphic user interfaces.
Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level
NASA Astrophysics Data System (ADS)
Christiansen, Henning
2004-09-01
Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without this. An effective methodology is described which has been developed for a target group of university students with different backgrounds such as natural science or humanities. It has been developed for a course that integrates theoretical material on computer languages and abstract machines with practical programming techniques. Prolog used as meta-language for describing language issues is the central instrument in the approach: Formal descriptions become running prototypes that are easy and appealing to test and modify, and can be extended into analyzers, interpreters, and tools such as tracers and debuggers. Experience shows a high learning curve, especially when the principles are extended into a learning-by-doing approach having the students develop such descriptions themselves from an informal introduction.
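The course described above uses Prolog as its meta-language; purely as an analogous sketch of the idea that a formal language description can double as a runnable prototype, the following Python snippet defines a tiny expression language and an interpreter for it. The grammar, node names, and example are invented for illustration and are not taken from the course.

```python
# Analogous Python sketch of an executable formal description: a toy expression
# language given as an abstract syntax and an interpreter over it. The course
# itself uses Prolog; this merely illustrates the "running prototype" idea.
from dataclasses import dataclass

@dataclass
class Num:
    value: int

@dataclass
class Add:
    left: object
    right: object

@dataclass
class Mul:
    left: object
    right: object

def evaluate(node):
    """Evaluate an abstract syntax tree of the toy expression language."""
    if isinstance(node, Num):
        return node.value
    if isinstance(node, Add):
        return evaluate(node.left) + evaluate(node.right)
    if isinstance(node, Mul):
        return evaluate(node.left) * evaluate(node.right)
    raise TypeError(f"unknown node: {node!r}")

# (2 + 3) * 4  ==>  20
print(evaluate(Mul(Add(Num(2), Num(3)), Num(4))))
```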
Computational methods to extract meaning from text and advance theories of human cognition.
McNamara, Danielle S
2011-01-01
Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in text corpora. This research exemplifies how statistical models of semantics play an important role in our understanding of cognition and contribute to the field of cognitive science. Importantly, these models afford large-scale representations of human knowledge and allow researchers to explore various questions regarding knowledge, discourse processing, text comprehension, and language. This topic includes the latest progress by the leading researchers in the endeavor to go beyond LSA. Copyright © 2010 Cognitive Science Society, Inc.
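As a hedged illustration of the kind of statistical model discussed above, the following sketch applies the core step of LSA, a truncated SVD of a term-document matrix, to a toy corpus using scikit-learn; the corpus, the two-dimensional reduction, and the similarity comparisons are assumptions made only for demonstration.

```python
# Minimal sketch of latent semantic analysis (LSA): build a term-document matrix
# and apply a truncated SVD so that documents get low-dimensional "meaning"
# vectors. The corpus and dimensionality are invented for illustration.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD

corpus = [
    "the cat sat on the mat",
    "a dog sat on the log",
    "cats and dogs are animals",
    "stock markets fell sharply today",
    "investors sold shares as markets fell",
]

vectorizer = CountVectorizer()
x = vectorizer.fit_transform(corpus)          # documents x terms

svd = TruncatedSVD(n_components=2, random_state=0)
doc_vectors = svd.fit_transform(x)            # low-rank semantic coordinates

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Documents about similar topics should end up closer in the reduced space.
print(cosine(doc_vectors[0], doc_vectors[1]))  # two pet sentences
print(cosine(doc_vectors[0], doc_vectors[3]))  # pet vs. finance
```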
Applications of Derandomization Theory in Coding
NASA Astrophysics Data System (ADS)
Cheraghchi, Mahdi
2011-07-01
Randomized techniques play a fundamental role in theoretical computer science and discrete mathematics, in particular for the design of efficient algorithms and construction of combinatorial objects. The basic goal in derandomization theory is to eliminate or reduce the need for randomness in such randomized constructions. In this thesis, we explore some applications of the fundamental notions in derandomization theory to problems outside the core of theoretical computer science, and in particular, certain problems related to coding theory. First, we consider the wiretap channel problem which involves a communication system in which an intruder can eavesdrop on a limited portion of the transmissions, and construct efficient and information-theoretically optimal communication protocols for this model. Then we consider the combinatorial group testing problem. In this classical problem, one aims to determine a set of defective items within a large population by asking a number of queries, where each query reveals whether a defective item is present within a specified group of items. We use randomness condensers to explicitly construct optimal, or nearly optimal, group testing schemes for a setting where the query outcomes can be highly unreliable, as well as the threshold model where a query returns positive if the number of defectives passes a certain threshold. Finally, we design ensembles of error-correcting codes that achieve the information-theoretic capacity of a large class of communication channels, and then use the obtained ensembles for construction of explicit capacity achieving codes. [This is a shortened version of the actual abstract in the thesis.]
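As a hedged illustration of the group testing setting described above (noiseless and non-adaptive, not the unreliable-query or threshold variants the thesis targets), the following sketch simulates pooled tests with a random design and decodes with the simple rule that any item appearing in a negative pool is clean; all sizes and probabilities are assumptions for illustration, and the randomness-condenser constructions are not reproduced.

```python
# Toy simulation of noiseless, non-adaptive group testing. Each test (query)
# reports whether its pool contains at least one defective item. Decoding uses
# the simple rule "any item appearing in a negative pool is clean".
import numpy as np

rng = np.random.default_rng(2)
n_items, n_defective, n_tests = 200, 3, 60

defective = np.zeros(n_items, dtype=bool)
defective[rng.choice(n_items, size=n_defective, replace=False)] = True

# Random pooling design: each item joins each pool independently with prob. 0.1.
pools = rng.random((n_tests, n_items)) < 0.1
outcomes = (pools.astype(int) @ defective.astype(int)) > 0   # positive iff pool hits a defective

# Decode: start from "everyone suspect", clear items seen in any negative pool.
suspect = np.ones(n_items, dtype=bool)
for pool, positive in zip(pools, outcomes):
    if not positive:
        suspect[pool] = False

print("true defectives:   ", np.flatnonzero(defective))
print("declared suspects: ", np.flatnonzero(suspect))
```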
Ab initio quantum chemistry: methodology and applications.
Friesner, Richard A
2005-05-10
This Perspective provides an overview of state-of-the-art ab initio quantum chemical methodology and applications. The methods that are discussed include coupled cluster theory, localized second-order Moller-Plesset perturbation theory, multireference perturbation approaches, and density functional theory. The accuracy of each approach for key chemical properties is summarized, and the computational performance is analyzed, emphasizing significant advances in algorithms and implementation over the past decade. Incorporation of a condensed-phase environment by means of mixed quantum mechanical/molecular mechanics or self-consistent reaction field techniques is presented. A wide range of illustrative applications, focusing on materials science and biology, are discussed briefly.
The FuturICT education accelerator
NASA Astrophysics Data System (ADS)
Johnson, J.; Buckingham Shum, S.; Willis, A.; Bishop, S.; Zamenopoulos, T.; Swithenby, S.; MacKay, R.; Merali, Y.; Lorincz, A.; Costea, C.; Bourgine, P.; Louçã, J.; Kapenieks, A.; Kelley, P.; Caird, S.; Bromley, J.; Deakin Crick, R.; Goldspink, C.; Collet, P.; Carbone, A.; Helbing, D.
2012-11-01
Education is a major force for economic and social wellbeing. Despite high aspirations, education at all levels can be expensive and ineffective. Three Grand Challenges are identified: (1) enable people to learn orders of magnitude more effectively, (2) enable people to learn at orders of magnitude less cost, and (3) demonstrate success by exemplary interdisciplinary education in complex systems science. A ten year 'man-on-the-moon' project is proposed in which FuturICT's unique combination of Complexity, Social and Computing Sciences could provide an urgently needed transdisciplinary language for making sense of educational systems. In close dialogue with educational theory and practice, and grounded in the emerging data science and learning analytics paradigms, this will translate into practical tools (both analytical and computational) for researchers, practitioners and leaders; generative principles for resilient educational ecosystems; and innovation for radically scalable, yet personalised, learner engagement and assessment. The proposed Education Accelerator will serve as a 'wind tunnel' for testing these ideas in the context of real educational programmes, with an international virtual campus delivering complex systems education exploiting the new understanding of complex, social, computationally enhanced organisational structure developed within FuturICT.
Surface tension and contact angles: Molecular origins and associated microstructure
NASA Technical Reports Server (NTRS)
Davis, H. T.
1982-01-01
Gradient theory converts the molecular theory of inhomogeneous fluid into nonlinear boundary value problems for density and stress distributions in fluid interfaces, contact line regions, nuclei and microdroplets, and other fluid microstructures. The relationship between the basic patterns of fluid phase behavior and the occurrence and stability of fluid microstructures was clearly established by the theory. All the inputs of the theory have molecular expressions which are computable from simple models. On another level, the theory becomes a phenomenological framework in which the equation of state of homogeneous fluid and sets of influence parameters of inhomogeneous fluids are the inputs and the structures, stress tensions and contact angles of menisci are the outputs. These outputs, which find applications in the science and technology of drops and bubbles, are discussed.
The erroneous signals of detection theory.
Trimmer, Pete C; Ehlman, Sean M; McNamara, John M; Sih, Andrew
2017-10-25
Signal detection theory has influenced the behavioural sciences for over 50 years. The theory provides a simple equation that indicates numerous 'intuitive' results; e.g. prey should be more prone to take evasive action (in response to an ambiguous cue) if predators are more common. Here, we use analytical and computational models to show that, in numerous biological scenarios, the standard results of signal detection theory do not apply; more predators can result in prey being less responsive to such cues. The standard results need not apply when the probability of danger pertains not just to the present, but also to future decisions. We identify how responses to risk should depend on background mortality and autocorrelation, and that predictions in relation to animal welfare can also be reversed from the standard theory. © 2017 The Author(s).
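For context, the 'simple equation' of classical signal detection theory referred to above sets a likelihood-ratio threshold from the prior odds and the payoffs; the sketch below computes that one-shot criterion (the very rule the paper argues can fail in sequential settings), with all probabilities, costs, and cue distributions invented for illustration.

```python
# Classic one-shot signal-detection calculation: respond to an ambiguous cue iff
# the likelihood ratio exceeds a threshold set by the prior odds and the payoffs.
# All numbers are invented for illustration.
from math import exp, sqrt, pi

def normal_pdf(x, mu, sigma=1.0):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

p_danger = 0.2            # prior probability that a predator is present
cost_miss = 10.0          # cost of ignoring a real predator
cost_false_alarm = 1.0    # cost of fleeing when there is no predator

# Optimal criterion: flee when LR(x) = f(x|danger)/f(x|safe) exceeds beta.
beta = ((1 - p_danger) / p_danger) * (cost_false_alarm / cost_miss)

def should_flee(cue, mu_danger=1.0, mu_safe=0.0):
    lr = normal_pdf(cue, mu_danger) / normal_pdf(cue, mu_safe)
    return lr > beta

for cue in (-0.5, 0.2, 0.8, 1.5):
    print(f"cue={cue:+.1f} -> flee: {should_flee(cue)}")
```

As the abstract notes, making predators more common (raising p_danger) lowers beta and makes fleeing more likely in this one-shot rule; the paper's point is that this intuition can reverse once the same decision recurs over time.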
NASA Astrophysics Data System (ADS)
Mahootian, F.
2009-12-01
The rapid convergence of advancing sensor technology, computational power, and knowledge discovery techniques over the past decade has brought unprecedented volumes of astronomical data together with unprecedented capabilities of data assimilation and analysis. A key result is that a new, data-driven "observational-inductive'' framework for scientific inquiry is taking shape and proving viable. The anticipated rise in data flow and processing power will have profound effects, e.g., confirmations and disconfirmations of existing theoretical claims both for and against the big bang model. But beyond enabling new discoveries can new data-driven frameworks of scientific inquiry reshape the epistemic ideals of science? The history of physics offers a comparison. The Bohr-Einstein debate over the "completeness'' of quantum mechanics centered on a question of ideals: what counts as science? We briefly examine lessons from that episode and pose questions about their applicability to cosmology. If the history of 20th century physics is any indication, the abandonment of absolutes (e.g., space, time, simultaneity, continuity, determinacy) can produce fundamental changes in understanding. The classical ideal of science, operative in both physics and cosmology, descends from the European Enlightenment. This ideal has for over 200 years guided science to seek the ultimate order of nature, to pursue the absolute theory, the "theory of everything.'' But now that we have new models of scientific inquiry powered by new technologies and driven more by data than by theory, it is time, finally, to relinquish dreams of a "final'' theory.
Liang, Jie; Qian, Hong
2010-01-01
Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge from understanding macromolecular dynamics has led the way for computations to be part of the tool set to study molecular biology. Twenty-five years ago, the demand from genome science has inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand “complex behavior” and complexity theory, and from which important biological insight can be gained. PMID:24999297
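As a minimal sketch of the Gillespie (stochastic simulation) algorithm mentioned above, the following snippet samples exact trajectories of a toy birth-death system governed by a chemical master equation; the reactions, rate constants, and initial copy number are assumptions chosen only for illustration.

```python
# Minimal Gillespie (direct-method) simulation of a birth-death process,
# X -> X+1 at rate k_birth and X -> X-1 at rate k_death * X. This samples exact
# trajectories of the corresponding chemical master equation for this toy system.
import random
import math

def gillespie_birth_death(x0=10, k_birth=5.0, k_death=0.5, t_end=20.0, seed=3):
    rng = random.Random(seed)
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        a_birth = k_birth               # propensity of X -> X+1
        a_death = k_death * x           # propensity of X -> X-1
        a_total = a_birth + a_death
        if a_total == 0:
            break
        t += -math.log(rng.random()) / a_total      # exponential waiting time
        x += 1 if rng.random() < a_birth / a_total else -1
        times.append(t)
        states.append(x)
    return times, states

times, states = gillespie_birth_death()
print(f"final time {times[-1]:.2f}, final copy number {states[-1]}")
# The stationary mean of this toy process is k_birth / k_death = 10.
```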
ERIC Educational Resources Information Center
Tan, Seng-Chee; Seah, Lay-Hoon
2011-01-01
In this study we explored questioning behaviors among elementary students engaging in inquiry science using the "Knowledge Forum", a computer-supported collaborative learning tool. Adapting the theory of systemic functional linguistics, we developed the Ideational Function of Question (IFQ) analytical framework by means of inductive analysis of…
ERIC Educational Resources Information Center
Kartiko, Iwan; Kavakli, Manolya; Cheng, Ken
2010-01-01
As the technology in computer graphics advances, Animated-Virtual Actors (AVAs) in Virtual Reality (VR) applications become increasingly rich and complex. Cognitive Theory of Multimedia Learning (CTML) suggests that complex visual materials could hinder novice learners from attending to the lesson properly. On the other hand, previous studies have…
ERIC Educational Resources Information Center
Sandoval, William A.; Daniszewski, Kenneth
2004-01-01
This paper explores how two teachers concurrently enacting the same technology-based inquiry unit on evolution structured activity and discourse in their classrooms to connect students' computer-based investigations to formal domain theories. Our analyses show that the teachers' interactions with their students during inquiry were quite similar,…
Deepen the Teaching Reform of Operating System, Cultivate the Comprehensive Quality of Students
ERIC Educational Resources Information Center
Liu, Jianjun
2010-01-01
Operating systems is a core course in the computer science and technology curriculum. Understanding and mastering the operating system directly affects students' further study in other courses. The operating system course is heavily theoretical; its contents are abstract and its knowledge system is complicated. Therefore,…
ERIC Educational Resources Information Center
Gates, David M.
These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This report introduces two models of the thermal energy budget of a leaf. Typical values for…
ERIC Educational Resources Information Center
Stevenson, R. D.
These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This report describes concepts presented in another module called "The First Law of…
ERIC Educational Resources Information Center
Stevenson, R. D.
These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This module and a comparison module are concerned with elementary concepts of thermodynamics as…
ERIC Educational Resources Information Center
Levin, Michael; Gallucci, V. F.
These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This module describes the application of irreversible thermodynamics to biology. It begins with…
New Theoretical Frameworks for Machine Learning
2008-09-15
New York, 1974. 6.3 [52] G.M. Benedek and A. Itai. Learnability by fixed distributions. In Proc. 1st Workshop Computat. Learning Theory, pages 80–90...1988. 3.4.3 [53] G.M. Benedek and A. Itai. Learnability with respect to a fixed distribution. Theoretical Computer Science, 86:377–389, 1991. 2.1, 2.1.1
The Six Core Theories of Modern Physics
NASA Astrophysics Data System (ADS)
Stevens, Charles F.
1996-09-01
Charles Stevens, a prominent neurobiologist who originally trained as a biophysicist (with George Uhlenbeck and Mark Kac), wrote this book almost by accident. Each summer he found himself reviewing key areas of physics that he had once known and understood well, for use in his present biological research. Since there was no book, he created his own set of notes, which formed the basis for this brief, clear, and self-contained summary of the basic theoretical structures of classical mechanics, electricity and magnetism, quantum mechanics, statistical physics, special relativity, and quantum field theory. The Six Core Theories of Modern Physics can be used by advanced undergraduates or beginning graduate students as a supplement to the standard texts or for an uncluttered, succinct review of the key areas. Professionals in such quantitative sciences as chemistry, engineering, computer science, applied mathematics, and biophysics who need to brush up on the essentials of a particular area will find most of the required background material, including the mathematics.
NASA Astrophysics Data System (ADS)
Huhn, William Paul; Lange, Björn; Yu, Victor; Blum, Volker; Lee, Seyong; Yoon, Mina
Density-functional theory has been well established as the dominant quantum-mechanical computational method in the materials community. Large accurate simulations become very challenging on small to mid-scale computers and require high-performance compute platforms to succeed. GPU acceleration is one promising approach. In this talk, we present a first implementation of all-electron density-functional theory in the FHI-aims code for massively parallel GPU-based platforms. Special attention is paid to the update of the density and to the integration of the Hamiltonian and overlap matrices, realized in a domain decomposition scheme on non-uniform grids. The initial implementation scales well across nodes on ORNL's Titan Cray XK7 supercomputer (8 to 64 nodes, 16 MPI ranks/node) and shows an overall speedup in runtime of 1.4x due to utilization of the K20X Tesla GPUs on each Titan node, with the charge density update showing a speedup of 2x. Further acceleration opportunities will be discussed. Work supported by the LDRD Program of ORNL managed by UT-Battelle, LLC, for the U.S. DOE and by the Oak Ridge Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.
NASA Astrophysics Data System (ADS)
Masson, Steve; Vázquez-Abad, Jesús
2006-10-01
This paper proposes a new way to integrate history of science in science education to promote conceptual change by introducing the notion of historical microworld, which is a computer-based interactive learning environment respecting historic conceptions. In this definition, "interactive" means that the user can act upon the virtual environment by changing some parameters to see what ensues. "Environment respecting historic conceptions" means that the "world" has been programmed to respect the conceptions of past scientists or philosophers. Three historical microworlds in the field of mechanics are presented in this article: an Aristotelian microworld respecting Aristotle's conceptions about movement, a Buridanian microworld respecting the theory of impetus and, finally, a Newtonian microworld respecting Galileo's conceptions and Newton's laws of movement.
Innovation Research in E-Learning
NASA Astrophysics Data System (ADS)
Wu, Bing; Xu, WenXia; Ge, Jun
This study is a productivity review of the literature gleaned from the SSCI and SCIE databases concerning innovation research in E-Learning. The results indicate that the number of publications on innovation research in E-Learning has been growing since 2005. The main research country is England, and analysis of the publication years shows the number of papers increasing, peaking at 25% of the total in 2010. The main source title is the British Journal of Educational Technology. The subject areas concentrate on Education & Educational Research; Computer Science, Interdisciplinary Applications; and Computer Science, Software Engineering. The research consists mainly of conceptual and empirical studies that explore E-Learning from the perspective of innovation diffusion theory; limitations and directions for future research are also discussed.
[The Durkheim Test. Remarks on Susan Leigh Star's Boundary Objects].
Gießmann, Sebastian
2015-09-01
The article reconstructs Susan Leigh Star's conceptual work on the notion of 'boundary objects'. It traces the emergence of the concept, beginning with her PhD thesis and its publication as Regions of the Mind in 1989. 'Boundary objects' attempt to represent the distributed, multifold nature of scientific work and its mediations between different 'social worlds'. Being addressed to several 'communities of practice', the term responded to questions from Distributed Artificial Intelligence in Computer Science, Workplace Studies and Computer Supported Cooperative Work (CSCW), and microhistorical approaches inside the growing Science and Technology Studies. Yet the interdisciplinary character and interpretive flexibility of Star’s invention has rarely been noticed as a conceptual tool for media theory. I therefore propose to reconsider Star's 'Durkheim test' for sociotechnical media practices.
Neuhauser, Linda; Kreps, Gary L
2014-12-01
Traditional communication theory and research methods provide valuable guidance about designing and evaluating health communication programs. However, efforts to use health communication programs to educate, motivate, and support people to adopt healthy behaviors often fail to meet the desired goals. One reason for this failure is that health promotion issues are complex, changeable, and highly related to the specific needs and contexts of the intended audiences. It is a daunting challenge to effectively influence health behaviors, particularly culturally learned and reinforced behaviors concerning lifestyle factors related to diet, exercise, and substance (such as alcohol and tobacco) use. Too often, program development and evaluation are not adequately linked to provide rapid feedback to health communication program developers so that important revisions can be made to design the most relevant and personally motivating health communication programs for specific audiences. Design science theory and methods commonly used in engineering, computer science, and other fields can address such program and evaluation weaknesses. Design science researchers study human-created programs using tightly connected build-and-evaluate loops in which they use intensive participatory methods to understand problems and develop solutions concurrently and throughout the duration of the program. Such thinking and strategies are especially relevant to address complex health communication issues. In this article, the authors explore the history, scientific foundation, methods, and applications of design science and its potential to enhance health communication programs and their evaluation.
Representations of Complexity: How Nature Appears in Our Theories
2013-01-01
In science we study processes in the material world. The way these processes operate can be discovered by conducting experiments that activate them, and findings from such experiments can lead to functional complexity theories of how the material processes work. The results of a good functional theory will agree with experimental measurements, but the theory may not incorporate in its algorithmic workings a representation of the material processes themselves. Nevertheless, the algorithmic operation of a good functional theory may be said to make contact with material reality by incorporating the emergent computations the material processes carry out. These points are illustrated in the experimental analysis of behavior by considering an evolutionary theory of behavior dynamics, the algorithmic operation of which does not correspond to material features of the physical world, but the functional output of which agrees quantitatively and qualitatively with findings from a large body of research with live organisms. PMID:28018044
The Handicap Principle for Trust in Computer Security, the Semantic Web and Social Networking
NASA Astrophysics Data System (ADS)
Ma, Zhanshan (Sam); Krings, Axel W.; Hung, Chih-Cheng
Communication is a fundamental function of life, and it exists in almost all living things: from single-cell bacteria to human beings. Communication, together with competition and cooperation, are three fundamental processes in nature. Computer scientists are familiar with the study of competition or 'struggle for life' through Darwin's evolutionary theory, or even evolutionary computing. They may be equally familiar with the study of cooperation or altruism through the Prisoner's Dilemma (PD) game. However, they are likely to be less familiar with the theory of animal communication. The objective of this article is three-fold: (i) To suggest that the study of animal communication, especially the honesty (reliability) of animal communication, in which some significant advances in behavioral biology have been achieved in the last three decades, should be on the verge to spawn important cross-disciplinary research similar to that generated by the study of cooperation with the PD game. One of the far-reaching advances in the field is marked by the publication of "The Handicap Principle: a Missing Piece of Darwin's Puzzle" by Zahavi (1997). The 'Handicap' principle [34][35], which states that communication signals must be costly in some proper way to be reliable (honest), is best elucidated with evolutionary games, e.g., Sir Philip Sidney (SPS) game [23]. Accordingly, we suggest that the Handicap principle may serve as a fundamental paradigm for trust research in computer science. (ii) To suggest to computer scientists that their expertise in modeling computer networks may help behavioral biologists in their study of the reliability of animal communication networks. This is largely due to the historical reason that, until the last decade, animal communication was studied with the dyadic paradigm (sender-receiver) rather than with the network paradigm. (iii) To pose several open questions, the answers to which may bear some refreshing insights to trust research in computer science, especially secure and resilient computing, the semantic web, and social networking. One important thread unifying the three aspects is the evolutionary game theory modeling or its extensions with survival analysis and agreement algorithms [19][20], which offer powerful game models for describing time-, space-, and covariate-dependent frailty (uncertainty and vulnerability) and deception (honesty).
Situation resolution with context-sensitive fuzzy relations
NASA Astrophysics Data System (ADS)
Jakobson, Gabriel; Buford, John; Lewis, Lundy
2009-05-01
Context plays a significant role in situation resolution by intelligent agents (human or machine) by affecting how the situations are recognized, interpreted, acted upon or predicted. Many definitions and formalisms for the notion of context have emerged in various research fields including psychology, economics and computer science (computational linguistics, data management, control theory, artificial intelligence and others). In this paper we examine the role of context in situation management, particularly how to resolve situations that are described by using fuzzy (inexact) relations among their components. We propose a language for describing context sensitive inexact constraints and an algorithm for interpreting relations using inexact (fuzzy) computations.
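As a hedged, much-simplified sketch of the idea of context-sensitive fuzzy relations (not the constraint language or algorithm proposed in the paper), the snippet below scores whether two observations belong to the same situation by combining fuzzy 'near' and 'recent' memberships whose strictness depends on the current context; the membership functions, contexts, and scales are invented.

```python
# Tiny sketch of matching observations against a fuzzy relation whose strictness
# depends on context. All memberships, contexts and scales are invented; this is
# not the constraint language proposed in the paper.
def near(distance_km, scale):
    """Fuzzy membership of 'near'; softer (more permissive) for a larger scale."""
    return max(0.0, 1.0 - distance_km / scale)

def recent(dt_min, scale):
    """Fuzzy membership of 'recent'; softer for a larger scale."""
    return max(0.0, 1.0 - dt_min / scale)

CONTEXTS = {
    # context name: (distance scale in km, time scale in minutes)
    "urban_patrol": (2.0, 10.0),
    "open_terrain": (20.0, 60.0),
}

def correlated(distance_km, dt_min, context):
    """Degree to which two observations belong to the same situation (min-combination)."""
    d_scale, t_scale = CONTEXTS[context]
    return min(near(distance_km, d_scale), recent(dt_min, t_scale))

print("urban:", correlated(1.5, 4.0, "urban_patrol"))   # strict context -> lower degree
print("open :", correlated(1.5, 4.0, "open_terrain"))   # lenient context -> higher degree
```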
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saad, Yousef
2014-03-19
The master project under which this work is funded had as its main objective to develop computational methods for modeling electronic excited-state and optical properties of various nanostructures. The specific goals of the computer science group were primarily to develop effective numerical algorithms in Density Functional Theory (DFT) and Time Dependent Density Functional Theory (TDDFT). There were essentially four distinct stated objectives. The first objective was to study and develop effective numerical algorithms for solving large eigenvalue problems such as those that arise in Density Functional Theory (DFT) methods. The second objective was to explore so-called linear scaling methods, or methods that avoid diagonalization. The third was to develop effective approaches for Time-Dependent DFT (TDDFT). Our fourth and final objective was to examine effective solution strategies for other problems in electronic excitations, such as the GW/Bethe-Salpeter method, and quantum transport problems.
NASA Technical Reports Server (NTRS)
Ortega, J. M.
1984-01-01
Several short summaries of the work performed during this reporting period are presented. Topics discussed in this document include: (1) resilient seeded errors via simple techniques; (2) knowledge representation for engineering design; (3) analysis of faults in a multiversion software experiment; (4) implementation of parallel programming environment; (5) symbolic execution of concurrent programs; (6) two computer graphics systems for visualization of pressure distribution and convective density particles; (7) design of a source code management system; (8) vectorizing incomplete conjugate gradient on the Cyber 203/205; (9) extensions of domain testing theory and; (10) performance analyzer for the pisces system.
Introduction to the focus issue: fifty years of chaos: applied and theoretical.
Hikihara, Takashi; Holmes, Philip; Kambe, Tsutomu; Rega, Giuseppe
2012-12-01
The discovery of deterministic chaos in the late nineteenth century, its subsequent study, and the development of mathematical and computational methods for its analysis have substantially influenced the sciences. Chaos is, however, only one phenomenon in the larger area of dynamical systems theory. This Focus Issue collects 13 papers, from authors and research groups representing the mathematical, physical, and biological sciences, that were presented at a symposium held at Kyoto University from November 28 to December 2, 2011. The symposium, sponsored by the International Union of Theoretical and Applied Mechanics, was called 50 Years of Chaos: Applied and Theoretical. Following some historical remarks to provide a background for the last 50 years, and for chaos, this Introduction surveys the papers and identifies some common themes that appear in them and in the theory of dynamical systems.
Decision theory with resource-bounded agents.
Halpern, Joseph Y; Pass, Rafael; Seeman, Lior
2014-04-01
There have been two major lines of research aimed at capturing resource-bounded players in game theory. The first, initiated by Rubinstein (), charges an agent for doing costly computation; the second, initiated by Neyman (), does not charge for computation, but limits the computation that agents can do, typically by modeling agents as finite automata. We review recent work on applying both approaches in the context of decision theory. For the first approach, we take the objects of choice in a decision problem to be Turing machines, and charge players for the "complexity" of the Turing machine chosen (e.g., its running time). This approach can be used to explain well-known phenomena like first-impression-matters biases (i.e., people tend to put more weight on evidence they hear early on) and belief polarization (two people with different prior beliefs, hearing the same evidence, can end up with diametrically opposed conclusions) as the outcomes of quite rational decisions. For the second approach, we model people as finite automata, and provide a simple algorithm that, on a problem that captures a number of settings of interest, provably performs optimally as the number of states in the automaton increases. Copyright © 2014 Cognitive Science Society, Inc.
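As a toy sketch of the first approach reviewed above (charging the agent for costly computation), the snippet below picks among candidate strategies by maximizing task payoff minus a per-step computation cost; the strategies, payoffs, and cost rates are invented for illustration and are not the Turing-machine formalism used in the paper.

```python
# Sketch of "charging for computation": the object of choice is a candidate
# strategy, and its utility is discounted by the cost of the computation it
# performs. The strategies, payoffs and cost rates below are invented.
strategies = [
    # (name, expected task payoff, running time in arbitrary steps)
    ("guess immediately",        0.55,    1),
    ("quick heuristic",          0.80,   20),
    ("exhaustive deliberation",  0.95, 5000),
]

def net_utility(payoff, steps, cost_per_step):
    return payoff - cost_per_step * steps

for cost_per_step in (0.0, 1e-4, 0.1):
    best = max(strategies, key=lambda s: net_utility(s[1], s[2], cost_per_step))
    print(f"cost/step={cost_per_step:g}: choose '{best[0]}'")
# With free computation the deliberative strategy wins; as computation gets
# costlier, the rational choice shifts toward cheaper, less accurate strategies.
```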
NASA Astrophysics Data System (ADS)
Assadi, Amir H.
2001-11-01
Perceptual geometry is an emerging field of interdisciplinary research whose objectives focus on study of geometry from the perspective of visual perception, and in turn, apply such geometric findings to the ecological study of vision. Perceptual geometry attempts to answer fundamental questions in perception of form and representation of space through synthesis of cognitive and biological theories of visual perception with geometric theories of the physical world. Perception of form and space are among fundamental problems in vision science. In recent cognitive and computational models of human perception, natural scenes are used systematically as preferred visual stimuli. Among key problems in perception of form and space, we have examined perception of geometry of natural surfaces and curves, e.g. as in the observer's environment. Besides a systematic mathematical foundation for a remarkably general framework, the advantages of the Gestalt theory of natural surfaces include a concrete computational approach to simulate or recreate images whose geometric invariants and quantities might be perceived and estimated by an observer. The latter is at the very foundation of understanding the nature of perception of space and form, and the (computer graphics) problem of rendering scenes to visually invoke virtual presence.
Probability for Weather and Climate
NASA Astrophysics Data System (ADS)
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access too, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of decision making versus advance science, are noted. It is argued that, just as no point forecast is complete without an estimate of its accuracy, no model-based probability forecast is complete without an estimate of its own irrelevance. The same nonlinearities that made the electronic computer so valuable links the selection and assimilation of observations, the formation of ensembles, the evolution of models, the casting of model simulations back into observables, and the presentation of this information to those who use it to take action or to advance science. Timescales of interest exceed the lifetime of a climate model and the career of a climate scientist, disarming the trichotomy that lead to swift advances in weather forecasting. Providing credible, informative climate services is a more difficult task. In this context, the value of comparing the forecasts of simulation models not only with each other but also with the performance of simple empirical models, whenever possible, is stressed. The credibility of meteorology is based on its ability to forecast and explain the weather. The credibility of climatology will always be based on flimsier stuff. Solid insights of climate science may be obscured if the severe limits on our ability to see the details of the future even probabilistically are not communicated clearly.
Human Inspired Self-developmental Model of Neural Network (HIM): Introducing Content/Form Computing
NASA Astrophysics Data System (ADS)
Krajíček, Jiří
This paper presents cross-disciplinary research between medical/psychological evidence on human abilities and informatics needs to update current models in computer science to support alternative methods for computation and communication. In [10] we have already proposed hypothesis introducing concept of human information model (HIM) as cooperative system. Here we continue on HIM design in detail. In our design, first we introduce Content/Form computing system which is new principle of present methods in evolutionary computing (genetic algorithms, genetic programming). Then we apply this system on HIM (type of artificial neural network) model as basic network self-developmental paradigm. Main inspiration of our natural/human design comes from well known concept of artificial neural networks, medical/psychological evidence and Sheldrake theory of "Nature as Alive" [22].
Computing through Scientific Abstractions in SysBioPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Stephan, Eric G.; Gracio, Deborah K.
2004-10-13
Today, biologists and bioinformaticists have a tremendous amount of computational power at their disposal. With the availability of supercomputers, burgeoning scientific databases and digital libraries such as GenBank and PubMed, and pervasive computational environments such as the Grid, biologists have access to a wealth of computational capabilities and scientific data at hand. Yet, the rapid development of computational technologies has far exceeded the typical biologist’s ability to effectively apply the technology in their research. Computational sciences research and development efforts such as the Biology Workbench, BioSPICE (Biological Simulation Program for Intra-Cellular Evaluation), and BioCoRE (Biological Collaborative Research Environment) are importantmore » in connecting biologists and their scientific problems to computational infrastructures. On the Computational Cell Environment and Heuristic Entity-Relationship Building Environment projects at the Pacific Northwest National Laboratory, we are jointly developing a new breed of scientific problem solving environment called SysBioPSE that will allow biologists to access and apply computational resources in the scientific research context. In contrast to other computational science environments, SysBioPSE operates as an abstraction layer above a computational infrastructure. The goal of SysBioPSE is to allow biologists to apply computational resources in the context of the scientific problems they are addressing and the scientific perspectives from which they conduct their research. More specifically, SysBioPSE allows biologists to capture and represent scientific concepts and theories and experimental processes, and to link these views to scientific applications, data repositories, and computer systems.« less
Extending Landauer's bound from bit erasure to arbitrary computation
NASA Astrophysics Data System (ADS)
Wolpert, David
The minimal thermodynamic work required to erase a bit, known as Landauer's bound, has been extensively investigated both theoretically and experimentally. However, when viewed as a computation that maps inputs to outputs, bit erasure has a very special property: the output does not depend on the input. Existing analyses of thermodynamics of bit erasure implicitly exploit this property, and thus cannot be directly extended to analyze the computation of arbitrary input-output maps. Here we show how to extend these earlier analyses of bit erasure to analyze the thermodynamics of arbitrary computations. Doing this establishes a formal connection between the thermodynamics of computers and much of theoretical computer science. We use this extension to analyze the thermodynamics of the canonical ``general purpose computer'' considered in computer science theory: a universal Turing machine (UTM). We consider a UTM which maps input programs to output strings, where inputs are drawn from an ensemble of random binary sequences, and prove: i) The minimal work needed by a UTM to run some particular input program X and produce output Y is the Kolmogorov complexity of Y minus the log of the ``algorithmic probability'' of Y. This minimal amount of thermodynamic work has a finite upper bound, which is independent of the output Y, depending only on the details of the UTM. ii) The expected work needed by a UTM to compute some given output Y is infinite. As a corollary, the overall expected work to run a UTM is infinite. iii) The expected work needed by an arbitrary Turing machine T (not necessarily universal) to compute some given output Y can either be infinite or finite, depending on Y and the details of T. To derive these results we must combine ideas from nonequilibrium statistical physics with fundamental results from computer science, such as Levin's coding theorem and other theorems about universal computation. I would like to acknowledge the Santa Fe Institute, Grant No. TWCF0079/AB47 from the Templeton World Charity Foundation, Grant No. FQXi-RHl3-1349 from the FQXi foundation, and Grant No. CHE-1648973 from the U.S. National Science Foundation.
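For reference, the two algorithmic quantities invoked in result (i) can be written, under standard conventions assumed here rather than taken from the abstract, as

\[ K_U(Y) = \min\{\, |p| : U(p) = Y \,\}, \qquad P_U(Y) = \sum_{p\,:\,U(p)=Y} 2^{-|p|}, \]

where U is a (prefix) universal Turing machine and |p| is the length of program p in bits. Levin's coding theorem, cited above, gives \(-\log_2 P_U(Y) = K_U(Y) + O(1)\) with the O(1) term depending only on U; this is what allows a work bound built from the difference between K_U(Y) and the log of the algorithmic probability of Y to have a finite upper bound that is independent of Y, as claimed in (i).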
Modeling Reality - How Computers Mirror Life
NASA Astrophysics Data System (ADS)
Bialynicki-Birula, Iwo; Bialynicka-Birula, Iwona
2005-01-01
The book Modeling Reality covers a wide range of fascinating subjects, accessible to anyone who wants to learn about the use of computer modeling to solve a diverse range of problems, but who does not possess specialized training in mathematics or computer science. The material presented is pitched at the level of high-school graduates, even though it covers some advanced topics (cellular automata, Shannon's measure of information, deterministic chaos, fractals, game theory, neural networks, genetic algorithms, and Turing machines). These advanced topics are explained in terms of well known simple concepts: Cellular automata - Game of Life, Shannon's formula - Game of twenty questions, Game theory - Television quiz, etc. The book is unique in explaining in a straightforward, yet complete, fashion many important ideas related to various models of reality and their applications. Twenty-five programs, written especially for this book, are provided on an accompanying CD. They greatly enhance its pedagogical value and make learning even the more complex topics a pleasure.
ERIC Educational Resources Information Center
Kiboss, Joel K.; Ndirangu, Mwangi; Wekesa, Eric W.
2004-01-01
Biology knowledge and understanding are important not only for the conversion of the loftiest dreams into reality for a better life of individuals but also for preparing secondary pupils for such fields as agriculture, medicine, biotechnology, and genetic engineering. But a recent study has revealed that many aspects of school science (biology…
Working in a Text Mine; Is Access about to Go down?
ERIC Educational Resources Information Center
Emery, Jill
2008-01-01
The age of networked research and networked data analysis is upon us. "Wired Magazine" proclaims on the cover of their July 2008 issue: "The End of Science. The quest for knowledge used to begin with grand theories. Now it begins with massive amounts of data. Welcome to the Petabyte Age." Computing technology is sufficiently complex at this point…
ERIC Educational Resources Information Center
Hatheway, W. H.
These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. Specifically, this module develops a method for calculating the exchange of heat between an…
ERIC Educational Resources Information Center
Cowan, Christina E.
This module is part of a series designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This module deals specifically with concepts that are basic to fluid flow and…
ERIC Educational Resources Information Center
Simpson, James R.
This module is part of a series on Physical Processes in Terrestrial and Aquatic Ecosystems. The materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process.…
ERIC Educational Resources Information Center
Cowan, Christina E.
This module is part of a series designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This module explores some of the characteristics of aquatic organisms which can be…
Bridging the Gap: A Manual Primer into Design Computing in the Context of Basic Design Education
ERIC Educational Resources Information Center
Uysal, V. Safak; Topaloglu, Fulden
2017-01-01
Design education is in need of a wider restructuring to accommodate new developments and paradigmatic shifts brought forth by the information age, all of which capitalise a move towards complexity theory, systems science and digital technologies. The intention of this article is to approach one particular aspect of this need: that is, how basic…
Making objective decisions in mechanical engineering problems
NASA Astrophysics Data System (ADS)
Raicu, A.; Oanta, E.; Sabau, A.
2017-08-01
The decision-making process has a great influence on the development of a given project, the goal being to select an optimal choice in a given context. Because of its great importance, decision making has been studied with various scientific methods, eventually giving rise to game theory, which is considered the background for the science of logical decision making in various fields. The paper presents some basic ideas of game theory in order to offer the information necessary to understand multiple-criteria decision making (MCDM) problems in engineering. The solution is to transform the multiple-criteria problem into a one-criterion decision problem using the notion of utility, together with the weighted sum model or the weighted product model. The weighted importance of the criteria is computed using the so-called Step method applied to a relation of preferences between the criteria. Two relevant examples from engineering are also presented. Future directions of research include the use of other types of criteria, the development of computer-based instruments for general decision-making problems, and a software module based on expert system principles to be included in the already operational Wiki software applications for polymeric materials.
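A minimal Python sketch of the weighted sum model described above; the alternatives, criteria, utilities, and weights are hypothetical, and the Step method for deriving weights from the preference relation is not reproduced here.

def weighted_sum_scores(utilities, weights):
    # utilities: dict mapping alternative -> list of per-criterion utilities in [0, 1]
    # weights: list of criterion weights summing to 1 (assumed given, e.g. by the Step method)
    return {alt: sum(w * u for w, u in zip(weights, us))
            for alt, us in utilities.items()}

# Hypothetical design alternatives scored against three criteria
# (e.g. cost, mass, reliability), already converted to utilities on [0, 1].
utilities = {
    "design_A": [0.7, 0.9, 0.6],
    "design_B": [0.8, 0.5, 0.9],
    "design_C": [0.6, 0.7, 0.8],
}
weights = [0.5, 0.2, 0.3]

scores = weighted_sum_scores(utilities, weights)
print(scores, "best:", max(scores, key=scores.get))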
NASA Technical Reports Server (NTRS)
Young, Gerald W.; Clemons, Curtis B.
2004-01-01
The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. Toward these means, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.
Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing
NASA Astrophysics Data System (ADS)
Chine, Karim
The UK, through the e-Science program, the US through the NSF-funded cyber infrastructure and the European Union through the ICT Calls aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge".1 The Grid (Foster, 2002; Foster; Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had excited at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable. They can't compete with a service outsourced to the Industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resources sharing and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baer, Marcel D.; Kuo, I-F W.; Tobias, Douglas J.
2014-07-17
The propensities of the water self ions, H3O+ and OH-, for the air-water interface have implications for interfacial acid-base chemistry. Despite numerous experimental and computational studies, no consensus has been reached on the question of whether or not H3O+ and/or OH- prefer to be at the water surface or in the bulk. Here we report a molecular dynamics simulation study of the bulk vs. interfacial behavior of H3O+ and OH- that employs forces derived from density functional theory with a generalized gradient approximation exchange-correlation functional (specifically, BLYP) and empirical dispersion corrections. We computed the potential of mean force (PMF) for H3O+ as a function of the position of the ion in a 215-molecule water slab. The PMF is flat, suggesting that H3O+ has equal propensity for the air-water interface and the bulk. We compare the PMF for H3O+ to our previously computed PMF for OH- adsorption, which contains a shallow minimum at the interface, and we explore how differences in solvation of each ion at the interface vs. the bulk are connected with interfacial propensity. We find that the solvation shell of H3O+ is only slightly dependent on its position in the water slab, while OH- partially desolvates as it approaches the interface, and we examine how this difference in solvation behavior is manifested in the electronic structure and chemistry of the two ions. DJT was supported by National Science Foundation grant CHE-0909227. CJM was supported by the U.S. Department of Energy's (DOE) Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is operated for the Department of Energy by Battelle. The potential of mean force computations required resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725. The remaining simulations and analysis used resources of the National Energy Research Scientific Computing Center, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231, at Lawrence Berkeley National Laboratory. MDB is grateful for the support of the Linus Pauling Distinguished Postdoctoral Fellowship Program at PNNL.
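For orientation, a common textbook definition of a potential of mean force along the slab normal z (shown as a generic relation, not necessarily the exact protocol used in this study) is

\[ W(z) = -k_{\mathrm{B}} T \,\ln \frac{\rho(z)}{\rho_{\mathrm{ref}}}, \]

where \(\rho(z)\) is the equilibrium probability density of finding the ion at height z and \(\rho_{\mathrm{ref}}\) is a bulk reference density; a flat W(z), as reported here for H3O+, corresponds to equal propensity for the interface and the bulk.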
NASA Astrophysics Data System (ADS)
The Naval Research Laboratory (Washington, D.C.) formed the Space Plasma Branch within its Plasma Physics Division on July 1. Vithal Patel, former Program Director of Magnetospheric Physics, National Science Foundation, also joined NRL on the same date as Associate Superintendent of the Plasma Physics Division. Barret Ripin is head of the newly organized branch. The Space Plasma branch will do basic and applied space plasma research using a multidisciplinary approach. It consolidates traditional rocket and satellite space experiments, space plasma theory and computation, with laboratory space-related experiments. About 40 research scientists, postdoctoral fellows, engineers, and technicians are divided among its five sections. The Theory and Computation sections are led by Joseph Huba and Joel Fedder, the Space Experiments section is led by Paul Rodriguez, and the Pharos Laser Facility and Laser Experiments sections are headed by Charles Manka and Jacob Grun.
Maximal aggregation of polynomial dynamical systems
Cardelli, Luca; Tschaikowski, Max
2017-01-01
Ordinary differential equations (ODEs) with polynomial derivatives are a fundamental tool for understanding the dynamics of systems across many branches of science, but our ability to gain mechanistic insight and effectively conduct numerical evaluations is critically hindered when dealing with large models. Here we propose an aggregation technique that rests on two notions of equivalence relating ODE variables whenever they have the same solution (backward criterion) or if a self-consistent system can be written for describing the evolution of sums of variables in the same equivalence class (forward criterion). A key feature of our proposal is to encode a polynomial ODE system into a finitary structure akin to a formal chemical reaction network. This enables the development of a discrete algorithm to efficiently compute the largest equivalence, building on approaches rooted in computer science to minimize basic models of computation through iterative partition refinements. The physical interpretability of the aggregation is shown on polynomial ODE systems for biochemical reaction networks, gene regulatory networks, and evolutionary game theory. PMID:28878023
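A toy Python illustration of the forward criterion, in which two variables with identical dynamics are lumped into their sum; the three-variable system and rate constant are hypothetical, and this is not the paper's partition-refinement algorithm, only the aggregation idea it computes.

import numpy as np
from scipy.integrate import solve_ivp

k = 0.7  # hypothetical rate constant

def full(t, x):
    # Full system: x1 and x2 decay identically and both feed x3.
    x1, x2, x3 = x
    return [-k * x1, -k * x2, k * x1 + k * x2]

def reduced(t, y):
    # Reduced system for s = x1 + x2: ds/dt depends only on s (forward criterion),
    # and x3 can be driven by s alone.
    s, x3 = y
    return [-k * s, k * s]

t_eval = np.linspace(0.0, 5.0, 50)
sol_full = solve_ivp(full, (0.0, 5.0), [1.0, 2.0, 0.0], t_eval=t_eval)
sol_red = solve_ivp(reduced, (0.0, 5.0), [3.0, 0.0], t_eval=t_eval)

# The aggregated model reproduces x1 + x2 and x3 of the full model
# up to the integration tolerance.
print(float(np.max(np.abs(sol_full.y[0] + sol_full.y[1] - sol_red.y[0]))))
print(float(np.max(np.abs(sol_full.y[2] - sol_red.y[1]))))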
Functional Programming in Computer Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Loren James; Davis, Marion Kei
We explore functional programming through a 16-week internship at Los Alamos National Laboratory. Functional programming is a branch of computer science that has exploded in popularity over the past decade due to its high-level syntax, ease of parallelization, and abundant applications. First, we summarize functional programming by listing the advantages of functional programming languages over the usual imperative languages, and we introduce the concept of parsing. Second, we discuss the importance of lambda calculus in the theory of functional programming. Lambda calculus was invented by Alonzo Church in the 1930s to formalize the concept of effective computability, and every functional language is essentially some implementation of lambda calculus. Finally, we display the lasting products of the internship: additions to a compiler and runtime system for the pure functional language STG, including both a set of tests that indicate the validity of updates to the compiler and a compiler pass that checks for illegal instances of duplicate names.
A Computational Model of Linguistic Humor in Puns.
Kao, Justine T; Levy, Roger; Goodman, Noah D
2016-07-01
Humor plays an essential role in human interactions. Precisely what makes something funny, however, remains elusive. While research on natural language understanding has made significant advancements in recent years, there has been little direct integration of humor research with computational models of language understanding. In this paper, we propose two information-theoretic measures, ambiguity and distinctiveness, derived from a simple model of sentence processing. We test these measures on a set of puns and regular sentences and show that they correlate significantly with human judgments of funniness. Moreover, within a set of puns, the distinctiveness measure distinguishes exceptionally funny puns from mediocre ones. Our work is the first, to our knowledge, to integrate a computational model of general language understanding and humor theory to quantitatively predict humor at a fine-grained level. We present it as an example of a framework for applying models of language processing to understand higher level linguistic and cognitive phenomena. © 2015 The Authors. Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
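As a purely illustrative gloss on the ambiguity idea (a generic entropy over candidate interpretations, not the paper's exact definition), a short Python sketch:

import math

def entropy(probs):
    # Shannon entropy, in bits, of a distribution over interpretations.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical posterior probabilities over two meanings of a sentence.
pun_like = [0.55, 0.45]   # both meanings stay plausible -> high entropy
plain = [0.98, 0.02]      # one meaning dominates -> low entropy

print(round(entropy(pun_like), 3), round(entropy(plain), 3))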
PREFACE: Theory, Modelling and Computational methods for Semiconductors
NASA Astrophysics Data System (ADS)
Migliorato, Max; Probert, Matt
2010-04-01
These conference proceedings contain the written papers of the contributions presented at the 2nd International Conference on: Theory, Modelling and Computational methods for Semiconductors. The conference was held at the St Williams College, York, UK on 13th-15th Jan 2010. The previous conference in this series took place in 2008 at the University of Manchester, UK. The scope of this conference embraces modelling, theory and the use of sophisticated computational tools in Semiconductor science and technology, where there is a substantial potential for time saving in R&D. The development of high speed computer architectures is finally allowing the routine use of accurate methods for calculating the structural, thermodynamic, vibrational and electronic properties of semiconductors and their heterostructures. This workshop ran for three days, with the objective of bringing together UK and international leading experts in the field of theory of group IV, III-V and II-VI semiconductors together with postdocs and students in the early stages of their careers. The first day focused on providing an introduction and overview of this vast field, aimed particularly at students at this influential point in their careers. We would like to thank all participants for their contribution to the conference programme and these proceedings. We would also like to acknowledge the financial support from the Institute of Physics (Computational Physics group and Semiconductor Physics group), the UK Car-Parrinello Consortium, Accelrys (distributors of Materials Studio) and Quantumwise (distributors of Atomistix). The Editors Acknowledgements Conference Organising Committee: Dr Matt Probert (University of York) and Dr Max Migliorato (University of Manchester) Programme Committee: Dr Marco Califano (University of Leeds), Dr Jacob Gavartin (Accelrys Ltd, Cambridge), Dr Stanko Tomic (STFC Daresbury Laboratory), Dr Gabi Slavcheva (Imperial College London) Proceedings edited and compiled by Dr Max Migliorato and Dr Matt Probert
NASA Astrophysics Data System (ADS)
Buongiorno Nardelli, Marco
High-Throughput Quantum-Mechanics computation of materials properties by ab initio methods has become the foundation of an effective approach to materials design, discovery and characterization. This data-driven approach to materials science currently presents the most promising path to the development of advanced technological materials that could solve or mitigate important social and economic challenges of the 21st century. In particular, the rapid proliferation of computational data on materials properties presents the possibility to complement and extend materials property databases where the experimental data is lacking and difficult to obtain. Enhanced repositories such as AFLOWLIB open novel opportunities for structure discovery and optimization, including uncovering of unsuspected compounds, metastable structures and correlations between various properties. The practical realization of these opportunities depends almost exclusively on the design of efficient algorithms for electronic structure simulations of realistic material systems beyond the limitations of the current standard theories. In this talk, I will review recent progress in theoretical and computational tools, and in particular, discuss the development and validation of novel functionals within Density Functional Theory and of local basis representations for effective ab-initio tight-binding schemes. Marco Buongiorno Nardelli is a pioneer in the development of computational platforms for theory/data/applications integration rooted in his profound and extensive expertise in the design of electronic structure codes and in his vision for sustainable and innovative software development for high-performance materials simulations. His research activities range from the design and discovery of novel materials for 21st century applications in renewable energy, environment, nano-electronics and devices, the development of advanced electronic structure theories and high-throughput techniques in materials genomics and computational materials design, to an active role as community scientific software developer (QUANTUM ESPRESSO, WanT, AFLOWpi).
New theory insights and experimental opportunities in Majorana wires
NASA Astrophysics Data System (ADS)
Alicea, Jason
Over the past decade, the quest for Majorana zero modes in exotic superconductors has undergone transformational advances on the design, fabrication, detection, and characterization fronts. The field now seems primed for a new era aimed at Majorana control and readout. This talk will survey intertwined theory and experimental developments that illuminate a practical path toward these higher-level goals. In particular, I will highlight near-term opportunities for testing fundamentals of topological quantum computing and longer-term strategies for building scalable hardware. Supported by the National Science Foundation (DMR-1341822), Institute for Quantum Information and Matter, and Walter Burke Institute at Caltech.
GW Calculations of Materials on the Intel Xeon-Phi Architecture
NASA Astrophysics Data System (ADS)
Deslippe, Jack; da Jornada, Felipe H.; Vigil-Fowler, Derek; Biller, Ariel; Chelikowsky, James R.; Louie, Steven G.
Intel Xeon-Phi processors are expected to power a large number of High-Performance Computing (HPC) systems around the United States and the world in the near future. We evaluate the ability of GW and prerequisite Density Functional Theory (DFT) calculations for materials to utilize the Xeon-Phi architecture. We describe the optimization process and performance improvements achieved. We find that the GW method, like other higher level Many-Body methods beyond standard local/semilocal approximations to Kohn-Sham DFT, is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism over plane-waves, band-pairs and frequencies. Support provided by the SCIDAC program, Department of Energy, Office of Science, Advanced Scientific Computing Research and Basic Energy Sciences. Grant Numbers DE-SC0008877 (Austin) and DE-AC02-05CH11231 (LBNL).
On agent-based modeling and computational social science
Conte, Rosaria; Paolucci, Mario
2014-01-01
In the first part of the paper, the field of agent-based modeling (ABM) is discussed focusing on the role of generative theories, aiming at explaining phenomena by growing them. After a brief analysis of the major strengths of the field some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and shadowed the application of rich cognitive models. In the second part of the paper, the renewed interest in Computational Social Science (CSS) is focused upon, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM, reconciling it with the quantitative one, is proposed as a fundamental requirement for a new program of CSS. PMID:25071642
NASA Technical Reports Server (NTRS)
Koh, Severino L. (Editor); Speziale, Charles G. (Editor)
1989-01-01
Various papers on recent advances in engineering science are presented. Some individual topics addressed include: advances in adaptive methods in computational fluid mechanics, mixtures of two micromorphic materials, computer tests of rubber elasticity, shear bands in isotropic micropolar elastic materials, nonlinear surface wave and resonator effects in magnetostrictive crystals, simulation of electrically enhanced fibrous filtration, plasticity theory of granular materials, dynamics of viscoelastic media with internal oscillators, postcritical behavior of a cantilever bar, boundary value problems in nonlocal elasticity, stability of flexible structures with random parameters, electromagnetic tornadoes in earth's ionosphere and magnetosphere, helicity fluctuations and the energy cascade in turbulence, mechanics of interfacial zones in bonded materials, propagation of a normal shock in a varying area duct, analytical mechanics of fracture and fatigue.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duignan, Timothy T.; Baer, Marcel D.; Schenter, Gregory K.
Determining the solvation free energies of single ions in water is one of the most fundamental problems in physical chemistry and yet many unresolved questions remain. In particular, the ability to decompose the solvation free energy into simple and intuitive contributions will have important implications for coarse-grained models of electrolyte solution. Here, we provide rigorous definitions of the various types of single ion solvation free energies based on different simulation protocols. We calculate solvation free energies of charged hard spheres using density functional theory interaction potentials with molecular dynamics simulation (DFT-MD) and isolate the effects of charge and cavitation, comparing to the Born (linear response) model. We show that using uncorrected Ewald summation leads to highly unphysical values for the solvation free energy and that charging free energies for cations are approximately linear as a function of charge but that there is a small non-linearity for small anions. The charge hydration asymmetry (CHA) for hard spheres, determined with quantum mechanics, is much larger than for the analogous real ions. This suggests that real ions, particularly anions, are significantly more complex than simple charged hard spheres, a commonly employed representation. We would like to thank Thomas Beck, Shawn Kathmann, Richard Remsing and John Weeks for helpful discussions. Computing resources were generously allocated by PNNL's Institutional Computing program. This research also used resources of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. TTD, GKS, and CJM were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. MDB was supported by MS3 (Materials Synthesis and Simulation Across Scales) Initiative, a Laboratory Directed Research and Development Program at Pacific Northwest National Laboratory (PNNL). PNNL is a multi-program national laboratory operated by Battelle for the U.S. Department of Energy.
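For reference, the Born (linear response) model mentioned above has the textbook form

\[ \Delta G_{\mathrm{Born}} = -\frac{q^{2}}{8\pi\varepsilon_{0}R}\left(1 - \frac{1}{\varepsilon_{r}}\right), \]

where q is the ion charge, R an effective cavity radius, and \(\varepsilon_{r}\) the relative permittivity of the solvent (generic notation assumed here, not the specific parameterization used in the paper); the quadratic dependence on q is the linear-response behavior against which the computed charging free energies are compared.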
NASA Astrophysics Data System (ADS)
Quigley, Mark Declan
The purpose of this research was to examine specific environmental, educational, and demographic factors and their influence on mathematics and science achievement. In particular, the researcher ascertained the interconnections of home computer access and social capital with Asian American students and the effect on mathematics and science achievement. Coleman's theory on social capital and parental influence was used as a basis for the analysis of data. Subjects for this study were the base year students from the National Education Longitudinal Study of 1988 (NELS:88) and the subsequent follow-up survey data in 1990, 1992, and 1994. The approximate sample size for this study was 640 ethnic Asians from the NELS:88 database. The analysis was a longitudinal study based on the Student and Parent Base Year responses and the Second Follow-up survey of 1992, when the subjects were in 12th grade. Achievement test results from the NELS:88 data were used to measure achievement in mathematics and science. The NELS:88 test battery was developed to measure both individual status and a student's growth in a number of achievement areas. The subjects' responses were analyzed by principal components factor analysis, weights, effect sizes, hierarchical regression analysis, and PLSPath Analysis. The results of this study were that prior ability in mathematics and science is a major influence in the student's educational achievement. Findings from the study support the view that home computer access has a negative direct effect on mathematics and science achievement for both Asian American males and females. None of the social capital factors in the study had either a negative or positive direct effect on mathematics and science achievement although some indirect effects were found. Suggestions were made toward increasing parental involvement in their children's academic endeavors. Computer access in the home should be considered related to television viewing and should be closely monitored by the parents to promote educational uses.
Walsh, Matthew M; Gluck, Kevin A; Gunzelmann, Glenn; Jastrzembski, Tiffany; Krusmark, Michael
2018-06-01
The spacing effect is among the most widely replicated empirical phenomena in the learning sciences, and its relevance to education and training is readily apparent. Yet successful applications of spacing effect research to education and training are rare. Computational modeling can provide the crucial link between a century of accumulated experimental data on the spacing effect and the emerging interest in using that research to enable adaptive instruction. In this paper, we review relevant literature and identify 10 criteria for rigorously evaluating computational models of the spacing effect. Five relate to evaluating the theoretic adequacy of a model, and five relate to evaluating its application potential. We use these criteria to evaluate a novel computational model of the spacing effect called the Predictive Performance Equation (PPE). The PPE combines elements of earlier models of learning and memory including the General Performance Equation, Adaptive Control of Thought-Rational, and the New Theory of Disuse, giving rise to a novel computational account of the spacing effect that performs favorably across the complete sets of theoretic and applied criteria. We implemented two other previously published computational models of the spacing effect and compared them to PPE using the theoretic and applied criteria as guides. Copyright © 2018 Cognitive Science Society, Inc.
Hill's Heuristics and Explanatory Coherentism in Epidemiology.
Dammann, Olaf
2018-01-01
In this essay, I argue that Ted Poston's theory of explanatory coherentism is well-suited as a tool for causal explanation in the health sciences, particularly in epidemiology. Coherence has not only played a role in epidemiology for more than half a century as one of Hill's viewpoints, it can also provide background theory for the development of explanatory systems by integrating epidemiologic evidence with a diversity of other error-independent data. I propose that computational formalization of Hill's viewpoints in an explanatory coherentist framework would provide an excellent starting point for a formal epistemological (knowledge-theoretical) project designed to improve causal explanation in the health sciences. As an example, I briefly introduce Paul Thagard's ECHO system and offer my responses to possible objections to my proposal. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckman, P.; Martin, D.; Drugan, C.
2010-11-23
This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core hours of science. The research conducted at their leadership class facility touched our lives in both minute and massive ways - whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The authors remained true to their vision to act as the forefront computational center in extending science frontiers by solving pressing problems for our nation. Our success in this endeavor was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. The program awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, National Institute of Standards and Technology, and European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid - the ALCF's 557-teraflops IBM Blue Gene/P supercomputer - enabled astounding scientific solutions and discoveries. Intrepid went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts. In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, the DOE-funded initiative will explore which science application programming models work well within the cloud, as well as evaluate the challenges that come with this new paradigm. The ALCF obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012. This system will allow us to resolve ever more pressing problems, even more expeditiously through breakthrough science in the years to come.
Applying the Coupled-Cluster Ansatz to Solids and Surfaces in the Thermodynamic Limit
NASA Astrophysics Data System (ADS)
Gruber, Thomas; Liao, Ke; Tsatsoulis, Theodoros; Hummel, Felix; Grüneis, Andreas
2018-04-01
Modern electronic structure theories can predict and simulate a wealth of phenomena in surface science and solid-state physics. In order to allow for a direct comparison with experiment, such ab initio predictions have to be made in the thermodynamic limit, substantially increasing the computational cost of many-electron wave-function theories. Here, we present a method that achieves thermodynamic limit results for solids and surfaces using the "gold standard" coupled cluster ansatz of quantum chemistry with unprecedented efficiency. We study the energy difference between carbon diamond and graphite crystals, adsorption energies of water on h -BN, as well as the cohesive energy of the Ne solid, demonstrating the increased efficiency and accuracy of coupled cluster theory for solids and surfaces.
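For context, the coupled cluster ansatz referred to as the gold standard has the generic exponential form

\[ |\Psi_{\mathrm{CC}}\rangle = e^{\hat{T}} |\Phi_{0}\rangle, \qquad \hat{T} = \hat{T}_{1} + \hat{T}_{2} + \cdots, \]

in textbook notation assumed here (independent of the periodic implementation described in the paper), where \(|\Phi_{0}\rangle\) is a mean-field reference determinant and \(\hat{T}_{n}\) generates n-fold excitations; truncation at singles and doubles with perturbative triples, CCSD(T), is the level usually called the gold standard of quantum chemistry.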
The End of Theory? Does the Data Deluge Make the Scientific Method Obsolete?
NASA Astrophysics Data System (ADS)
Kreinovich, Vladik; McClure, John; Symons, John
2008-10-01
Why do we need theory? One of the purposes of science is to predict: e.g., how a complex material behaves in different situations. There are a lot of records describing how different materials behave in different situations. In the past, it was not possible to find a similar record and simply recall what happened then. The only possibility was to extract, from the data, a simple dependence, and then use this dependence for predictions. For example, we can use Ohm's law V = I·R to predict the voltage V based on the current I and the resistance R. Nowadays, computer searches are so fast that there seems to be no need for any theoretical laws anymore: if we want to predict, we can simply search through all the records and find what happened in a similar situation. So maybe we do not need theory at all. This was the argument developed in a recent (June 2008) article in the popular Wired magazine. In our presentation, we will describe this argument in detail, and give our opinion on whether progress in computing will indeed lead to the end of theory as we know it.
NASA Astrophysics Data System (ADS)
Zakharova, Natalia; Piskovatsky, Nicolay; Gusev, Anatoly
2014-05-01
Development of Informational-Computational Systems (ICS) for data assimilation procedures is a multidisciplinary problem. To study and solve such problems one needs to apply modern results from different disciplines and recent developments in mathematical modeling, the theory of adjoint equations and optimal control, inverse problems, the theory of numerical methods, numerical algebra, and scientific computing. These problems are studied at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) in ICS for personal computers. In this work the results on the special database development for the ICS "INM RAS - Black Sea" are presented. The input information for the ICS is discussed and some special data processing procedures are described. Results of forecasts using the ICS "INM RAS - Black Sea" with operational observation data assimilation are also presented. This study was supported by the Russian Foundation for Basic Research (project No. 13-01-00753) and by the Presidium Program of the Russian Academy of Sciences (project P-23 "Black Sea as an imitational ocean model"). References: 1. V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 5-31. 2. E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 69-94. 3. V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of Black Sea and Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 95-111. 4. Agoshkov V.I., Assovsky M.B., Giniatulin S.V., Zakharova N.B., Kuimov G.V., Parmuzin E.I., Fomin V.V. Informational Computational system of variational assimilation of observation data "INM RAS - Black sea" // Ecological safety of coastal and shelf zones and complex use of shelf resources: Collection of scientific works. Issue 26, Volume 2. National Academy of Sciences of Ukraine, Marine Hydrophysical Institute, Sebastopol, 2012. Pages 352-360. (In Russian)
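For orientation, variational assimilation of observations such as sea surface temperature is commonly posed as minimization of a cost functional of the generic 3D-Var form

\[ J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_{b})^{\mathsf{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_{b}) + \tfrac{1}{2}\big(\mathbf{y}-H(\mathbf{x})\big)^{\mathsf{T}}\mathbf{R}^{-1}\big(\mathbf{y}-H(\mathbf{x})\big), \]

with \(\mathbf{x}_{b}\) the model background state, \(\mathbf{y}\) the observations, H the observation operator, and \(\mathbf{B}\), \(\mathbf{R}\) the background- and observation-error covariances (standard notation assumed here rather than taken from the cited papers); the adjoint-equation techniques mentioned above supply the gradient of J needed for the minimization.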
The Center for Nanophase Materials Sciences
NASA Astrophysics Data System (ADS)
Lowndes, Douglas
2005-03-01
The Center for Nanophase Materials Sciences (CNMS) located at Oak Ridge National Laboratory (ORNL) will be the first DOE Nanoscale Science Research Center to begin operation, with construction to be completed in April 2005 and initial operations in October 2005. The CNMS' scientific program has been developed through workshops with the national community, with the goal of creating a highly collaborative research environment to accelerate discovery and drive technological advances. Research at the CNMS is organized under seven Scientific Themes selected to address challenges to understanding and to exploit particular ORNL strengths (see http://cnms.ornl.gov). These include extensive synthesis and characterization capabilities for soft, hard, nanostructured, magnetic and catalytic materials and their composites; neutron scattering at the Spallation Neutron Source and High Flux Isotope Reactor; computational nanoscience in the CNMS' Nanomaterials Theory Institute and utilizing facilities and expertise of the Center for Computational Sciences and the new Leadership Scientific Computing Facility at ORNL; a new CNMS Nanofabrication Research Laboratory; and a suite of unique and state-of-the-art instruments to be made reliably available to the national community for imaging, manipulation, and properties measurements on nanoscale materials in controlled environments. The new research facilities will be described together with the planned operation of the user research program, the latter illustrated by the current ``jump start'' user program that utilizes existing ORNL/CNMS facilities.
1993 Annual report on scientific programs: A broad research program on the sciences of complexity
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1993-12-31
This report provides a summary of many of the research projects completed by the Santa Fe Institute (SFI) during 1993. These research efforts continue to focus on two general areas: the study of, and search for, underlying scientific principles governing complex adaptive systems, and the exploration of new theories of computation that incorporate natural mechanisms of adaptation (mutation, genetics, evolution).
ERIC Educational Resources Information Center
Stevenson, R. D.
This module is part of a series designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This module describes heat transfer processes involved in the exchange of heat…
Proceedings of the workshop on B physics at hadron accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
McBride, P.; Mishra, C.S.
1993-12-31
This report contains papers on the following topics: Measurement of Angle {alpha}; Measurement of Angle {beta}; Measurement of Angle {gamma}; Other B Physics; Theory of Heavy Flavors; Charged Particle Tracking and Vertexing; e and {gamma} Detection; Muon Detection; Hadron ID; Electronics, DAQ, and Computing; and Machine Detector Interface. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.
ERIC Educational Resources Information Center
Luse, Andy; Rursch, Julie A.; Jacobson, Doug
2014-01-01
In the United States, the number of students entering into and completing degrees in science, technology, engineering, and mathematics (STEM) areas has declined significantly over the past decade. Although modest increases have been shown in enrollments in computer-related majors in the past 4 years, the prediction is that even in 3 to 4 years…
Signal Detection Analysis of Computer Enhanced Group Decision Making Strategies
2007-11-01
The recoverable text of this record consists of reference-list fragments on signal detection analysis of computer-enhanced group decision making: American Psychological Association (2002). Ethical principles of psychologists and code of conduct. Creelman, C. D. (2005). Detection theory: A user's guide (2nd ed.). Mahwah, NJ: Lawrence Erlbaum. Sorkin, R. D. (1998). Group performance depends on ... the majority rule. Psychological Science, 9, 456-463. Sorkin, R. D. (2001). Signal-detection analysis of group decision making.
Bayesian Nonlinear Assimilation of Eulerian and Lagrangian Coastal Flow Data
2015-09-30
Dr. Pierre F.J. Lermusiaux, Department of Mechanical Engineering, Center for Ocean Science and Engineering, Massachusetts ... Objectives recoverable from the record: develop and apply theory, schemes and computational systems for rigorous Bayesian nonlinear assimilation of Eulerian and Lagrangian coastal flow data; ... coastal ocean fields, both in Eulerian and Lagrangian forms; further develop and implement our GMM-DO schemes for robust Bayesian nonlinear estimation.
ERIC Educational Resources Information Center
Stevenson, R. D.
These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. Several modules in the thermodynamic series considered the application of the First Law to…
Fuzzy Logic and Education: Teaching the Basics of Fuzzy Logic through an Example (By Way of Cycling)
ERIC Educational Resources Information Center
Sobrino, Alejandro
2013-01-01
Fuzzy logic dates back to 1965 and it is related not only to current areas of knowledge, such as Control Theory and Computer Science, but also to traditional ones, such as Philosophy and Linguistics. Like any logic, fuzzy logic is concerned with argumentation, but unlike other modalities, which focus on the crisp reasoning of Mathematics, it deals…
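As a minimal illustration of the kind of basics the article teaches, here is a triangular fuzzy membership function in Python; the cycling-speed numbers are hypothetical and not taken from the article.

def triangular(x, a, b, c):
    # Degree to which x belongs to a fuzzy set with support [a, c] and peak at b.
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Membership of cycling speeds (km/h) in the fuzzy set "comfortable pace".
for speed in (8, 15, 22, 30):
    print(speed, round(triangular(speed, 10, 18, 28), 2))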
Holographic description of a quantum black hole on a computer.
Hanada, Masanori; Hyakutake, Yoshifumi; Ishiki, Goro; Nishimura, Jun
2014-05-23
Black holes have been predicted to radiate particles and eventually evaporate, which has led to the information loss paradox and implies that the fundamental laws of quantum mechanics may be violated. Superstring theory, a consistent theory of quantum gravity, provides a possible solution to the paradox if evaporating black holes can actually be described in terms of standard quantum mechanical systems, as conjectured from the theory. Here, we test this conjecture by calculating the mass of a black hole in the corresponding quantum mechanical system numerically. Our results agree well with the prediction from gravity theory, including the leading quantum gravity correction. Our ability to simulate black holes offers the potential to further explore the yet mysterious nature of quantum gravity through well-established quantum mechanics. Copyright © 2014, American Association for the Advancement of Science.
Hafner, Jürgen
2010-09-29
During the last 20 years computer simulations based on a quantum-mechanical description of the interactions between electrons and atomic nuclei have had an increasingly important impact on materials science, not only in promoting a deeper understanding of the fundamental physical phenomena, but also enabling the computer-assisted design of materials for future technologies. The backbone of atomic-scale computational materials science is density-functional theory (DFT) which allows us to cast the intractable complexity of electron-electron interactions into the form of an effective single-particle equation determined by the exchange-correlation functional. Progress in DFT-based calculations of the properties of materials and of simulations of processes in materials depends on: (1) the development of improved exchange-correlation functionals and advanced post-DFT methods and their implementation in highly efficient computer codes, (2) the development of methods allowing us to bridge the gaps in the temperature, pressure, time and length scales between the ab initio calculations and real-world experiments and (3) the extension of the functionality of these codes, permitting us to treat additional properties and new processes. In this paper we discuss the current status of techniques for performing quantum-based simulations on materials and present some illustrative examples of applications to complex quasiperiodic alloys, cluster-support interactions in microporous acid catalysts and magnetic nanostructures.
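The effective single-particle equation referred to above is the Kohn-Sham equation, which in standard textbook form (atomic units, notation assumed here) reads

\[ \Big[ -\tfrac{1}{2}\nabla^{2} + v_{\mathrm{ext}}(\mathbf{r}) + v_{\mathrm{H}}(\mathbf{r}) + v_{\mathrm{xc}}(\mathbf{r}) \Big] \varphi_{i}(\mathbf{r}) = \varepsilon_{i}\,\varphi_{i}(\mathbf{r}), \qquad v_{\mathrm{xc}}(\mathbf{r}) = \frac{\delta E_{\mathrm{xc}}[n]}{\delta n(\mathbf{r})}, \qquad n(\mathbf{r}) = \sum_{i}^{\mathrm{occ}} |\varphi_{i}(\mathbf{r})|^{2}, \]

so that all of the electron-electron complexity is carried by the exchange-correlation functional \(E_{\mathrm{xc}}[n]\), whose approximation is exactly the point addressed in item (1) above.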
The emergence of mind and brain: an evolutionary, computational, and philosophical approach.
Mainzer, Klaus
2008-01-01
Modern philosophy of mind cannot be understood without recent developments in computer science, artificial intelligence (AI), robotics, neuroscience, biology, linguistics, and psychology. Classical philosophy of formal languages as well as symbolic AI assume that all kinds of knowledge must explicitly be represented by formal or programming languages. This assumption is limited by recent insights into the biology of evolution and developmental psychology of the human organism. Most of our knowledge is implicit and unconscious. It is not formally represented, but embodied knowledge, which is learnt by doing and understood by bodily interacting with changing environments. That is true not only for low-level skills, but even for high-level domains of categorization, language, and abstract thinking. The embodied mind is considered an emergent capacity of the brain as a self-organizing complex system. Actually, self-organization has been a successful strategy of evolution to handle the increasing complexity of the world. Genetic programs are not sufficient and cannot prepare the organism for all kinds of complex situations in the future. Self-organization and emergence are fundamental concepts in the theory of complex dynamical systems. They are also applied in organic computing as a recent research field of computer science. Therefore, cognitive science, AI, and robotics try to model the embodied mind in an artificial evolution. The paper analyzes these approaches in the interdisciplinary framework of complex dynamical systems and discusses their philosophical impact.
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
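As a small worked example in the spirit of the chapter (the sequence below is made up), the maximum-likelihood estimate of a first-order Markov chain's transition probabilities is simply the normalized transition counts:

from collections import Counter, defaultdict

sequence = "ACGTACGGTCAACGTTACGA"  # toy nucleotide sequence

counts = defaultdict(Counter)
for a, b in zip(sequence, sequence[1:]):
    counts[a][b] += 1

# MLE of P(b | a): transition counts normalized per source state.
transition = {a: {b: n / sum(c.values()) for b, n in c.items()}
              for a, c in counts.items()}
for a in sorted(transition):
    print(a, {b: round(p, 2) for b, p in sorted(transition[a].items())})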
NASA Astrophysics Data System (ADS)
Donnay, Karsten
2015-03-01
The past several years have seen a rapidly growing interest in the use of advanced quantitative methodologies and formalisms adapted from the natural sciences to study a broad range of social phenomena. The research field of computational social science [1,2], for example, uses digital artifacts of human online activity to cast a new light on social dynamics. Similarly, the studies reviewed by D'Orsogna and Perc showcase a diverse set of advanced quantitative techniques to study the dynamics of crime. Methods used range from partial differential equations and self-exciting point processes to agent-based models, evolutionary game theory and network science [3].
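One of the formalisms mentioned, the self-exciting point process, is usually specified through a conditional intensity of the generic Hawkes form

\[ \lambda(t) = \mu + \sum_{t_{i} < t} \alpha\, e^{-\beta (t - t_{i})}, \]

where \(\mu\) is a background rate and each past event at time \(t_{i}\) temporarily raises the rate of further events (illustrative notation; the reviewed studies use their own parameterizations). The clustering this produces is a natural model for, e.g., repeat and near-repeat burglaries.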
NASA Astrophysics Data System (ADS)
Pinnick, Cassandra L.
2008-11-01
This paper examines the relation between situated cognition theory in science education, and feminist standpoint theory in philosophy of science. It shows that situated cognition is an idea borrowed from a long since discredited philosophy of science. It argues that feminist standpoint theory ought not be indulged as it is a failed challenge to traditional philosophy of science. Standpoint theory diverts attention away from the abiding educational and career needs of women in science. In the interest of women in science, and in the interest of science, science educators would do best for their constituencies by a return to feminist philosophy understood as the demand for equal access and a level playing field for women in science and society.
Sequential visibility-graph motifs
NASA Astrophysics Data System (ADS)
Iacovacci, Jacopo; Lacasa, Lucas
2016-04-01
Visibility algorithms transform time series into graphs and encode dynamical information in their topology, paving the way for graph-theoretical time series analysis as well as building a bridge between nonlinear dynamics and network science. In this work we introduce and study the concept of sequential visibility-graph motifs, smaller substructures of n consecutive nodes that appear with characteristic frequencies. We develop a theory to compute in an exact way the motif profiles associated with general classes of deterministic and stochastic dynamics. We find that this simple property is indeed a highly informative and computationally efficient feature capable of distinguishing among different dynamics and robust against noise contamination. We finally confirm that it can be used in practice to perform unsupervised learning, by extracting motif profiles from experimental heart-rate series and being able, accordingly, to disentangle meditative from other relaxation states. Applications of this general theory include the automatic classification and description of physical, biological, and financial time series.
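A minimal Python sketch of the underlying construction: the natural visibility criterion of Lacasa et al. and a naive count of sequential motifs over windows of n = 3 consecutive nodes. The time series and window size are illustrative, not taken from the paper.

from collections import Counter

def visible(t, y, i, j):
    # Natural visibility: i and j see each other if every intermediate point
    # lies strictly below the straight line joining (t[i], y[i]) and (t[j], y[j]).
    return all(
        y[k] < y[j] + (y[i] - y[j]) * (t[j] - t[k]) / (t[j] - t[i])
        for k in range(i + 1, j)
    )

def sequential_motif_profile(t, y, n=3):
    # Slide a window of n consecutive nodes and count each visibility-edge pattern.
    profile = Counter()
    for s in range(len(y) - n + 1):
        edges = frozenset(
            (i - s, j - s)
            for i in range(s, s + n)
            for j in range(i + 1, s + n)
            if visible(t, y, i, j)
        )
        profile[edges] += 1
    return profile

t = list(range(10))
y = [3.0, 1.0, 4.0, 1.5, 5.0, 9.0, 2.0, 6.0, 5.0, 3.5]
for motif, count in sequential_motif_profile(t, y).items():
    print(sorted(motif), count)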
NASA Astrophysics Data System (ADS)
Bagdonas, Alexandre; Silva, Cibelle Celestino
2015-11-01
Educators advocate that science education can help the development of more responsible worldviews when students learn not only scientific concepts, but also about science, or "nature of science". Cosmology can help the formation of worldviews because this topic is embedded in socio-cultural and religious issues. Indeed, during the Cold War period, the cosmological controversy between Big Bang and Steady State theory was tied up with political and religious arguments. The present paper discusses a didactic sequence developed for and applied in a pre-service science teacher-training course on history of science. After studying the historical case, pre-service science teachers discussed how to deal with possible conflicts between scientific views and students' personal worldviews related to religion. The course focused on the study of primary and secondary sources about cosmology and religion written by cosmologists such as Georges Lemaître and Fred Hoyle, as well as by Pope Pius XII. We used didactic strategies such as short seminars given by groups of pre-service teachers, videos, computer simulations, role-play, debates and preparation of written essays. Throughout the course, most pre-service teachers emphasized differences between science and religion and pointed out that they do not feel prepared to conduct classroom discussions about this topic. Discussing the relations between science and religion using the history of cosmology proved an effective way not only to teach science concepts but also to stimulate reflections about the nature of science. This topic may contribute to increasing students' critical stance on controversial issues, without the need to explicitly defend certain positions, or disapprove of students' cultural traditions. Moreover, pre-service teachers practiced didactic strategies to deal with this kind of unusual content.
Towards An Integrative Theory Of Consciousness: Part 2 (An Anthology Of Various Other Models)
De Sousa, Avinash
2013-01-01
The study of consciousness has today moved beyond neurobiology and cognitive models. In the past few years, there has been a surge of research into various newer areas. The present article looks at the non-neurobiological and non-cognitive theories regarding this complex phenomenon, especially ones that self-psychology, self-theory, artificial intelligence, quantum physics, visual cognitive science and philosophy have to offer. Self-psychology has proposed the need to understand the self and its development, and the ramifications of the self for morality and empathy, which will help us understand consciousness better. There have been inroads made from the fields of computer science, machine technology and artificial intelligence, including robotics, into understanding the consciousness of these machines and their implications for human consciousness. These areas are explored. Visual cortex and emotional theories along with their implications are discussed. The phylogeny and evolution of the phenomenon of consciousness is also highlighted, with theories on the emergence of consciousness in fetal and neonatal life. Quantum physics and its insights into the mind, along with the implications of consciousness and physics and their interface are debated. The role of neurophilosophy to understand human consciousness, the functions of such a concept, embodiment, the dark side of consciousness, future research needs and limitations of a scientific theory of consciousness complete the review. The importance and salient features of each theory are discussed along with certain pitfalls, if present. A need for the integration of various theories to understand consciousness from a holistic perspective is stressed. PMID:23678242
Towards an integrative theory of consciousness: part 2 (an anthology of various other models).
De Sousa, Avinash
2013-01-01
The study of consciousness has today moved beyond neurobiology and cognitive models. In the past few years, there has been a surge of research into various newer areas. The present article looks at the non-neurobiological and non-cognitive theories regarding this complex phenomenon, especially ones that self-psychology, self-theory, artificial intelligence, quantum physics, visual cognitive science and philosophy have to offer. Self-psychology has proposed the need to understand the self and its development, and the ramifications of the self for morality and empathy, which will help us understand consciousness better. There have been inroads made from the fields of computer science, machine technology and artificial intelligence, including robotics, into understanding the consciousness of these machines and their implications for human consciousness. These areas are explored. Visual cortex and emotional theories along with their implications are discussed. The phylogeny and evolution of the phenomenon of consciousness is also highlighted, with theories on the emergence of consciousness in fetal and neonatal life. Quantum physics and its insights into the mind, along with the implications of consciousness and physics and their interface are debated. The role of neurophilosophy to understand human consciousness, the functions of such a concept, embodiment, the dark side of consciousness, future research needs and limitations of a scientific theory of consciousness complete the review. The importance and salient features of each theory are discussed along with certain pitfalls, if present. A need for the integration of various theories to understand consciousness from a holistic perspective is stressed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ginovska-Pangovska, Bojana; Autrey, Thomas; Parab, Kshitij K.
We report on a combined computational and experimental study of the activation of hydrogen using 2,6-lutidine (Lut)/BCl3 Lewis pairs. Herein we describe the synthetic approach used to obtain a new FLP, Lut-BCl3, that activates molecular H2 at ~10 bar, 100 °C in toluene or lutidine as the solvent. The resulting compound is an unexpected neutral hydride, LutBHCl2, rather than the ion pair, which we attribute to ligand redistribution. The mechanism for activation was modeled with density functional theory and accurate G3(MP2)B3 theory. The dative bond in Lut-BCl3 is calculated to have a bond enthalpy of 15 kcal/mol. The separated pair is calculated to react with H2 and form the [LutH+][HBCl3–] ion pair with a barrier of 13 kcal/mol. Metathesis with LutBCl3 produces LutBHCl2 and [LutH][BCl4]. The overall reaction is exothermic by 8.5 kcal/mol. An alternative pathway was explored involving a lutidine–borenium cation pair activating H2. This work was supported by the U.S. Department of Energy's (DOE) Office of Basic Energy Sciences, Division of Chemical Sciences, Biosciences, and Geosciences, and was performed in part using the Molecular Science Computing Facility (MSCF) in the William R. Wiley Environmental Molecular Sciences Laboratory, a DOE national scientific user facility sponsored by the Department of Energy's Office of Biological and Environmental Research and located at the Pacific Northwest National Laboratory (PNNL). PNNL is operated by Battelle for DOE.
The fourth International Conference on Information Science and Cloud Computing
NASA Astrophysics Data System (ADS)
This book comprises the papers accepted by the fourth International Conference on Information Science and Cloud Computing (ISCC), which was held on 18-19 December 2015 in Guangzhou, China. It contains 70 papers divided into four parts. The first part focuses on Information Theory, with 20 papers; the second part covers Machine Learning, with 21 papers; the third part contains a further 21 papers in the area of Control Science; and the last part, with 8 papers, is dedicated to Cloud Science. Each part can be used as an excellent reference by engineers, researchers and students who need to build a knowledge base of the most current advances and state-of-practice in the topics covered by the ISCC conference. Special thanks go to Professor Deyu Qi, General Chair of ISCC 2015, for his leadership in supervising the organization of the entire conference; Professor Tinghuai Ma, Program Chair, and the members of the program committee for evaluating all the submissions and ensuring the selection of only the highest quality papers; and the authors for sharing their ideas, results and insights. We sincerely hope that you enjoy reading the papers included in this book.
Embodiment and Human Development.
Marshall, Peter J
2016-12-01
We are recognizing increasingly that the study of cognitive, social, and emotional processes must account for their embodiment in living, acting beings. The related field of embodied cognition (EC) has coalesced around dissatisfaction with the lack of attention to the body in cognitive science. For developmental scientists, the emphasis in the literature on adult EC on the role of the body in cognition may not seem particularly novel, given that bodily action was central to Piaget's theory of cognitive development. However, as the influence of the Piagetian account waned, developmental notions of embodiment were shelved in favor of mechanical computational approaches. In this article, I argue that by reconsidering embodiment, we can address a key issue with computational accounts: how meaning is constructed by the developing person. I also suggest that the process-relational approach to developmental systems can provide a system of concepts for framing a fully embodied, integrative developmental science.
Embodiment and Human Development
Marshall, Peter J.
2016-01-01
We are recognizing increasingly that the study of cognitive, social, and emotional processes must account for their embodiment in living, acting beings. The related field of embodied cognition (EC) has coalesced around dissatisfaction with the lack of attention to the body in cognitive science. For developmental scientists, the emphasis in the literature on adult EC on the role of the body in cognition may not seem particularly novel, given that bodily action was central to Piaget’s theory of cognitive development. However, as the influence of the Piagetian account waned, developmental notions of embodiment were shelved in favor of mechanical computational approaches. In this article, I argue that by reconsidering embodiment, we can address a key issue with computational accounts: how meaning is constructed by the developing person. I also suggest that the process-relational approach to developmental systems can provide a system of concepts for framing a fully embodied, integrative developmental science. PMID:27833651
Computing exponentially faster: implementing a non-deterministic universal Turing machine using DNA
Currin, Andrew; Korovin, Konstantin; Ababi, Maria; Roper, Katherine; Kell, Douglas B.; Day, Philip J.
2017-01-01
The theory of computer science is based around universal Turing machines (UTMs): abstract machines able to execute all possible algorithms. Modern digital computers are physical embodiments of classical UTMs. For the most important class of problem in computer science, non-deterministic polynomial complete problems, non-deterministic UTMs (NUTMs) are theoretically exponentially faster than both classical UTMs and quantum mechanical UTMs (QUTMs). However, no attempt has previously been made to build an NUTM, and their construction has been regarded as impossible. Here, we demonstrate the first physical design of an NUTM. This design is based on Thue string rewriting systems, and thereby avoids the limitations of most previous DNA computing schemes: all the computation is local (simple edits to strings) so there is no need for communication, and there is no need to order operations. The design exploits DNA's ability to replicate to execute an exponential number of computational paths in P time. Each Thue rewriting step is embodied in a DNA edit implemented using a novel combination of polymerase chain reactions and site-directed mutagenesis. We demonstrate that the design works using both computational modelling and in vitro molecular biology experimentation: the design is thermodynamically favourable, microprogramming can be used to encode arbitrary Thue rules, all classes of Thue rule can be implemented, and non-deterministic rule implementation is demonstrated. In an NUTM, the resource limitation is space, which contrasts with classical UTMs and QUTMs, where it is time. This fundamental difference enables an NUTM to trade space for time, which is significant for both theoretical computer science and physics. It is also of practical importance, for, to quote Richard Feynman, ‘there's plenty of room at the bottom’. This means that a desktop DNA NUTM could potentially utilize more processors than all the electronic computers in the world combined, and thereby outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy. PMID:28250099
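The abstract describes computation by non-deterministic application of Thue (string-rewriting) rules, with DNA replication exploring all branches in parallel. The sketch below is only a software analogue under assumed toy rules: it enumerates the same branching search breadth-first, which illustrates the rewriting model but not the DNA implementation, and takes exponential time in the worst case where the physical NUTM would follow all branches at once.

```python
from collections import deque

def thue_reachable(start, target, rules, max_steps=10):
    """Breadth-first exploration of every rewrite choice of a Thue system:
    at each step, any rule may be applied at any position of the string."""
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        s, depth = frontier.popleft()
        if s == target:
            return True
        if depth == max_steps:
            continue
        for lhs, rhs in rules:
            pos = s.find(lhs)
            while pos != -1:
                t = s[:pos] + rhs + s[pos + len(lhs):]
                if t not in seen:
                    seen.add(t)
                    frontier.append((t, depth + 1))
                pos = s.find(lhs, pos + 1)
    return False

# toy rule set: 'ab' and 'ba' may be swapped, 'aa' may be erased
rules = [("ab", "ba"), ("ba", "ab"), ("aa", "")]
print(thue_reachable("abab", "bb", rules))  # True: abab -> baab -> bb
```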
A survey of visual preprocessing and shape representation techniques
NASA Technical Reports Server (NTRS)
Olshausen, Bruno A.
1988-01-01
Many recent theories and methods proposed for visual preprocessing and shape representation are summarized. The survey brings together research from the fields of biology, psychology, computer science, electrical engineering, and most recently, neural networks. It was motivated by the need to preprocess images for a sparse distributed memory (SDM), but the techniques presented may also prove useful for applying other associative memories to visual pattern recognition. The material of this survey is divided into three sections: an overview of biological visual processing; methods of preprocessing (extracting parts of shape, texture, motion, and depth); and shape representation and recognition (form invariance, primitives and structural descriptions, and theories of attention).
The theory of constructed emotion: an active inference account of interoception and categorization
2017-01-01
Abstract The science of emotion has been using folk psychology categories derived from philosophy to search for the brain basis of emotion. The last two decades of neuroscience research have brought us to the brink of a paradigm shift in understanding the workings of the brain, however, setting the stage to revolutionize our understanding of what emotions are and how they work. In this article, we begin with the structure and function of the brain, and from there deduce what the biological basis of emotions might be. The answer is a brain-based, computational account called the theory of constructed emotion. PMID:27798257
Multi-scale and Multi-physics Numerical Methods for Modeling Transport in Mesoscopic Systems
2014-10-13
function and wide-band fast multipole methods for Hankel waves; (2) a new linear-scaling discontinuous Galerkin density functional theory, which provides a...inflow boundary condition for Wigner quantum transport equations. Also, a book titled "Computational Methods for Electromagnetic Phenomena...equations in layered media with FMM for Bessel functions, Science China Mathematics, (12 2013): 2561.
Computational Science: Ensuring America’s Competitiveness
2005-06-01
Supercharging U.S. Innovation & Competitiveness, Washington, D.C., July 2004. Davies, C. T. H., et al., "High-Precision Lattice QCD Confronts Experiment...together to form a class of particles called hadrons (that include protons and neutrons). For 30 years, researchers in lattice QCD have been trying to use the basic QCD equations to calculate the properties of hadrons, especially their masses, using numerical lattice gauge theory calculations in order to
NASA Astrophysics Data System (ADS)
Gutowitz, Howard
1991-08-01
Cellular automata, dynamic systems in which space and time are discrete, are yielding interesting applications in both the physical and natural sciences. The thirty-four contributions in this book cover many aspects of contemporary studies on cellular automata and include reviews, research reports, and guides to recent literature and available software. Chapters cover mathematical analysis; the structure of the space of cellular automata; learning rules with specified properties; cellular automata in biology, physics, chemistry, and computation theory; and generalizations of cellular automata in neural nets, Boolean nets, and coupled map lattices. Current work on cellular automata may be viewed as revolving around two central and closely related problems: the forward problem and the inverse problem. The forward problem concerns the description of properties of given cellular automata. Properties considered include reversibility, invariants, criticality, fractal dimension, and computational power. The role of cellular automata in computation theory is seen as a particularly exciting venue for exploring parallel computers as theoretical and practical tools in mathematical physics. The inverse problem, an area of study gaining prominence particularly in the natural sciences, involves designing rules that possess specified properties or perform specified tasks. A long-term goal is to develop a set of techniques that can find a rule or set of rules that can reproduce quantitative observations of a physical system. Studies of the inverse problem take up the organization and structure of the set of automata, in particular the parameterization of the space of cellular automata. Optimization and learning techniques, like the genetic algorithm and adaptive stochastic cellular automata, are applied to find cellular automaton rules that model such physical phenomena as crystal growth or perform such adaptive-learning tasks as balancing an inverted pole. Howard Gutowitz is Collaborateur in the Service de Physique du Solide et Résonance Magnétique, Commissariat à l'Energie Atomique, Saclay, France.
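For readers unfamiliar with the "forward problem" mentioned here (describing the behaviour of a given rule), the sketch below evolves an elementary one-dimensional cellular automaton in Wolfram's numbering; the choice of rule 30, grid size, and single-cell initial condition are arbitrary illustrative choices.

```python
import numpy as np

def elementary_ca(rule_number, width=101, steps=50):
    """Evolve a one-dimensional, two-state, nearest-neighbour cellular automaton
    (an 'elementary' CA) from a single live cell, with periodic boundaries."""
    rule = np.array([(rule_number >> i) & 1 for i in range(8)], dtype=np.uint8)
    state = np.zeros(width, dtype=np.uint8)
    state[width // 2] = 1
    history = [state.copy()]
    for _ in range(steps):
        left, right = np.roll(state, 1), np.roll(state, -1)
        neighbourhood = 4 * left + 2 * state + right   # 3-bit index in 0..7
        state = rule[neighbourhood]
        history.append(state.copy())
    return np.array(history)

# rule 110 is computationally universal; rule 30 looks pseudo-random
for row in elementary_ca(30, width=31, steps=15):
    print("".join("#" if c else "." for c in row))
```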
PREFACE: Euro-TMCS I: Theory, Modelling and Computational Methods for Semiconductors
NASA Astrophysics Data System (ADS)
Gómez-Campos, F. M.; Rodríguez-Bolívar, S.; Tomić, S.
2015-05-01
The present issue contains a selection of the best contributed works presented at the first Euro-TMCS conference (Theory, Modelling and Computational Methods for Semiconductors, European Session). The conference was held at the Faculty of Sciences, Universidad de Granada, Spain, on 28th-30th January 2015. This conference is the first European edition of the TMCS conference series, which started in 2008 at the University of Manchester and had until now always been held in the United Kingdom. Four previous conferences have been held (Manchester 2008, York 2010, Leeds 2012 and Salford 2014). Euro-TMCS runs for three days; the first is devoted to invited tutorials, aimed particularly at students, on recent developments in theoretical methods. On this occasion the session focused on the presentation of widely-used computational methods for the modelling of physical processes in semiconductor materials. Freely available simulation software (SIESTA, Quantum Espresso and Yambo) as well as commercial software (TiberCad and MedeA) were presented in the conference by members of their development teams, offering the audience an overview of their capabilities for research. The second part of the conference showcased prestigious invited and contributed oral presentations, alongside poster sessions, in which direct discussion with authors was promoted. The scope of this conference embraces modelling, theory and the use of sophisticated computational tools in semiconductor science and technology. Theoretical approaches represented in this meeting included: Density Functional Theory, Semi-empirical Electronic Structure Methods, Multi-scale Approaches, Modelling of PV devices, Electron Transport, and Graphene. Topics included, but were not limited to: Optical Properties of Quantum Nanostructures including Colloids and Nanotubes, Plasmonics, Magnetic Semiconductors, Photonic Structures, and Electronic Devices. The Editors Acknowledgments: We would like to thank all participants for making this a very successful meeting and for their contribution to the conference programme and these proceedings. We would also like to acknowledge the financial support from Universidad de Granada, the CECAM UK-Hartree Node, project TEC2013-47283-R of Ministerio de Economía y Competitividad, and the company Materials Design (distributors of the MedeA software). Conference Organising Committee: Francisco M. Gómez-Campos (Co-chair, Universidad de Granada) Salvador Rodríguez-Bolívar (Co-chair, Universidad de Granada) Stanko Tomić (Co-chair, University of Salford)
ERIC Educational Resources Information Center
Pinnick, Cassandra L.
2008-01-01
This paper examines the relation between situated cognition theory in science education, and feminist standpoint theory in philosophy of science. It shows that situated cognition is an idea borrowed from a long since discredited philosophy of science. It argues that feminist standpoint theory ought not be indulged as it is a failed challenge to…
Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach
Cheung, Mike W.-L.; Jak, Suzanne
2016-01-01
Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639
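The paper demonstrates its procedures in R; as a language-neutral illustration of the same split/analyze/meta-analyze idea, the sketch below splits a large data frame, estimates a regression slope in each split, and pools the estimates with a fixed-effect inverse-variance meta-analysis. The column names, the simple-regression statistic, and the simulated data are assumptions for illustration only, not the analyses reported in the study.

```python
import numpy as np
import pandas as pd

def split_analyze_meta(df, outcome, predictor, n_splits=10, seed=0):
    """Split the data into random subsets, estimate the same statistic in each
    subset (here: a simple regression slope), then pool the estimates with a
    fixed-effect inverse-variance meta-analysis."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(df))
    estimates, variances = [], []
    for chunk in np.array_split(idx, n_splits):
        sub = df.iloc[chunk]
        x, y = sub[predictor].to_numpy(), sub[outcome].to_numpy()
        x_c = x - x.mean()
        slope = (x_c @ (y - y.mean())) / (x_c @ x_c)
        resid = y - y.mean() - slope * x_c
        var = resid.var(ddof=2) / (x_c @ x_c)     # sampling variance of the slope
        estimates.append(slope)
        variances.append(var)
    w = 1.0 / np.asarray(variances)
    pooled = np.sum(w * np.asarray(estimates)) / np.sum(w)
    return pooled, np.sqrt(1.0 / np.sum(w))       # pooled slope and its SE

# hypothetical usage on a large data frame with columns 'y' and 'x'
rng = np.random.default_rng(1)
big = pd.DataFrame({"x": rng.normal(size=100_000)})
big["y"] = 0.3 * big["x"] + rng.normal(size=100_000)
print(split_analyze_meta(big, outcome="y", predictor="x"))
```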
Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.
Cheung, Mike W-L; Jak, Suzanne
2016-01-01
Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
Mechanisms and Dynamics of Abiotic and Biotic Interactions at Environmental Interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosso, Kevin M.
The Stanford EMSI (SEMSI) was established in 2004 through joint funding by the National Science Foundation and the OBER-ERSD. It encompasses a number of universities and national laboratories. The PNNL component of the SEMSI is funded by ERSD and is the focus of this report. This component has the objective of providing theory support to the SEMSI by bringing computational capabilities and expertise to bear on important electron transfer problems at mineral/water and mineral/microbe interfaces. PNNL staff member Dr. Kevin Rosso, who is also "matrixed" into the Environmental Molecular Sciences Laboratory (EMSL) at PNNL, is a co-PI on the SEMSI project and the PNNL lead. The EMSL computational facilities being applied to the SEMSI project include the 11.8 teraflop massively-parallel supercomputer. Science goals of this EMSL/SEMSI partnership include advancing our understanding of: (1) The kinetics of U(VI) and Cr(VI) reduction by aqueous and solid-phase Fe(II), (2) The structure of mineral surfaces in equilibrium with solution, and (3) Mechanisms of bacterial electron transfer to iron oxide surfaces via outer-membrane cytochromes.
High-order hydrodynamic algorithms for exascale computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morgan, Nathaniel Ray
Hydrodynamic algorithms are at the core of many laboratory missions ranging from simulating ICF implosions to climate modeling. The hydrodynamic algorithms commonly employed at the laboratory and in industry (1) typically lack requisite accuracy for complex multi-material vortical flows and (2) are not well suited for exascale computing due to poor data locality and poor FLOP/memory ratios. Exascale computing requires advances in both computer science and numerical algorithms. We propose to research the second requirement and create a new high-order hydrodynamic algorithm that has superior accuracy, excellent data locality, and excellent FLOP/memory ratios. This proposal will impact a broad range of research areas including numerical theory, discrete mathematics, vorticity evolution, gas dynamics, interface instability evolution, turbulent flows, fluid dynamics and shock driven flows. If successful, the proposed research has the potential to radically transform simulation capabilities and help position the laboratory for computing at the exascale.
Markowitz, Dina G; DuPré, Michael J
2007-01-01
The University of Rochester's Graduate Experience in Science Education (GESE) course familiarizes biomedical science graduate students interested in pursuing academic career tracks with a fundamental understanding of some of the theory, principles, and concepts of science education. This one-semester elective course provides graduate students with practical teaching and communication skills to help them better relate science content to, and increase their confidence in, their own teaching abilities. The 2-h weekly sessions include an introduction to cognitive hierarchies, learning styles, and multiple intelligences; modeling and coaching some practical aspects of science education pedagogy; lesson-planning skills; an introduction to instructional methods such as case studies and problem-based learning; and use of computer-based instructional technologies. It is hoped that the early development of knowledge and skills about teaching and learning will encourage graduate students to continue their growth as educators throughout their careers. This article summarizes the GESE course and presents evidence on the effectiveness of this course in providing graduate students with information about teaching and learning that they will use throughout their careers.
DuPré, Michael J.
2007-01-01
The University of Rochester's Graduate Experience in Science Education (GESE) course familiarizes biomedical science graduate students interested in pursuing academic career tracks with a fundamental understanding of some of the theory, principles, and concepts of science education. This one-semester elective course provides graduate students with practical teaching and communication skills to help them better relate science content to, and increase their confidence in, their own teaching abilities. The 2-h weekly sessions include an introduction to cognitive hierarchies, learning styles, and multiple intelligences; modeling and coaching some practical aspects of science education pedagogy; lesson-planning skills; an introduction to instructional methods such as case studies and problem-based learning; and use of computer-based instructional technologies. It is hoped that the early development of knowledge and skills about teaching and learning will encourage graduate students to continue their growth as educators throughout their careers. This article summarizes the GESE course and presents evidence on the effectiveness of this course in providing graduate students with information about teaching and learning that they will use throughout their careers. PMID:17785406
NASA Astrophysics Data System (ADS)
Nakatsuji, Hiroshi
Chemistry is a science of the complex subjects that occupy this universe and the biological world and that are composed of atoms and molecules. Its essence is diversity. Surprisingly, however, the whole of this science is governed by simple quantum principles like the Schrödinger and the Dirac equations. Therefore, if we can find a useful general method of solving these quantum principles under the fermionic and/or bosonic constraints, accurately and at a reasonable speed, we can replace the somewhat empirical methodologies of this science with purely quantum theoretical and computational logic. This is the purpose of our series of studies, called "exact theory" in our laboratory. Some of our documents are cited below. The key idea was expressed as the free complement (FC) theory (originally called ICI theory), which was introduced to solve the Schrödinger and Dirac equations analytically. For extending this methodology to larger systems, order-N methodologies are essential, but the antisymmetry constraints for electronic wave functions become a major obstacle. Recently, we have shown that the antisymmetry rule, or "dogma", can be considerably relaxed when our subjects are large molecular systems. In this talk, I present our recent progress in our FC methodology. The purpose is to construct "predictive quantum chemistry" that is useful in chemical and physical research and development in institutes and industries.
NASA Technical Reports Server (NTRS)
Fischer, James R.; Grosch, Chester; Mcanulty, Michael; Odonnell, John; Storey, Owen
1987-01-01
NASA's Office of Space Science and Applications (OSSA) gave a select group of scientists the opportunity to test and implement their computational algorithms on the Massively Parallel Processor (MPP) located at Goddard Space Flight Center, beginning in late 1985. One year later, the Working Group presented its report, which addressed the following: algorithms, programming languages, architecture, programming environments, how theory relates to practice, and measured performance. The findings point to a number of demonstrated computational techniques for which the MPP architecture is ideally suited. For example, besides executing much faster on the MPP than on conventional computers, systolic VLSI simulation (where distances are short), lattice simulation, neural network simulation, and image problems were found to be easier to program on the MPP's architecture than on a CYBER 205 or even a VAX. The report also makes technical recommendations covering all aspects of MPP use, and recommendations concerning the future of the MPP and machines based on similar architectures, expansion of the Working Group, and study of the role of future parallel processors for space station, EOS, and the Great Observatories era.
Thill, Serge; Padó, Sebastian; Ziemke, Tom
2014-07-01
The recent trend in cognitive robotics experiments on language learning, symbol grounding, and related issues necessarily entails a reduction of sensorimotor aspects from those provided by a human body to those that can be realized in machines, limiting robotic models of symbol grounding in this respect. Here, we argue that there is a need for modeling work in this domain to explicitly take into account the richer human embodiment even for concrete concepts that prima facie relate merely to simple actions, and illustrate this using distributional methods from computational linguistics which allow us to investigate grounding of concepts based on their actual usage. We also argue that these techniques have applications in theories and models of grounding, particularly in machine implementations thereof. Similarly, considering the grounding of concepts in human terms may be of benefit to future work in computational linguistics, in particular in going beyond "grounding" concepts in the textual modality alone. Overall, we highlight the overall potential for a mutually beneficial relationship between the two fields. Copyright © 2014 Cognitive Science Society, Inc.
Know Your Discipline: Teaching the Philosophy of Computer Science
ERIC Educational Resources Information Center
Tedre, Matti
2007-01-01
The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…
Legislator voting and behavioral science theory: a systematic review.
Tung, Gregory J; Vernick, Jon S; Reiney, Erin V; Gielen, Andrea C
2012-11-01
To examine the application of behavioral science theories to explain the voting behavior of legislators on public health policies, we conducted a systematic review to identify studies that examined factors associated with legislator support, intention to vote, or actual votes on public health policies, emphasizing those grounded in behavioral science theory. Twenty-one papers met our inclusion criteria, and six were explicitly grounded in a behavioral science theory. Behavioral science theories, and the theory of planned behavior in particular, provide a framework for understanding legislator voting behavior and can be used by advocates to advance pro-health policies.
NASA Astrophysics Data System (ADS)
Douglas, Jack
2014-03-01
One of the things that puzzled me when I was a PhD student working under Karl Freed was the curious unity between the theoretical descriptions of excluded volume interactions in polymers, the hydrodynamic properties of polymers in solution, and the critical properties of fluid mixtures, gases and diverse other materials (magnets, superfluids, etc.) when these problems were formally expressed in terms of Wiener path integration and the interactions treated through a combination of epsilon expansion and renormalization group (RG) theory. It seemed that only the interaction labels changed from one problem to the other. What do these problems have in common? Essential clues to these interrelations became apparent when Karl Freed, myself and Shi-Qing Wang together began to study polymers interacting with hyper-surfaces of continuously variable dimension, where the Feynman perturbation expansions could be performed through infinite order so that we could really understand what the RG theory was doing. It is evidently simply a particular method for resumming perturbation theory, and former ambiguities no longer existed. An integral equation extension of this type of exact calculation to "surfaces" of arbitrary fixed shape finally revealed the central mathematical object that links these diverse physical models: the capacity of polymer chains, whose value vanishes at the critical dimension of 4 and whose magnitude is linked to the friction coefficient of polymer chains, the virial coefficient of polymers and the 4-point function of the phi-4 field theory... Once this central object was recognized, it then became possible to solve diverse problems in material science through the calculation of capacity, and related "virial" properties, through Monte Carlo sampling of random walk paths. The essential ideas of this computational method are discussed and some applications given to non-trivial problems: nanotubes treated as either rigid rods or ensembles of worm-like chains having finite cross-section, DNA, nanoparticles with grafted chain layers, and knotted polymers. The path-integration method, which grew up from research in Karl Freed's group, is evidently a powerful tool for computing basic transport properties of complex-shaped objects and should find increasing application in polymer science, nanotechnology and biology.
Big Data Ecosystems Enable Scientific Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Critchlow, Terence J.; Kleese van Dam, Kerstin
Over the past 5 years, advances in experimental, sensor and computational technologies have driven exponential growth in the volumes, acquisition rates, variety and complexity of scientific data. As noted by Hey et al. in their 2009 e-book The Fourth Paradigm, this availability of large quantities of scientifically meaningful data has given rise to a new scientific methodology - data intensive science. Data intensive science is the ability to formulate and evaluate hypotheses using data and analysis to extend, complement and, at times, replace experimentation, theory, or simulation. This new approach to science no longer requires scientists to interact directly with the objects of their research; instead they can utilize digitally captured, reduced, calibrated, analyzed, synthesized and visualized results - allowing them to carry out 'experiments' in data.
JDFTx: Software for joint density-functional theory
Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.; ...
2017-11-14
Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.
JDFTx: Software for joint density-functional theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.
Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.
NASA Astrophysics Data System (ADS)
Celedón-Pattichis, Sylvia; LópezLeiva, Carlos Alfonso; Pattichis, Marios S.; Llamocca, Daniel
2013-12-01
There is a strong need in the United States to increase the number of students from underrepresented groups who pursue careers in Science, Technology, Engineering, and Mathematics. Drawing from sociocultural theory, we present approaches to establishing collaborations between computer engineering and mathematics/bilingual education faculty to address this need. We describe our work through the Advancing Out-of-School Learning in Mathematics and Engineering project by illustrating how an integrated curriculum that is based on mathematics with applications in image and video processing can be designed and how it can be implemented with middle school students from underrepresented groups.
New Frontiers in Language Evolution and Development.
Oller, D Kimbrough; Dale, Rick; Griebel, Ulrike
2016-04-01
This article introduces the Special Issue and its focus on research in language evolution with emphasis on theory as well as computational and robotic modeling. A key theme is based on the growth of evolutionary developmental biology or evo-devo. The Special Issue consists of 13 articles organized in two sections: A) Theoretical foundations and B) Modeling and simulation studies. All the papers are interdisciplinary in nature, encompassing work in biological and linguistic foundations for the study of language evolution as well as a variety of computational and robotic modeling efforts shedding light on how language may be developed and may have evolved. Copyright © 2016 Cognitive Science Society, Inc.
Majda, Andrew J; Abramov, Rafail; Gershgorin, Boris
2010-01-12
Climate change science focuses on predicting the coarse-grained, planetary-scale, longtime changes in the climate system due to either changes in external forcing or internal variability, such as the impact of increased carbon dioxide. The predictions of climate change science are carried out through comprehensive computational atmospheric and oceanic simulation models, which necessarily parameterize physical features such as clouds, sea ice cover, etc. Recently, it has been suggested that there is irreducible imprecision in such climate models that manifests itself as structural instability in climate statistics and which can significantly hamper the skill of computer models for climate change. A systematic approach to deal with this irreducible imprecision is advocated through algorithms based on the Fluctuation Dissipation Theorem (FDT). There are important practical and computational advantages for climate change science when a skillful FDT algorithm is established. The FDT response operator can be utilized directly for multiple climate change scenarios, multiple changes in forcing and other parameters such as damping, and for inverse modelling, without the need to run the complex climate model in each individual case. The high skill of FDT in predicting climate change, despite structural instability, is developed in an unambiguous fashion using mathematical theory as guidelines in three different test models: a generic class of analytical models mimicking the dynamical core of the computer climate models, reduced stochastic models for low-frequency variability, and models with a significant new type of irreducible imprecision involving many fast, unstable modes.
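A minimal sketch of the quasi-Gaussian FDT idea referred to here: the linear response operator is approximated from lagged covariances of an unperturbed run, R(tau) ≈ C(tau) C(0)^(-1). The toy two-dimensional linear stochastic model, step size, and lag range below are illustrative assumptions, not the test models used in the paper.

```python
import numpy as np

def fdt_response(traj, max_lag):
    """Quasi-Gaussian FDT estimate of the linear response operator from an
    unperturbed trajectory: R(tau) ~ C(tau) C(0)^(-1), where C(tau) is the
    lagged covariance matrix of the mean-removed state."""
    x = traj - traj.mean(axis=0)
    n = len(x)
    c0_inv = np.linalg.inv(x.T @ x / n)
    return [(x[lag:].T @ x[:n - lag] / (n - lag)) @ c0_inv
            for lag in range(max_lag + 1)]

# toy stochastic model dx = A x dt + noise, integrated with Euler-Maruyama
rng = np.random.default_rng(0)
A = np.array([[-1.0, 0.5], [-0.5, -0.5]])
dt, n_steps = 0.01, 100_000
x = np.zeros(2)
traj = np.empty((n_steps, 2))
for t in range(n_steps):
    x = x + A @ x * dt + np.sqrt(dt) * rng.standard_normal(2)
    traj[t] = x

R = fdt_response(traj, max_lag=200)
# the mean response to a small constant forcing f is then approximately
# sum over tau of R[tau] @ f * dt, without re-running the model per scenario
```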
Algorithmics - Is There Hope for a Unified Theory?
NASA Astrophysics Data System (ADS)
Hromkovič, Juraj
Computer science was born with the formal definition of the notion of an algorithm. This definition provides clear limits of automatization, separating problems into algorithmically solvable problems and algorithmically unsolvable ones. The second big bang of computer science was the development of the concept of computational complexity. People recognized that problems that do not admit efficient algorithms are not solvable in practice. The search for a reasonable, clear and robust definition of the class of practically solvable algorithmic tasks started with the notion of the class P and of NP-completeness. In spite of the fact that this robust concept is still fundamental for judging the hardness of computational problems, a variety of approaches has been developed for solving instances of NP-hard problems in many applications. Our short, 40-year attempt to fix the fuzzy border between the practically solvable problems and the practically unsolvable ones is partly reminiscent of the never-ending search for the definition of "life" in biology or for the definitions of matter and energy in physics. Can the search for the formal notion of "practical solvability" also become a never-ending story, or is there hope for a well-accepted, robust definition of it? Hopefully, it is not surprising that we are not able to answer this question in this invited talk. But dealing with this question is of crucial importance, because only through enormous effort do scientists get a better and better feeling for what the fundamental notions of science, like life and energy, mean. In the flow of numerous technical results, we must not forget the fact that most of the essential revolutionary contributions to science were made by defining new concepts and notions.
The Fixed-Point Theory of Strictly Causal Functions
2013-06-09
functions were defined to be the functions that are strictly contracting with respect to the Cantor metric (also called the Baire distance) on signals...
ERIC Educational Resources Information Center
Simonson, Michael R., Ed.; Frey, Diane, Ed.
1989-01-01
The 46 papers in this volume represent some of the most current thinking in educational communications and technology. Individual papers address the following topics: gender differences in the selection of elective computer science courses and in the selection of non-traditional careers; instruction for individuals with different cognitive styles;…
Negotiating the Traffic: Can Cognitive Science Help Make Autonomous Vehicles a Reality?
Chater, Nick; Misyak, Jennifer; Watson, Derrick; Griffiths, Nathan; Mouzakitis, Alex
2018-02-01
To drive safely among human drivers, cyclists and pedestrians, autonomous vehicles will need to mimic, or ideally improve upon, humanlike driving. Yet, driving presents us with difficult problems of joint action: 'negotiating' with other users over shared road space. We argue that autonomous driving provides a test case for computational theories of social interaction, with fundamental implications for the development of autonomous vehicles. Copyright © 2017 Elsevier Ltd. All rights reserved.
An Online Algorithm for Maximizing Submodular Functions
2007-12-20
dynamics of the social network are known. In theory, our online algorithms could be used to adapt a marketing campaign to unknown or time-varying social... (Matthew Streeter and Daniel Golovin, CMU-CS-07-171, School of Computer Science, 20 December 2007.)
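For context, the classical offline greedy algorithm for monotone submodular maximization under a cardinality constraint is the baseline such online algorithms are usually compared against; a sketch follows. The coverage objective and the small set system are made-up examples, not data from the report.

```python
def greedy_submodular(ground_set, objective, k):
    """Classical greedy for maximizing a monotone submodular set function under
    a cardinality constraint; achieves a (1 - 1/e) approximation guarantee."""
    chosen = []
    for _ in range(k):
        best = max((e for e in ground_set if e not in chosen),
                   key=lambda e: objective(chosen + [e]) - objective(chosen))
        chosen.append(best)
    return chosen

# toy submodular objective: coverage of a small universe by sets
sets = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {4, 5, 6, 7},
    "d": {1, 7},
}

def coverage(selection):
    covered = set()
    for s in selection:
        covered |= sets[s]
    return len(covered)

print(greedy_submodular(list(sets), coverage, k=2))  # e.g. ['c', 'a']
```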
Connecting Theory and Applications Across Complex Systems
2004-01-01
applications in biology and computer science. Fernando Paganini received his Ingeniero Electricista and Licenciado en Matematica degrees from the... ...including the Internet and forest ecology. Hana El-Samad is a PhD candidate at the Mechanical Engineering department of the University of California at
Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling
NASA Astrophysics Data System (ADS)
Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.
2014-12-01
Microtomography provides detailed 3D internal structures of rocks at micrometer to tens-of-nanometer resolution and is quickly turning into a new technology for studying petrophysical properties of materials. An important step is the upscaling of these properties, since imaging at micron or sub-micron resolution is only possible for samples at the scale of a millimeter or less. We present here a recently developed computational workflow for the analysis of microstructures, including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at the micro- to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples, biological and food science materials. We have also applied the technique to high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big data problem of analyzing petrophysical properties and their subsequent upscaling. We discuss the following challenges: 1) Characterization of microtomography for extremely large data sets - our current capability. 2) Computational fluid dynamics simulations at pore-scale for permeability estimation - methods, computing cost and accuracy. 3) Solid mechanical computations at pore-scale for estimating elasto-plastic properties - computational stability, cost, and efficiency. 4) Extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big data tool for multi-scale scientific and engineering problems.
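As a schematic of the kind of real-space coarse-graining step used in percolation-based upscaling, the sketch below repeatedly replaces 2x2 blocks of a binary pore/solid image by a single coarse cell. The 2x2 blocking, the majority and vertical-spanning rules, and the random test image are simplifications for illustration; they are not the renormalization procedure developed in the workflow described above.

```python
import numpy as np

def renormalize(pore, rule="majority"):
    """One real-space renormalization step on a 2D binary pore (1) / solid (0)
    image: each 2x2 block is replaced by a single coarse cell."""
    h, w = (s - s % 2 for s in pore.shape)
    blocks = pore[:h, :w].reshape(h // 2, 2, w // 2, 2).swapaxes(1, 2)
    if rule == "majority":                 # pore if at least 2 of the 4 cells are pore
        return (blocks.sum(axis=(2, 3)) >= 2).astype(np.uint8)
    # 'spanning' rule: pore if some column of the block is all pore (vertical connectivity)
    return blocks.all(axis=2).any(axis=2).astype(np.uint8)

rng = np.random.default_rng(0)
img = (rng.random((256, 256)) < 0.55).astype(np.uint8)   # porosity ~ 0.55
for step in range(4):
    print(f"step {step}: porosity = {img.mean():.3f}, shape = {img.shape}")
    img = renormalize(img)
```

Tracking how a property (here simply porosity) flows under repeated coarse-graining is the basic percolation-renormalization idea; the actual workflow applies this logic to transport and mechanical properties.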
The Structure of Medical Informatics Journal Literature
Morris, Theodore A.; McCain, Katherine W.
1998-01-01
Abstract Objective: Medical informatics is an emergent interdisciplinary field described as drawing upon and contributing to both the health sciences and information sciences. The authors elucidate the disciplinary nature and internal structure of the field. Design: To better understand the field's disciplinary nature, the authors examine the intercitation relationships of its journal literature. To determine its internal structure, they examined its journal cocitation patterns. Measurements: The authors used data from the Science Citation Index (SCI) and Social Science Citation Index (SSCI) to perform intercitation studies among productive journal titles, and software routines from SPSS to perform multivariate data analyses on cocitation data for proposed core journals. Results: Intercitation network analysis suggests that a core literature exists, one mark of a separate discipline. Multivariate analyses of cocitation data suggest that major focus areas within the field include biomedical engineering, biomedical computing, decision support, and education. The interpretable dimensions of multidimensional scaling maps differed for the SCI and SSCI data sets. Strong links to information science literature were not found. Conclusion: The authors saw indications of a core literature and of several major research fronts. The field appears to be viewed differently by authors writing in journals indexed by SCI from those writing in journals indexed by SSCI, with more emphasis placed on computers and engineering versus decision making by the former and more emphasis on theory versus application (clinical practice) by the latter. PMID:9760393
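A minimal sketch of the journal cocitation analysis described here: a symmetric cocitation count matrix is converted to dissimilarities and embedded in two dimensions with multidimensional scaling. The journal names and counts below are invented for illustration; the original study used SCI/SSCI data and additional multivariate analyses.

```python
import numpy as np
from sklearn.manifold import MDS

journals = ["J Am Med Inform Assoc", "Methods Inf Med", "IEEE Trans Biomed Eng",
            "Med Decis Making", "Acad Med"]
# hypothetical symmetric cocitation counts (diagonal unused)
cocite = np.array([
    [ 0, 40, 12, 25, 10],
    [40,  0, 15, 20,  8],
    [12, 15,  0,  5,  2],
    [25, 20,  5,  0,  6],
    [10,  8,  2,  6,  0],
], dtype=float)

# convert similarity counts to dissimilarities in [0, 1]
dissim = 1.0 - cocite / cocite.max()
np.fill_diagonal(dissim, 0.0)

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)
for name, (x, y) in zip(journals, coords):
    print(f"{name:24s} {x:6.2f} {y:6.2f}")
```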
The Scientific Theory Profile: A Philosophy of Science Model for Science Teachers.
ERIC Educational Resources Information Center
Loving, Cathleen
The model developed for use with science teachers--called the Scientific Theory Profile--consists of placing three well-known philosophers of science on a grid, with the x-axis being their methods for judging theories (rational vs. natural) and the y-axis being their views on scientific theories representing the Truth versus mere models of what…
NASA Astrophysics Data System (ADS)
Gafurov, O.; Gafurov, D.; Syryamkin, V.
2018-05-01
The paper analyses a field of computer science formed at the intersection of such areas of natural science as artificial intelligence, mathematical statistics, and database theory, which is referred to as "Data Mining" (knowledge discovery in data). The theory of neural networks is applied along with classical methods of mathematical analysis and numerical simulation. The paper describes the technique, protected by a patent of the Russian Federation for the invention "A Method for Determining Location of Production Wells during the Development of Hydrocarbon Fields" [1–3] and implemented in the geoinformation system NeuroInformGeo, which has no analogues in domestic or international practice. The paper gives an example comparing the forecast of oil reservoir quality made by a geophysicist interpreter using standard methods with the forecast made using this technology. The technical result is increased efficiency, effectiveness, and ecological compatibility in the development of mineral deposits, and the discovery of a new oil deposit.
Optimized Materials From First Principles Simulations: Are We There Yet?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galli, G; Gygi, F
2005-07-26
In the past thirty years, the use of scientific computing has become pervasive in all disciplines: collection and interpretation of most experimental data is carried out using computers, and physical models in computable form, with various degrees of complexity and sophistication, are utilized in all fields of science. However, full prediction of physical and chemical phenomena based on the basic laws of Nature, using computer simulations, is a revolution still in the making, and it involves some formidable theoretical and computational challenges. We illustrate the progress and successes obtained in recent years in predicting fundamental properties of materials in condensed phases and at the nanoscale, using ab-initio, quantum simulations. We also discuss open issues related to the validation of the approximate, first principles theories used in large scale simulations, and the resulting complex interplay between computation and experiment. Finally, we describe some applications, with focus on nanostructures and liquids, both at ambient and under extreme conditions.
Unification of field theory and maximum entropy methods for learning probability densities
NASA Astrophysics Data System (ADS)
Kinney, Justin B.
2015-09-01
The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
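To make the maximum entropy side of this comparison concrete, the sketch below fits a density of the form p(x) proportional to exp(sum_k lambda_k x^k) on a grid by matching the first two sample moments via gradient ascent on the dual. The grid, learning rate, and iteration count are arbitrary choices, and the sketch does not implement the Bayesian field theory side of the unification.

```python
import numpy as np

def maxent_density(samples, grid, n_moments=2, lr=0.01, n_iter=50_000):
    """Maximum entropy density on a grid subject to matching the first
    `n_moments` sample moments: p(x) ~ exp(sum_k lambda_k x^k), with the
    lambdas fitted by gradient ascent on the dual (i.e. moment matching)."""
    feats = np.vstack([grid ** k for k in range(1, n_moments + 1)])   # (m, G)
    target = np.array([np.mean(samples ** k) for k in range(1, n_moments + 1)])
    dx = grid[1] - grid[0]
    lam = np.zeros(n_moments)
    for _ in range(n_iter):
        logp = lam @ feats
        logp -= logp.max()
        p = np.exp(logp)
        p /= p.sum() * dx                        # normalized density on the grid
        model_moments = feats @ p * dx
        lam += lr * (target - model_moments)     # ascend the dual / match moments
    return p

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=0.7, size=5000)
grid = np.linspace(-3, 5, 400)
p = maxent_density(data, grid, n_moments=2)
# with two moment constraints the MaxEnt solution is the moment-matched Gaussian
print(grid[np.argmax(p)])    # should be near the sample mean, ~1.0
```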
Unification of field theory and maximum entropy methods for learning probability densities.
Kinney, Justin B
2015-09-01
The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.
Johnson, Shane D; Groff, Elizabeth R
2014-07-01
The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs, though not without its own issues, may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification.
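As a toy example of the agent-based models discussed here, the sketch below implements a minimal routine-activity-style simulation: offenders, targets, and guardians random-walk on a grid, and an offence is recorded when an offender meets a target with no guardian present. All rules and parameters are invented for illustration and are not drawn from any study reviewed in the article.

```python
import random

def run_abm(n_offenders=20, n_targets=50, n_guardians=30, size=25, steps=500, seed=42):
    """Toy routine-activity ABM: a crime is recorded whenever an offender
    shares a cell with a target and no guardian occupies that cell."""
    rng = random.Random(seed)
    place = lambda n: [[rng.randrange(size), rng.randrange(size)] for _ in range(n)]
    offenders, targets, guardians = place(n_offenders), place(n_targets), place(n_guardians)
    crimes = 0
    for _ in range(steps):
        for group in (offenders, targets, guardians):
            for agent in group:                  # random walk with reflecting edges
                axis = rng.randrange(2)
                agent[axis] = min(size - 1, max(0, agent[axis] + rng.choice((-1, 1))))
        guarded = {tuple(g) for g in guardians}
        target_cells = {tuple(t) for t in targets}
        for o in offenders:
            cell = tuple(o)
            if cell in target_cells and cell not in guarded:
                crimes += 1
    return crimes

# more guardianship should suppress offending, a basic routine-activity prediction
print(run_abm(n_guardians=5), run_abm(n_guardians=200))
```

Running such a model many times and comparing aggregate crime patterns against a theory's predictions is the kind of systematic theory testing the article advocates.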
Non-Kolmogorovian Approach to the Context-Dependent Systems Breaking the Classical Probability Law
NASA Astrophysics Data System (ADS)
Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Yamato, Ichiro
2013-07-01
There exist several phenomena breaking the classical probability laws. The systems related to such phenomena are context-dependent, so that they are adaptive to other systems. In this paper, we present a new mathematical formalism to compute the joint probability distribution for two event-systems by using concepts of the adaptive dynamics and quantum information theory, e.g., quantum channels and liftings. In physics the basic example of the context-dependent phenomena is the famous double-slit experiment. Recently similar examples have been found in biological and psychological sciences. Our approach is an extension of traditional quantum probability theory, and it is general enough to describe aforementioned contextual phenomena outside of quantum physics.
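The classical law being broken here is the law of total probability. A tiny numerical illustration, using standard textbook two-slit amplitudes rather than the liftings formalism of the paper: with both slits open the detection probability acquires an interference term, so it differs from the classical sum over the two slit alternatives.

```python
# Illustration of the broken classical law in the double-slit setting:
# amplitudes add, so the quantum probability picks up an interference term
# (2*Re[psi1* psi2]) that the classical total-probability sum does not have.
import numpy as np

phase = np.linspace(0, 2 * np.pi, 5)          # relative phase at the detector
psi1 = np.exp(1j * 0.0) / np.sqrt(2)           # amplitude through slit 1
psi2 = np.exp(1j * phase) / np.sqrt(2)         # amplitude through slit 2

p_quantum = np.abs(psi1 + psi2) ** 2           # both slits open
p_classical = np.abs(psi1) ** 2 + np.abs(psi2) ** 2   # sum of per-slit terms

print(np.round(p_quantum, 3))     # varies with phase: interference
print(np.round(p_classical, 3))   # constant: no interference term
```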
How Singapore Junior College Science Teachers Address Curriculum Reforms: A Theory
ERIC Educational Resources Information Center
Lim, Patrick; Pyvis, David
2012-01-01
Using grounded theory research methodology, a theory was developed to explain how Singapore junior college science teachers implement educational reforms underpinning the key initiatives of the "Thinking Schools, Learning Nation" policy. The theory suggests Singapore junior college science teachers "deal with" implementing…
Statistical physics of hard combinatorial optimization: Vertex cover problem
NASA Astrophysics Data System (ADS)
Zhao, Jin-Hua; Zhou, Hai-Jun
2014-07-01
Typical-case computational complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computational complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem of wide application, as an example to introduce the statistical physics methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meanings of the message-passing equations. An unfamiliar reader should be able to understand, to a large extent, the physics behind the mean field approaches and to adapt the mean field methods to other optimization problems.
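For readers who want a concrete baseline before the message-passing machinery, the sketch below shows the standard greedy 2-approximation for vertex cover. This is a textbook algorithm, not the mean field method of the paper: repeatedly pick an uncovered edge and add both of its endpoints to the cover.

```python
# Textbook greedy 2-approximation for minimum vertex cover, included as a
# concrete baseline; it is not the message-passing method described above.
def greedy_vertex_cover(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))      # take both endpoints of an uncovered edge
    return cover

# Small example graph given as an edge list.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
print(greedy_vertex_cover(edges))     # a cover at most twice the optimal size
```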
Women's decision to major in STEM fields
NASA Astrophysics Data System (ADS)
Conklin, Stephanie
This paper explores the lived experiences of high school female students who choose to enter into STEM fields, and describes the influencing factors which steered these women towards majors in computer science, engineering and biology. Utilizing phenomenological methodology, this study seeks to understand the essence of women's decisions to enter into STEM fields and further describe how the decision-making process varies for women in high female enrollment fields, like biology, as compared with low enrollment fields, like computer science and engineering. Using Bloom's 3-Stage Theory, this study analyzes how relationships, experiences and barriers influenced women towards, and possibly away from, STEM fields. An analysis of women's experiences highlights that support of family, sustained experience in a STEM program during high school, and the presence of an influential teacher were all salient factors in steering women towards STEM fields. Participants explained that the influential teacher worked individually with them, modified and extended assignments, and steered them towards coursework and experiences. This study also identifies factors, like guidance counselors as well as personal challenges, which inhibited participants' paths to STEM fields. Further, through analyzing all six participants' experiences, it is clear that a linear model like Bloom's 3-Stage Model, with its limited ability to include potential barriers, could not capture the essence of each participant's decision-making process. Therefore, a revised, non-linear model that allows for emerging factors, like personal challenges, has been proposed; this model focuses on how interest in STEM fields begins to develop and is then honed and mastered. This study also sought to identify key differences in the paths of female students pursuing different majors. The findings of this study suggest that the path to computer science and engineering is limited. Computer science majors faced few, if any, challenges, hoped to use computers as a tool to innovate, and also participated in the same computer science program. For female engineering students, the essence of their experience focused on interaction at a young age with an expert in an engineering-related field as well as a strong desire to help solve world problems using engineering. These participants were able to clearly articulate their future careers. In contrast, biology majors faced more challenges and were undecided about their future career goals. These results suggest that a longitudinal study focused on women pursuing engineering and computer science fields is warranted; this would allow these findings to be substantiated and the revised theoretical model to be refined.
Finite Dimensional Approximations for Continuum Multiscale Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berlyand, Leonid
2017-01-24
The completed research project concerns the development of novel computational techniques for modeling nonlinear multiscale physical and biological phenomena. Specifically, it addresses the theoretical development and applications of the homogenization theory (coarse graining) approach to calculation of the effective properties of highly heterogeneous biological and bio-inspired materials with many spatial scales and nonlinear behavior. This theory studies properties of strongly heterogeneous media in problems arising in materials science, geoscience, biology, etc. Modeling of such media raises fundamental mathematical questions, primarily in partial differential equations (PDEs) and calculus of variations, the subject of the PI's research. The focus of the completed research was on mathematical models of biological and bio-inspired materials with the common theme of multiscale analysis and coarse grain computational techniques. Biological and bio-inspired materials offer the unique ability to create environmentally clean functional materials used for energy conversion and storage. These materials are intrinsically complex, with hierarchical organization occurring on many nested length and time scales. The potential to rationally design and tailor the properties of these materials for broad energy applications has been hampered by the lack of computational techniques which are able to bridge from the molecular to the macroscopic scale. The project addressed the challenge of computational treatments of such complex materials by the development of a synergistic approach that combines innovative multiscale modeling/analysis techniques with high performance computing.
Chemistry in the News: 1998 Nobel Prizes in Chemistry and Medicine
NASA Astrophysics Data System (ADS)
Miller, Jennifer B.
1999-01-01
The Royal Swedish Academy of Sciences has awarded the 1998 Nobel Prize in Chemistry to Walter Kohn (University of California at Santa Barbara) for his development of the density-functional theory and to John A. Pople (Northwestern University at Evanston, Illinois) for his development of computational methods in quantum chemistry. The Nobel Assembly at the Karolinska Institute has awarded the 1998 Nobel Prize in Physiology or Medicine jointly to Robert F. Furchgott (State University of New York Health Science Center at Brooklyn), Louis J. Ignarro (University of California at Los Angeles), and Ferid Murad (University of Texas Medical School at Houston) for identifying nitric oxide as a key biological signaling molecule in the cardiovascular system.
Cajal and consciousness. Introduction.
Marijuán, P C
2001-04-01
One hundred years after Santiago Ramón Cajal established the bases of modern neuroscience in his masterpiece Textura del sistema nervioso del hombre y de los vertebrados, the question is stated again: What is the status of consciousness today? The responses in this book, by contemporary leading figures of neuroscience, evolution, molecular biology, computer science, and quantum physics, collectively compose a fascinating conceptual landscape. Both the evolutionary emergence of consciousness and its development towards the highest level may be analyzed by a wealth of new theories and hypotheses, including Cajal's prescient ones. Some noticeable gaps remain, however. Celebrating the centennial of Textura is a timely occasion to reassess how close--and how far--our system of the sciences is to explaining consciousness.
An immersed boundary method for modeling a dirty geometry data
NASA Astrophysics Data System (ADS)
Onishi, Keiji; Tsubokura, Makoto
2017-11-01
We present a robust, fast, and low-preparation-cost immersed boundary method (IBM) for simulating incompressible high-Reynolds-number flow around highly complex geometries. The method is achieved by dispersing the momentum through an axial linear projection and by an approximate-domain assumption that satisfies mass conservation around wall-containing cells. This methodology has been verified against analytical theory and wind tunnel experiment data. Next, we simulate the problem of flow around a rotating object and demonstrate the applicability of this methodology to moving-geometry problems. This methodology offers a route to obtaining quick solutions on next-generation large-scale supercomputers. This research was supported by MEXT as ``Priority Issue on Post-K computer'' (Development of innovative design and production processes) and used computational resources of the K computer provided by the RIKEN Advanced Institute for Computational Science.
Computational predictions of the new Gallium nitride nanoporous structures
NASA Astrophysics Data System (ADS)
Lien, Le Thi Hong; Tuoc, Vu Ngoc; Duong, Do Thi; Thu Huyen, Nguyen
2018-05-01
Nanoporous structure prediction is an emerging area of research because of the advantages such materials offer for a wide range of materials science and technology applications in opto-electronics, environment, sensors, shape-selective and bio-catalysis, to name just a few. We propose a computationally and technically feasible approach for predicting Gallium nitride nanoporous structures with hollows at the nano scale. The designed porous structures are studied with computations using the density functional tight binding (DFTB) and conventional density functional theory methods, revealing a variety of promising mechanical and electronic properties, which can potentially find future realistic applications. Their stability is discussed by means of the free energy computed within the lattice-dynamics approach. Our calculations also indicate that all the reported hollow structures are wide band gap semiconductors in the same fashion as their parent bulk stable phase. The electronic band structures of these nanoporous structures are finally examined in detail.
Factors influencing exemplary science teachers' levels of computer use
NASA Astrophysics Data System (ADS)
Hakverdi, Meral
This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability at using computers. The teachers' use of computer-related applications/tools during class and their personal self-efficacy, age, and gender were highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and their gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science class.
Earth Sciences Push Radiative Transfer Theory
NASA Astrophysics Data System (ADS)
Davis, Anthony; Mishchenko, Michael
2009-12-01
2009 International Conference on Advances in Mathematics, Computational Methods, and Reactor Physics; Saratoga Springs, New York, 4-7 May 2009; The theories of radiative transfer and particle—particularly neutron—transport are grounded in distinctive microscale physics that deals with either optics or particle dynamics. However, it is not practical to track every wave or particle in macroscopic systems, nor do all of these details matter. That is why Newton's laws, which describe individual particles, are replaced by those of Euler, Navier-Stokes, Maxwell, Boltzmann, Gibbs, and others, which describe the collective behavior of vast numbers of particles. And that is why the radiative transfer (RT) equation is used to describe the flow of radiation through geophysical-scale systems, leaving to Maxwell's wave equations only the task of providing the optical properties of the medium, be it air, water, snow, ice, or biomass. Interestingly, particle transport is determined by the linear transport equation, which is mathematically identical to the RT equation, so geophysicists and nuclear scientists are interested in the same mathematics and computational techniques.
The scientific theory profile: A philosophy of science model for science teachers
NASA Astrophysics Data System (ADS)
Loving, Cathleen C.
A model called the Scientific Theory Profile was developed for use with preservice and inservice science teachers or with graduate students interested in the various ways scientific theories are perceived. Early indications - from a survey of institutions with science education programs and a survey of current science methods texts - are that too little emphasis is placed on what contemporary writings reveal about the nature and importance of scientific theories. This prompted the development of the Profile. The Profile consists of a grid, with the x-axis representing methods for judging theories (rational vs. natural), and the y-axis representing views on reigning scientific theories as being the Truth versus models of what works best (realism vs. anti-realism). Three well-known philosophers of science who were selected for detailed analysis and who form the keystone positions on the Profile are Thomas Kuhn, Carl Hempel, and Sir Karl Popper. The hypothesis was that an analysis of the writings of respected individuals in philosophy and history of science who have different perspectives on theories (as well as overarching areas of agreement) could be translated into relative coordinates on a graph; and that this visual model might be helpful to science teachers in developing a balanced philosophy of science and a deeper understanding of the power of reigning theories. Nine other contemporary philosophers, all influenced by the three originals, are included in brief analyses, with their positions on the grid being relative to the keystones. The Scientific Theory Profile then forms the basis for a course, now in the planning stages, in perspectives on the nature of science, primarily for science teachers, with some objectives and activities suggested.
The Relationship of Mentoring on Middle School Girls' Science-Related Attitudes
ERIC Educational Resources Information Center
Clark, Lynette M.
2013-01-01
This quantitative study examined the science-related attitudes of middle school girls who attended a science-focused mentoring program and those of middle school girls who attended a traditional mentoring program. Theories related to this study include social cognitive theory, cognitive development theory, and possible selves' theory. These…
Pareto Joint Inversion of Love and Quasi Rayleigh's waves - synthetic study
NASA Astrophysics Data System (ADS)
Bogacz, Adrian; Dalton, David; Danek, Tomasz; Miernik, Katarzyna; Slawinski, Michael A.
2017-04-01
In this contribution a specific application of Pareto joint inversion to a geophysical problem is presented. The Pareto criterion combined with Particle Swarm Optimization was used to solve geophysical inverse problems for Love and quasi-Rayleigh waves. The basic theory of the forward problem calculation for the chosen surface waves is described. To avoid computational problems, some simplifications were made. This allowed faster and more straightforward calculation without loss of generality of the solution. According to the restrictions of the solving scheme, the considered model must have exactly two layers: an elastic isotropic surface layer and an elastic isotropic half space of infinite thickness. The aim of the inversion is to obtain the elastic parameters and model geometry using dispersion data. Different cases were considered in the calculations, such as different numbers of modes for the different wave types and different frequencies. The developed solutions use the OpenMP standard for parallel computing, which helps to reduce computation times. The results of experimental computations are presented and commented on. This research was performed in the context of The Geomechanics Project supported by Husky Energy. Also, this research was partially supported by the Natural Sciences and Engineering Research Council of Canada, grant 238416-2013, and by the Polish National Science Center under contract No. DEC-2013/11/B/ST10/0472.
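The essential step of a Pareto joint inversion is deciding which candidate models are non-dominated with respect to the two misfits (here, Love and quasi-Rayleigh dispersion misfits). The sketch below extracts a Pareto front from a set of candidate two-parameter models scored by two objectives; the misfit functions are placeholders, and the Particle Swarm Optimization search itself is replaced by random candidates for brevity.

```python
# Sketch of the Pareto-selection step of a two-objective joint inversion.
# The misfit functions are placeholders; the PSO search used in the study
# is omitted and random candidate models stand in for the swarm.
import numpy as np

rng = np.random.default_rng(2)
candidates = rng.uniform([1500, 100], [3500, 2000], size=(200, 2))  # (Vs, layer thickness)

def misfit_love(m):          # placeholder Love-wave dispersion misfit
    return (m[0] - 2500) ** 2 / 1e6 + 0.1 * abs(m[1] - 800) / 800

def misfit_rayleigh(m):      # placeholder quasi-Rayleigh dispersion misfit
    return (m[0] - 2300) ** 2 / 1e6 + 0.2 * abs(m[1] - 600) / 600

scores = np.array([[misfit_love(m), misfit_rayleigh(m)] for m in candidates])

def pareto_front(scores):
    """Indices of non-dominated points (minimization in both objectives)."""
    keep = []
    for i, s in enumerate(scores):
        dominated = np.any(np.all(scores <= s, axis=1) & np.any(scores < s, axis=1))
        if not dominated:
            keep.append(i)
    return keep

front = pareto_front(scores)
print(len(front), "non-dominated models out of", len(candidates))
```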
Pseudorandom Number Generators for Mobile Devices: An Examination and Attempt to Improve Randomness
2013-09-01
Georges Lemaître: The Priest Who Invented the Big Bang
NASA Astrophysics Data System (ADS)
Lambert, Dominique
This contribution gives a concise survey of Georges Lemaître's work and life, shedding some light on less-known aspects. Lemaître was a Belgian Catholic priest who in 1927 gave the first explanation of the Hubble law and who proposed in 1931 the "Primeval Atom Hypothesis", considered the first step towards Big Bang cosmology. But the scientific work of Lemaître goes far beyond Physical Cosmology. Indeed, he also contributed to the theory of Cosmic Rays, to Spinor theory, to Analytical mechanics (regularization of the 3-body problem), to Numerical Analysis (Fast Fourier Transform), and to Computer Science (he introduced and programmed the first computer of Louvain),… Lemaître took part in the "Science and Faith" debate. He defended a position that has some analogy with the NOMA principle, making a sharp distinction between what he called the "two paths to Truth" (a scientific one and a theological one). In particular, he never confused the theological concept of "creation" with the scientific notion of a "natural beginning" (initial singularity). Lemaître was deeply rooted in his faith and sacerdotal vocation. Remaining a secular priest, he belonged to a community of priests called "The Friends of Jesus", characterized by a deep spirituality and special vows (for example the vow of poverty). He also had an apostolic activity amongst Chinese students.
ERIC Educational Resources Information Center
Kolokouri, Eleni; Plakitsi, Katerina
2012-01-01
This study uses history of science in teaching natural sciences from the early grades. The theoretical framework used is Cultural Historical Activity Theory (CHAT), which is a theory with expanding applications in different fields of science. The didactical scenario, in which history of science is used in a CHAT context, refers to Newton's…
Allen Newell's Program of Research: The Video-Game Test.
Gobet, Fernand
2017-04-01
Newell (1973) argued that progress in psychology was slow because research focused on experiments trying to answer binary questions, such as serial versus parallel processing. In addition, not enough attention was paid to the strategies used by participants, and there was a lack of theories implemented as computer models offering sufficient precision for being tested rigorously. He proposed a three-headed research program: to develop computational models able to carry out the task they aimed to explain; to study one complex task in detail, such as chess; and to build computational models that can account for multiple tasks. This article assesses the extent to which the papers in this issue advance Newell's program. While half of the papers devote much attention to strategies, several papers still average across them, a capital sin according to Newell. The three courses of action he proposed were not popular in these papers: Only two papers used computational models, with no model being both able to carry out the task and to account for human data; there was no systematic analysis of a specific video game; and no paper proposed a computational model accounting for human data in several tasks. It is concluded that, while they use sophisticated methods of analysis and discuss interesting results, overall these papers contribute only little to Newell's program of research. In this respect, they reflect the current state of psychology and cognitive science. This is a shame, as Newell's ideas might help address the current crisis of lack of replication and fraud in psychology. Copyright © 2017 The Author. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
NASA Astrophysics Data System (ADS)
Lee, Myeong H.; Dunietz, Barry D.; Geva, Eitan
2014-03-01
Classical Marcus theory is commonly adopted for solvent-mediated charge transfer (CT) processes to obtain the CT rate constant, but it can become questionable when the intramolecular vibrational modes dominate the CT process, as in organic photovoltaic (OPV) devices, because Marcus theory treats these modes classically and therefore nuclear tunneling is not accounted for. We present a computational scheme to obtain the electron transfer rate constant beyond classical Marcus theory. Within this approach, the nuclear vibrational modes are treated quantum-mechanically and a short-time approximation is avoided. Ab initio calculations are used to obtain the basic parameters needed for calculating the electron transfer rate constant. We apply our methodology to a phthalocyanine (H2PC)-C60 organic photovoltaic system where one C60 acceptor and one or two H2PC donors are included to model the donor-acceptor interface configuration. We obtain the electron transfer and recombination rate constants for all accessible charge transfer (CT) states, from which the CT exciton dynamics is determined by employing a master equation. The role of higher-lying excited states in CT exciton dynamics is discussed. This work is pursued as part of the Center for Solar and Thermal Energy Conversion, an Energy Frontier Research Center funded by the US Department of Energy Office of Science, Office of Basic Energy Sciences under Award No. DE-SC0000957.
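For reference, the classical Marcus rate that this work goes beyond has a simple closed form, sketched below for illustrative parameter values. The electronic coupling, reorganization energy, and driving force used here are placeholders, not values from the H2PC-C60 calculations.

```python
# Classical Marcus electron-transfer rate, shown only as the baseline that the
# quantum-mechanical treatment above improves upon; parameters are placeholders.
import numpy as np

HBAR = 6.582119569e-16   # eV*s
KB = 8.617333262e-5      # eV/K

def marcus_rate(coupling_eV, lambda_eV, dG_eV, T=300.0):
    """k_ET = (2*pi/hbar)|H_DA|^2 (4*pi*lambda*kB*T)^(-1/2) exp(-(dG+lambda)^2/(4*lambda*kB*T))."""
    prefactor = (2 * np.pi / HBAR) * coupling_eV ** 2
    gaussian = np.exp(-(dG_eV + lambda_eV) ** 2 / (4 * lambda_eV * KB * T))
    return prefactor * gaussian / np.sqrt(4 * np.pi * lambda_eV * KB * T)

# Placeholder donor-acceptor parameters (eV); not taken from the paper.
print(f"k_ET ~ {marcus_rate(0.01, 0.3, -0.4):.3e} s^-1")
```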
NASA Astrophysics Data System (ADS)
Barrow, John D.; Davies, Paul C. W.; Harper, Charles L., Jr.
2004-06-01
This preview of the future of physics comprises contributions from recognized authorities inspired by the pioneering work of John Wheeler. Quantum theory represents a unifying theme within the book, as it relates to the topics of the nature of physical reality, cosmic inflation, the arrow of time, models of the universe, superstrings, quantum gravity and cosmology. Attempts to formulate a final unification theory of physics are also considered, along with the existence of hidden dimensions of space, hidden cosmic matter, and the strange world of quantum technology. John Archibald Wheeler is one of the most influential scientists of the twentieth century. His extraordinary career has spanned momentous advances in physics, from the birth of the nuclear age to the conception of the quantum computer. Famous for coining the term "black hole," Professor Wheeler helped lay the foundations for the rebirth of gravitation as a mainstream branch of science, triggering the explosive growth in astrophysics and cosmology that followed. His early contributions to physics include the S matrix, the theory of nuclear rotation (with Edward Teller), the theory of nuclear fission (with Niels Bohr), action-at-a-distance electrodynamics (with Richard Feynman), positrons as backward-in-time electrons, the universal Fermi interaction (with Jayme Tiomno), muonic atoms, and the collective model of the nucleus. His inimitable style of thinking, quirky wit, and love of the bizarre have inspired generations of physicists.
The birth and evolution of surface science: child of the union of science and technology.
Duke, C B
2003-04-01
This article is an account of the birth and evolution of surface science as an interdisciplinary research area. Surface science emanated from the confluence of concepts and tools in physics and chemistry with technological innovations that made it possible to determine the structure and properties of surfaces and interfaces and the dynamics of chemical reactions at surfaces. The combination in the 1960s and 1970s of ultra-high-vacuum (i.e., P < 10^-7 Pascal or 10^-9 Torr) technology with the recognition that electrons in the energy range from 50 to 500 eV exhibited inelastic collision mean free paths of the order of a few angstroms fostered an explosion of activity. The results were a reformulation of the theory of electron solid scattering, the nearly universal use of electron spectroscopies for surface characterization, the rise of surface science as an independent interdisciplinary research area, and the emergence of the American Vacuum Society (AVS) as a major international scientific society. The rise of microelectronics in the 1970s and 1980s resulted in huge increases in computational power. These increases enabled more complex experiments and the utilization of density functional theory for the quantitative prediction of surface structure and dynamics. Development of scanning-probe microscopies in the 1990s led to atomic-resolution images of macroscopic surfaces and interfaces as well as videos of atoms moving about on surfaces during growth and diffusion. Scanning probes have since brought solid-liquid interfaces into the realm of atomic-level surface science, expanding its scope to more complex systems, including fragile biological materials and processes.
Grounded understanding of abstract concepts: The case of STEM learning.
Hayes, Justin C; Kraemer, David J M
2017-01-01
Characterizing the neural implementation of abstract conceptual representations has long been a contentious topic in cognitive science. At the heart of the debate is whether the "sensorimotor" machinery of the brain plays a central role in representing concepts, or whether the involvement of these perceptual and motor regions is merely peripheral or epiphenomenal. The domain of science, technology, engineering, and mathematics (STEM) learning provides an important proving ground for sensorimotor (or grounded) theories of cognition, as concepts in science and engineering courses are often taught through laboratory-based and other hands-on methodologies. In this review of the literature, we examine evidence suggesting that sensorimotor processes strengthen learning associated with the abstract concepts central to STEM pedagogy. After considering how contemporary theories have defined abstraction in the context of semantic knowledge, we propose our own explanation for how body-centered information, as computed in sensorimotor brain regions and visuomotor association cortex, can form a useful foundation upon which to build an understanding of abstract scientific concepts, such as mechanical force. Drawing from theories in cognitive neuroscience, we then explore models elucidating the neural mechanisms involved in grounding intangible concepts, including Hebbian learning, predictive coding, and neuronal recycling. Empirical data on STEM learning through hands-on instruction are considered in light of these neural models. We conclude the review by proposing three distinct ways in which the field of cognitive neuroscience can contribute to STEM learning by bolstering our understanding of how the brain instantiates abstract concepts in an embodied fashion.
NASA Astrophysics Data System (ADS)
Farhangi, Sanaz
2012-12-01
This paper presents a review of Jane McGonigal's book, "Reality is broken" (Reality is broken: why games make us better and how they can change the world. Penguin Press, New York, 2011). As the book subtitle suggests it is a book about "why games make us better and how they can change the world", written by a specialist in computer game design. I will try to show the relevance this book might have to science educators through emphasizing the points that the author offers as the fixes to rebuild reality on the image of gaming world. Using cultural-historical activity theory, I will explore how taking up a gamer mindset can challenge one to consider shortcomings in current approaches to the activity of teaching-learning science and how using this mindset can open our minds to think of new ways of engaging in the activity of doing science. I hope this review will encourage educators to explore the worldview presented in the book and use it to transform our thinking about science education.
Scalable real space pseudopotential density functional codes for materials in the exascale regime
NASA Astrophysics Data System (ADS)
Lena, Charles; Chelikowsky, James; Schofield, Grady; Biller, Ariel; Kronik, Leeor; Saad, Yousef; Deslippe, Jack
Real-space pseudopotential density functional theory has proven to be an efficient method for computing the properties of matter in many different states and geometries, including liquids, wires, slabs, and clusters with and without spin polarization. Fully self-consistent solutions using this approach have been routinely obtained for systems with thousands of atoms. Yet, there are many systems of notably larger size where quantum mechanical accuracy is desired, but scalability proves to be a hindrance. Such systems include large biological molecules, complex nanostructures, or mismatched interfaces. We will present an overview of our new massively parallel algorithms, which offer improved scalability in preparation for exascale supercomputing. We will illustrate these algorithms by considering the electronic structure of a Si nanocrystal exceeding 10^4 atoms. Support provided by the SciDAC program, Department of Energy, Office of Science, Advanced Scientific Computing Research and Basic Energy Sciences. Grant Numbers DE-SC0008877 (Austin) and DE-FG02-12ER4 (Berkeley).
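The real-space idea itself can be illustrated at toy scale: represent the wavefunctions on a grid, build a finite-difference kinetic operator, add the potential on the diagonal, and diagonalize. The sketch below solves a one-dimensional model potential this way; it is a pedagogical stand-in only, since the codes described above work in 3D with pseudopotentials and massive parallelism.

```python
# Toy real-space electronic-structure calculation: finite-difference grid,
# model potential, dense diagonalization. Pedagogical stand-in only.
import numpy as np

n, L = 400, 20.0                       # grid points, box length (atomic units)
x = np.linspace(-L / 2, L / 2, n)
h = x[1] - x[0]

# Second-order finite-difference Laplacian with Dirichlet boundaries.
lap = (np.diag(np.full(n - 1, 1.0), -1) - 2 * np.eye(n)
       + np.diag(np.full(n - 1, 1.0), 1)) / h ** 2

v = -2.0 / np.sqrt(x ** 2 + 1.0)       # soft-Coulomb model potential
hamiltonian = -0.5 * lap + np.diag(v)

energies, orbitals = np.linalg.eigh(hamiltonian)
print("lowest eigenvalues (hartree):", np.round(energies[:3], 4))
```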
Exploring Human Cognition Using Large Image Databases.
Griffiths, Thomas L; Abbott, Joshua T; Hsu, Anne S
2016-07-01
Most cognitive psychology experiments evaluate models of human cognition using a relatively small, well-controlled set of stimuli. This approach stands in contrast to current work in neuroscience, perception, and computer vision, which have begun to focus on using large databases of natural images. We argue that natural images provide a powerful tool for characterizing the statistical environment in which people operate, for better evaluating psychological theories, and for bringing the insights of cognitive science closer to real applications. We discuss how some of the challenges of using natural images as stimuli in experiments can be addressed through increased sample sizes, using representations from computer vision, and developing new experimental methods. Finally, we illustrate these points by summarizing recent work using large image databases to explore questions about human cognition in four different domains: modeling subjective randomness, defining a quantitative measure of representativeness, identifying prior knowledge used in word learning, and determining the structure of natural categories. Copyright © 2016 Cognitive Science Society, Inc.
A symbiotic approach to fluid equations and non-linear flux-driven simulations of plasma dynamics
NASA Astrophysics Data System (ADS)
Halpern, Federico
2017-10-01
The fluid framework is ubiquitous in studies of plasma transport and stability. Typical forms of the fluid equations are motivated by analytical work dating back several decades, before computer simulations were indispensable, and can therefore be suboptimal for numerical computation. We demonstrate a new first-principles approach to obtaining manifestly consistent, skew-symmetric fluid models, ensuring internal consistency and conservation properties even in discrete form. Mass, kinetic, and internal energy become quadratic (and always positive) invariants of the system. The model lends itself to a robust, straightforward discretization scheme with inherent non-linear stability. A simpler, drift-ordered form of the equations is obtained, and first results of their numerical implementation as a binary framework for bulk-fluid global plasma simulations are demonstrated. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences, Theory Program, under Award No. DE-FG02-95ER54309.
NASA Astrophysics Data System (ADS)
D'Agostino, Gregorio; De Nicola, Antonio
2016-10-01
Exploiting the information about members of a Social Network (SN) represents one of the most attractive and engaging subjects for both academic and applied scientists. The community of Complexity Science, and especially those researchers working on multiplex social systems, is devoting increasing effort to outlining general laws, models, and theories for the purpose of predicting emergent phenomena in SNs (e.g. the success of a product). On the other side, the Semantic Web community aims at engineering a new generation of advanced services tailored to specific people's needs. This implies defining constructs, models and methods for handling the semantic layer of SNs. We combined models and techniques from both fields to provide a hybrid approach to understanding a basic (yet complex) phenomenon: the propagation of individual interests along social networks. Since information may move along different social networks, one should take into account a multiplex structure. Therefore we introduce the notion of the "Semantic Multiplex". In this paper we analyse two different semantic social networks, represented by authors publishing in Computer Science and those publishing in the American Physical Society journals. The comparison allows us to outline common and specific features.
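A minimal way to picture interest propagation on a multiplex is to let an interest score diffuse on each layer separately and then couple the layers through the shared members. The sketch below does exactly that for two toy layers; the graphs, coupling rule, and update rule are illustrative assumptions, not the model of the paper.

```python
# Toy interest propagation on a two-layer (multiplex) social network:
# scores diffuse along each layer and the layers are coupled through the
# shared nodes. Graphs, weights, and the update rule are illustrative only.

layer_a = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}        # e.g. co-authorship ties
layer_b = {0: [3], 1: [2], 2: [1, 3], 3: [0, 2]}              # e.g. same-venue ties

interest = {0: 1.0, 1: 0.0, 2: 0.0, 3: 0.0}                   # node 0 starts interested

def diffuse(graph, scores, alpha=0.5):
    """One step of neighbour averaging on a single layer."""
    return {u: (1 - alpha) * scores[u]
               + alpha * sum(scores[v] for v in nbrs) / len(nbrs)
            for u, nbrs in graph.items()}

for _ in range(10):
    step_a = diffuse(layer_a, interest)
    step_b = diffuse(layer_b, interest)
    # Couple the layers: each node keeps the mean of its per-layer updates.
    interest = {u: 0.5 * (step_a[u] + step_b[u]) for u in interest}

print({u: round(s, 3) for u, s in interest.items()})
```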
Understanding human visual systems and its impact on our intelligent instruments
NASA Astrophysics Data System (ADS)
Strojnik Scholl, Marija; Páez, Gonzalo; Scholl, Michelle K.
2013-09-01
We review the evolution of machine vision and comment on the cross-fertilization from the neural sciences into the flourishing fields of neural processing, parallel processing, and associative memory in optical sciences and computing. Then we examine how the intensive efforts in mapping the human brain have been influenced by concepts in computer science, control theory, and electronic circuits. We discuss two neural paths that employ input from the visual sense to determine navigational options and object recognition: the ventral temporal pathway for object recognition (what?) and the dorsal parietal pathway for navigation (where?), respectively. We describe the reflexive and conscious decision centers in the cerebral cortex involved with visual attention and gaze control. Interestingly, these require a return path through the midbrain for ocular muscle control. We find that cognitive psychologists currently study the human brain employing low-spatial-resolution fMRI with temporal response on the order of a second. In recent years, life scientists have concentrated on insect brains to study neural processes. We discuss how reflexive and conscious gaze-control decisions are made in the frontal eye field and inferior parietal lobe, constituting the fronto-parietal attention network. We note that ethical and experiential learning impacts our conscious decisions.
Borycki, E M; Kushniruk, A W; Bellwood, P; Brender, J
2012-01-01
The objective of this paper is to examine the extent, range and scope to which frameworks, models and theories dealing with technology-induced error have arisen in the biomedical and life sciences literature as indexed by Medline®. To better understand the state of work in the area of technology-induced error involving frameworks, models and theories, the authors conducted a search of Medline® using selected key words identified from seminal articles in this research area. Articles were reviewed and those pertaining to frameworks, models or theories dealing with technology-induced error were further reviewed by two researchers. All articles from Medline® from its inception to April of 2011 were searched using the above outlined strategy. 239 citations were returned. Each of the abstracts for the 239 citations were reviewed by two researchers. Eleven articles met the criteria based on abstract review. These 11 articles were downloaded for further in-depth review. The majority of the articles obtained describe frameworks and models with reference to theories developed in other literatures outside of healthcare. The papers were grouped into several areas. It was found that articles drew mainly from three literatures: 1) the human factors literature (including human-computer interaction and cognition), 2) the organizational behavior/sociotechnical literature, and 3) the software engineering literature. A variety of frameworks and models were found in the biomedical and life sciences literatures. These frameworks and models drew upon and extended frameworks, models and theoretical perspectives that have emerged in other literatures. These frameworks and models are informing an emerging line of research in health and biomedical informatics involving technology-induced errors in healthcare.
Scully, John R
2015-01-01
Recent advances in characterization tools, computational capabilities, and theories have created opportunities for advancement in understanding of solid-fluid interfaces at the nanoscale in corroding metallic systems. The Faraday Discussion on Corrosion Chemistry in 2015 highlighted some of the current needs, gaps and opportunities in corrosion science. Themes were organized into several hierarchical categories that provide an organizational framework for corrosion. Opportunities to develop fundamental physical and chemical data which will enable further progress in thermodynamic and kinetic modelling of corrosion were discussed. These will enable new and better understanding of unit processes that govern corrosion at the nanoscale. Additional topics discussed included scales, films and oxides, fluid-surface and molecular-surface interactions, selected topics in corrosion science and engineering as well as corrosion control. Corrosion science and engineering topics included complex alloy dissolution, local corrosion, and modelling of specific corrosion processes that are made up of collections of temporally and spatially varying unit processes such as oxidation, ion transport, and competitive adsorption. Corrosion control and mitigation topics covered some new insights on coatings and inhibitors. Further advances in operando or in situ experimental characterization strategies at the nanoscale combined with computational modelling will enhance progress in the field, especially if coupling across length and time scales can be achieved incorporating the various phenomena encountered in corrosion. Readers are encouraged not only to use this ad hoc organizational scheme to guide their immersion into the current opportunities in corrosion chemistry, but also to find value in the information presented in their own ways.
Opportunities-to-Learn at Home: Profiles of Students With and Without Reaching Science Proficiency
NASA Astrophysics Data System (ADS)
Liu, Xiufeng; Whitford, Melinda
2011-08-01
This study examines the relationship between opportunity-to-learn (OTL) at home and students' attainment of science proficiency. The data set used was the 2006 PISA science US national sample. Data mining was used to create patterns of association between home OTL variables and student attainment of science proficiency. It was found that students who failed to reach science proficiency are characterized by having fewer than 100 books at home; these students are also found to take out-of-school individual or group lessons with their teachers or with other teachers. On the other hand, students who reached science proficiency are characterized by having more than 100 books at home, not taking any out-of-school lessons, and having a highest parent level of graduate education. In addition to the above common characteristics, other home characteristics (e.g. computer and internet at home and language spoken at home) are also identified in profiles of students who have reached science proficiency. We explain the above findings in terms of current social-cultural theories. We finally discuss implications of the above findings for future studies and for improving science education policy and practice.
ERIC Educational Resources Information Center
Trifiletti, L. B.; Gielen, A. C.; Sleet, D. A.; Hopkins, K.
2005-01-01
Behavioral and social sciences theories and models have the potential to enhance efforts to reduce unintentional injuries. The authors reviewed the published literature on behavioral and social science theory applications to unintentional injury problems to enumerate and categorize the ways different theories and models are used in injury…
Towards Reproducibility in Computational Hydrology
NASA Astrophysics Data System (ADS)
Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit
2017-04-01
Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, evolve (or reject) hypotheses and models of how environmental systems function, and move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even if they are made available, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al. 2016 [1], we argue that a cultural change is required in the computational hydrology community in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which is relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as the example application area, we believe that our conclusions are of value to the wider environmental and geoscience community as far as the use of code and models for scientific advancement is concerned. References: [1] Hutton, C., T. Wagener, J. Freer, D. Han, C. Duffy, and B. Arheimer (2016), Most computational hydrology is not reproducible, so is it really science?, Water Resour. Res., 52, 7548-7555, doi:10.1002/2016WR019285. [2] Ceola, S., et al. (2015), Virtual laboratories: New opportunities for collaborative water science, Hydrol. Earth Syst. Sci. Discuss., 11(12), 13443-13478, doi:10.5194/hessd-11-13443-2014.
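One small, concrete step toward the workflows argued for above is to make every run record the information needed to repeat it. The snippet below is a minimal sketch of such a provenance record; the fields and file name are illustrative, not a standard from the cited papers. It pins the random seed and captures package versions and a hash of the input data alongside the result.

```python
# Minimal provenance record for a computational experiment: seed, package
# versions, and a hash of the input data are stored next to the result.
# File name and fields are illustrative, not a community standard.
import hashlib, json, platform
import numpy as np

SEED = 42
rng = np.random.default_rng(SEED)

data = rng.normal(size=1000)                       # stand-in for observed input data
result = float(np.mean(data))                      # stand-in for a model output

record = {
    "seed": SEED,
    "python": platform.python_version(),
    "numpy": np.__version__,
    "input_sha256": hashlib.sha256(data.tobytes()).hexdigest(),
    "result_mean": result,
}
with open("run_record.json", "w") as fh:
    json.dump(record, fh, indent=2)
```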
NASA Astrophysics Data System (ADS)
Chang, S. S. L.
State of the art technology in circuits, fields, and electronics is discussed. The principles and applications of these technologies to industry, digital processing, microwave semiconductors, and computer-aided design are explained. Important concepts and methodologies in mathematics and physics are reviewed, and basic engineering sciences and associated design methods are dealt with, including: circuit theory and the design of magnetic circuits and active filter synthesis; digital signal processing, including FIR and IIR digital filter design; transmission lines, electromagnetic wave propagation and surface acoustic wave devices. Also considered are: electronics technologies, including power electronics, microwave semiconductors, GaAs devices, and magnetic bubble memories; digital circuits and logic design.
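As an example of the digital signal processing material covered, a windowed-sinc FIR low-pass filter can be designed in a few lines; the filter length, cutoff, and sampling rate below are arbitrary illustrative values, not taken from the text.

```python
# Windowed-sinc FIR low-pass design, illustrating the FIR filter design topic;
# the filter length, cutoff, and sampling rate are arbitrary example values.
import numpy as np
from scipy.signal import firwin, lfilter

fs = 1000.0                                      # sampling rate, Hz
taps = firwin(numtaps=101, cutoff=50.0, fs=fs)   # 50 Hz low-pass, Hamming window

t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 200 * t)
filtered = lfilter(taps, 1.0, signal)            # the 200 Hz component is attenuated
```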
Extended Lagrangian Density Functional Tight-Binding Molecular Dynamics for Molecules and Solids.
Aradi, Bálint; Niklasson, Anders M N; Frauenheim, Thomas
2015-07-14
A computationally fast quantum mechanical molecular dynamics scheme using an extended Lagrangian density functional tight-binding formulation has been developed and implemented in the DFTB+ electronic structure program package for simulations of solids and molecular systems. The scheme combines the computational speed of self-consistent density functional tight-binding theory with the efficiency and long-term accuracy of extended Lagrangian Born-Oppenheimer molecular dynamics. For systems without self-consistent charge instabilities, only a single diagonalization or construction of the single-particle density matrix is required in each time step. The molecular dynamics simulation scheme can be applied to a broad range of problems in materials science, chemistry, and biology.
NASA Technical Reports Server (NTRS)
Brooks, Rodney Allen; Stein, Lynn Andrea
1994-01-01
We describe a project to capitalize on newly available levels of computational resources in order to understand human cognition. We will build an integrated physical system including vision, sound input and output, and dextrous manipulation, all controlled by a continuously operating large scale parallel MIMD computer. The resulting system will learn to 'think' by building on its bodily experiences to accomplish progressively more abstract tasks. Past experience suggests that in attempting to build such an integrated system we will have to fundamentally change the way artificial intelligence, cognitive science, linguistics, and philosophy think about the organization of intelligence. We expect to be able to better reconcile the theories that will be developed with current work in neuroscience.
Incorporating time and spatial-temporal reasoning into situation management
NASA Astrophysics Data System (ADS)
Jakobson, Gabriel
2010-04-01
Spatio-temporal reasoning plays a significant role in situation management performed by intelligent agents (human or machine), affecting how situations are recognized, interpreted, acted upon, and predicted. Many definitions and formalisms for the notion of spatio-temporal reasoning have emerged in various research fields including psychology, economics and computer science (computational linguistics, data management, control theory, artificial intelligence and others). In this paper we examine the role of spatio-temporal reasoning in situation management, particularly how to resolve situations that are described using spatio-temporal relations among events and situations. We discuss a model for describing context-sensitive temporal relations and show how the model can be extended to spatial relations.
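Temporal relations between events are commonly formalized as Allen-style interval relations. The sketch below classifies the relation between two time intervals, as one concrete building block for the kind of situation descriptions discussed here; it is a generic textbook construction, not the paper's model.

```python
# Allen-style qualitative relation between two time intervals, as a building
# block for situation descriptions; textbook construction, not the paper's model.
def allen_relation(a, b):
    """Return the Allen relation of interval a=(start, end) to b=(start, end)."""
    a1, a2 = a
    b1, b2 = b
    if a2 < b1:  return "before"
    if b2 < a1:  return "after"
    if a2 == b1: return "meets"
    if b2 == a1: return "met-by"
    if (a1, a2) == (b1, b2): return "equals"
    if a1 == b1: return "starts" if a2 < b2 else "started-by"
    if a2 == b2: return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2: return "during"
    if a1 < b1 and b2 < a2: return "contains"
    return "overlaps" if a1 < b1 else "overlapped-by"

print(allen_relation((1, 3), (2, 5)))   # -> "overlaps"
```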
Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility
NASA Astrophysics Data System (ADS)
Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.
2014-12-01
The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current generation Petascale capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.
Theories of the Earth and the Nature of Science.
ERIC Educational Resources Information Center
Williams, James
1991-01-01
Describes the history of the science of geology. The author expounds upon the discovery of deep time and plate tectonics, explaining how the theory of deep time influenced the development of Darwin and Wallace's theory of evolution. Describes how the history of earth science helps students understand the nature of science. (PR)
Freed, Karl F
2014-10-14
A general theory of the long time, low temperature dynamics of glass-forming fluids remains elusive despite the almost 20 years since the famous pronouncement by the Nobel Laureate P. W. Anderson, "The deepest and most interesting unsolved problem in solid state theory is probably the theory of the nature of glass and the glass transition" [Science 267, 1615 (1995)]. While recent work indicates that Adam-Gibbs theory (AGT) provides a framework for computing the structural relaxation time of supercooled fluids and for analyzing the properties of the cooperatively rearranging dynamical strings observed in low temperature molecular dynamics simulations, the heuristic nature of AGT has impeded general acceptance due to the lack of a first principles derivation [G. Adam and J. H. Gibbs, J. Chem. Phys. 43, 139 (1965)]. This deficiency is rectified here by a statistical mechanical derivation of AGT that uses transition state theory and the assumption that the transition state is composed of elementary excitations of a string-like form. The strings are assumed to form in equilibrium with the mobile particles in the fluid. Hence, transition state theory requires the strings to be in mutual equilibrium and thus to have the size distribution of a self-assembling system, in accord with the simulations and analyses of Douglas and co-workers. The average relaxation rate is computed as a grand canonical ensemble average over all string sizes, and use of the previously determined relation between configurational entropy and the average cluster size in several model equilibrium self-associating systems produces the AGT expression in a manner enabling further extensions and more fundamental tests of the assumptions.
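For orientation, the Adam-Gibbs expression that the derivation recovers relates the structural relaxation time to the configurational entropy. Its standard form is quoted below as background only; the notation, and the precise form obtained from the grand canonical average over string sizes, may differ in the paper.

```latex
% Standard Adam-Gibbs form, quoted as background; notation may differ from the paper.
\tau(T) = \tau_{0}\,\exp\!\left[\frac{z^{*}(T)\,\Delta\mu}{k_{B}T}\right],
\qquad
z^{*}(T) \propto \frac{1}{S_{c}(T)}
\;\Longrightarrow\;
\tau(T) = \tau_{0}\,\exp\!\left[\frac{C}{T\,S_{c}(T)}\right]
```

Here tau_0 and C are constants, z* is the size of a cooperatively rearranging region (the string-like excitation in the derivation above), Delta mu is an activation free energy per particle, and S_c is the configurational entropy.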
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freed, Karl F., E-mail: freed@uchicago.edu
A general theory of the long time, low temperature dynamics of glass-forming fluids remains elusive despite the almost 20 years since the famous pronouncement by the Nobel Laureate P. W. Anderson, “The deepest and most interesting unsolved problem in solid state theory is probably the theory of the nature of glass and the glass transition” [Science 267, 1615 (1995)]. While recent work indicates that Adam-Gibbs theory (AGT) provides a framework for computing the structural relaxation time of supercooled fluids and for analyzing the properties of the cooperatively rearranging dynamical strings observed in low temperature molecular dynamics simulations, the heuristic nature of AGT has impeded general acceptance due to the lack of a first principles derivation [G. Adam and J. H. Gibbs, J. Chem. Phys. 43, 139 (1965)]. This deficiency is rectified here by a statistical mechanical derivation of AGT that uses transition state theory and the assumption that the transition state is composed of elementary excitations of a string-like form. The strings are assumed to form in equilibrium with the mobile particles in the fluid. Hence, transition state theory requires the strings to be in mutual equilibrium and thus to have the size distribution of a self-assembling system, in accord with the simulations and analyses of Douglas and co-workers. The average relaxation rate is computed as a grand canonical ensemble average over all string sizes, and use of the previously determined relation between configurational entropy and the average cluster size in several model equilibrium self-associating systems produces the AGT expression in a manner enabling further extensions and more fundamental tests of the assumptions.
Petascale supercomputing to accelerate the design of high-temperature alloys
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; ...
2017-10-25
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
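As a hedged illustration of the final step described above (machine-learning prediction of segregation energies from materials descriptors), the sketch below fits a regression model to a descriptor table; the descriptor set, data, and model choice are stand-ins and are not taken from the paper.

```python
# Minimal sketch: predict interfacial segregation energies from elemental
# descriptors with a random-forest regressor (all data here are synthetic).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stand-in descriptor matrix: one row per solute element, columns such as
# atomic radius, electronegativity, cohesive energy (hypothetical features).
X = rng.normal(size=(34, 3))
y = rng.normal(size=34)          # stand-in segregation energies (eV)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print("cross-validated MAE (eV):", -scores.mean())
```

In a real workflow the synthetic arrays would be replaced by DFT-derived descriptors and segregation energies, and model selection would be validated against held-out supercell calculations.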
Petascale supercomputing to accelerate the design of high-temperature alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
Petascale supercomputing to accelerate the design of high-temperature alloys
NASA Astrophysics Data System (ADS)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; Haynes, J. Allen
2017-12-01
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. The approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
The challenges of developing computational physics: the case of South Africa
NASA Astrophysics Data System (ADS)
Salagaram, T.; Chetty, N.
2013-08-01
Most modern scientific research problems are complex and interdisciplinary in nature. It is impossible to study such problems in detail without the use of computation in addition to theory and experiment. Although it is widely agreed that students should be introduced to computational methods at the undergraduate level, it remains a challenge to do this within a full traditional undergraduate curriculum. In this paper, we report on a survey that we conducted of undergraduate physics curricula in South Africa to determine the content and the approach taken in the teaching of computational physics. We also considered the pedagogy of computational physics at the postgraduate and research levels at various South African universities, research facilities and institutions. We conclude that the state of computational physics training in South Africa, especially at the undergraduate teaching level, is generally weak and needs to be given more attention at all universities. Failure to do so will impact negatively on the country's capacity to grow its endeavours in the computational sciences, with negative consequences for research, commerce and industry.
NASA Astrophysics Data System (ADS)
Bell, Peter M.
Artificial intelligence techniques are being used for the first time to evaluate geophysical, geochemical, and geologic data and theory in order to locate ore deposits. After several years of development, an intelligent computer code has been formulated and applied to the Mount Tolman area in Washington state. In a project funded by the United States Geological Survey and the National Science Foundation, a set of computer programs, under the general title Prospector, was used successfully to locate a previously unknown ore-grade porphyry molybdenum deposit in the vicinity of Mount Tolman (Science, Sept. 3, 1982). The general area of the deposit had been known to contain exposures of porphyry mineralization. Between 1964 and 1978, exploration surveys had been run by the Bear Creek Mining Company, and later exploration was done in the area by the Amax Corporation. Some of the geophysical data and geochemical and other prospecting surveys were incorporated into the programs, and mine exploration specialists contributed to a set of rules for Prospector. The rules were encoded as ‘inference networks’ to form the ‘expert system’ on which the artificial intelligence codes were based. The molybdenum ore deposit discovered by the test is large, located subsurface, and has an areal extent of more than 18 km².
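Inference networks of the Prospector family combine evidence by Bayesian odds updating, with each rule carrying a likelihood ratio. The sketch below shows only the generic calculation; the prior and rule strength used here are hypothetical and are not values from the Mount Tolman study.

```python
def update_odds(prior_prob, likelihood_ratio):
    """Bayesian odds update: posterior odds = likelihood ratio * prior odds."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    post_odds = likelihood_ratio * prior_odds
    return post_odds / (1.0 + post_odds)

# Hypothetical rule: a favourable geochemical survey (likelihood ratio 20)
# applied to a 5% prior probability of ore-grade mineralization.
p = update_odds(0.05, 20.0)
print(round(p, 3))   # ~0.513 posterior probability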
NASA Astrophysics Data System (ADS)
Voute, S.; Kleinhans, M. G.; de Regt, H.
2010-12-01
A scientific explanation for a phenomenon is based on relevant theory and initial and background conditions. Scientific understanding, on the other hand, requires intelligibility, which means that a scientist can recognise qualitative characteristic consequences of the theory without doing the actual calculations, and apply it to develop further explanations and predictions. If explanation and understanding are indeed fundamentally different, then it may be possible to convey understanding of earth-scientific phenomena to laymen without the full theoretical background. The aim of this thesis is to analyze how scientists and laymen gain scientific understanding in Earth Sciences, based on the newest insights in the philosophy of science, pedagogy, and science communication. All three disciplines have something to say about how humans learn and understand, even if at the very different levels of scientists, students, children or the general public. If different disciplines with different approaches identify and quantify the same theory in the same manner, then there is likely to be something “real” behind the theory. Comparing the methodology and learning styles of the different disciplines within the Earth Sciences, and critically analyzing earth-scientific exhibitions in different museums, may provide insight into the different approaches to earth-scientific explanation and communication. In order to gain earth-scientific understanding, a broad suite of tools is used, such as maps and images, symbols and diagrams, cross-sections and sketches, categorization and classification, modelling, laboratory experiments, (computer) simulations and analogies, remote sensing, and fieldwork. All these tools have a dual nature, containing both theoretical and embodied components. Embodied knowledge is created by doing the actual modelling, intervening in experiments and doing fieldwork. Scientific practice includes discovery and exploration, data collection and analyses, verification or falsification, and conclusions that must be well grounded and argued. The intelligibility of theories is improved by the combination of these two types of understanding. This is also attested by the fact that both theoretical and embodied skills are considered essential for the training of university students at all levels. However, from the surprised and confounded reactions of the public to natural disasters it appears that merely showing scientific results is not enough to convey scientific understanding to the public. By using the tools that earth scientists use to develop explanations and achieve understanding, laymen could achieve understanding as well without rigorous theoretical training. We are presently investigating in science museums whether engaging the public in scientific activities based on embodied skills leads to understanding of earth-scientific phenomena by laymen.
Information Theory, Inference and Learning Algorithms
NASA Astrophysics Data System (ADS)
Mackay, David J. C.
2003-10-01
Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
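As a small, self-contained illustration of the information-theoretic quantities the book builds on (our example, not one from the book), the following computes the Shannon entropy of a discrete source, the lower bound on average code length that compression schemes such as arithmetic coding approach:

```python
import math

def shannon_entropy(p):
    """Entropy H(p) in bits of a discrete distribution given as probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A biased four-symbol source: its entropy (1.75 bits/symbol) is the limit
# on lossless compression that arithmetic coding can approach.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))
```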
NASA Astrophysics Data System (ADS)
Wright, D. J.
2013-12-01
In the early 1990s the author came of age as the technology driving the geographic information system or GIS was beginning to successfully 'handle' geospatial data at a range of scales and formats, and a wide array of information technology products emerged from an expanding GIS industry. However, that small community struggled to reflect the diverse research efforts at play in understanding the deeper issues surrounding geospatial data, and the impediments to the effective use of those data. It was from this need that geographic information science or GIScience arose, to ensure in part that GIS did not fall into the trap of being a technology in search of applications, a one-time, one-off, non-intellectual 'bag of tricks' with no substantive theory underpinning it, and suitable only for a static period of time (e.g., Goodchild, 1992). The community has since debated the issue of 'tool versus science', which has also played a role in defining GIS as an actual profession. In turn, GIS has contributed to "methodological versus substantive" questions in science, leading to understandings of how the Earth works versus how the Earth should look. In the author's experience, the multidimensional structuring and scaling of data, with integrative and innovative approaches to analyzing, modeling, and developing extensive spatial data from selected places on land and at sea, have revealed how theory and application are in no way mutually exclusive, and it may often be application that advances theory, rather than vice versa. Increasingly, both the system and science of geographic information have welcomed strong collaborations among computer scientists, information scientists, and domain scientists to solve complex scientific questions. As such, they have paralleled the emergence and acceptance of "data science." Now that we are squarely in an era of regional- to global-scale observation and simulation of the Earth, which produces data that are too big, move too fast, and do not fit the structures and processing capacity of conventional database systems, the author reflects on the potential of the GIS/GIScience world to contribute to the training and professional advancement of data science.
Deep learning for computational chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goh, Garrett B.; Hodas, Nathan O.; Vishnu, Abhinav
The rise and fall of artificial neural networks is well documented in the scientific literature of both the fields of computer science and computational chemistry. Yet almost two decades later, we are now seeing a resurgence of interest in deep learning, a machine learning algorithm based on “deep” neural networks. Within the last few years, we have seen the transformative impact of deep learning in the computer science domain, notably in speech recognition and computer vision, to the extent that the majority of practitioners in those fields are now regularly eschewing prior established models in favor of deep learning models. In this review, we provide an introductory overview into the theory of deep neural networks and their unique properties as compared to traditional machine learning algorithms used in cheminformatics. By providing an overview of the variety of emerging applications of deep neural networks, we highlight their ubiquity and broad applicability to a wide range of challenges in the field, including QSAR, virtual screening, protein structure modeling, QM calculations, materials synthesis and property prediction. In reviewing the performance of deep neural networks, we observed consistent outperformance of non-neural-network state-of-the-art models across disparate research topics, and deep neural network based models often exceeded the “glass ceiling” expectations of their respective tasks. Coupled with the maturity of GPU-accelerated computing for training deep neural networks and the exponential growth of chemical data on which to train these networks, we anticipate that deep learning algorithms will be a useful tool and may grow into a pivotal role for various challenges in the computational chemistry field.
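As a minimal sketch of the kind of neural-network regression the review surveys, applied to a toy QSAR-style problem (descriptors and targets below are synthetic; this is not code from the review), a small feedforward network can be trained to map molecular descriptors to a property:

```python
# Toy QSAR-style regression with a small feedforward neural network.
# Real applications would use molecular fingerprints or learned representations.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 32))                           # stand-in descriptors
y = X[:, :4].sum(axis=1) + 0.1 * rng.normal(size=500)    # stand-in property

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print("R^2 on held-out molecules:", round(net.score(X_te, y_te), 3))
```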
NASA Astrophysics Data System (ADS)
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-12-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were not heavily involved in computing activities, whereas male students were. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of playing with computers. The myth of the geek as a typical profile of successful computer science students was not found to be true.
Statistical mechanical theory for steady state systems. VI. Variational principles
NASA Astrophysics Data System (ADS)
Attard, Phil
2006-12-01
Several variational principles that have been proposed for nonequilibrium systems are analyzed. These include the principle of minimum rate of entropy production due to Prigogine [Introduction to Thermodynamics of Irreversible Processes (Interscience, New York, 1967)], the principle of maximum rate of entropy production, which is common on the internet and in the natural sciences, two principles of minimum dissipation due to Onsager [Phys. Rev. 37, 405 (1931)] and to Onsager and Machlup [Phys. Rev. 91, 1505 (1953)], and the principle of maximum second entropy due to Attard [J. Chem. Phys. 122, 154101 (2005); Phys. Chem. Chem. Phys. 8, 3585 (2006)]. The approaches of Onsager and Attard are argued to be the only viable theories. These two are related, although their physical interpretation and mathematical approximations differ. A numerical comparison with computer simulation results indicates that Attard's expression is the only accurate theory. The implications for the Langevin and other stochastic differential equations are discussed.
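For orientation, the quantities these variational principles extremize can be stated in standard linear-nonequilibrium notation (ours, not the paper's): the entropy production rate and the Onsager force-flux relations are

```latex
\sigma \;=\; \sum_i J_i X_i \;\ge\; 0,
\qquad
J_i \;=\; \sum_j L_{ij} X_j, \qquad L_{ij} = L_{ji},
```

where the J_i are thermodynamic fluxes, the X_i their conjugate forces, and L_{ij} the symmetric Onsager coefficients; the competing principles differ chiefly in which of these quantities are held fixed while \sigma (or a related dissipation functional) is extremized.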
Soto, Fabian A; Zheng, Emily; Fonseca, Johnny; Ashby, F Gregory
2017-01-01
Determining whether perceptual properties are processed independently is an important goal in perceptual science, and tools to test independence should be widely available to experimental researchers. The best analytical tools to test for perceptual independence are provided by General Recognition Theory (GRT), a multidimensional extension of signal detection theory. Unfortunately, there is currently a lack of software implementing GRT analyses that is ready-to-use by experimental psychologists and neuroscientists with little training in computational modeling. This paper presents grtools, an R package developed with the explicit aim of providing experimentalists with the ability to perform full GRT analyses using only a couple of command lines. We describe the software and provide a practical tutorial on how to perform each of the analyses available in grtools. We also provide advice to researchers on best practices for experimental design and interpretation of results when applying GRT and grtools.
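grtools itself is an R package; as a language-neutral illustration of the classical signal-detection quantity that GRT generalizes to multiple dimensions (this is not grtools code), sensitivity d' can be computed from hit and false-alarm rates as follows:

```python
from scipy.stats import norm

def d_prime(hit_rate, fa_rate):
    """Classical signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Example: 85% hits and 20% false alarms give d' of about 1.88.
print(round(d_prime(0.85, 0.20), 3))
```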
[Comments on the definition of "acupuncture science"].
Wang, Fan
2017-12-12
Experts in China believe that the substance of "dry needling" falls within the category of acupuncture therapy, since treatment is delivered by inserting needles into the human body. However, this recognition is not implied by the definition of "acupuncture science". Since the 1970s, the various definitions of acupuncture science have been closely tied to TCM theories, and have thereby been limited. This flaw restricts the development of acupuncture theory, narrows the connotation of acupuncture science and goes against the communication of traditional Chinese acupuncture theory. Whether regarding theory or technique, the connotation of acupuncture therapy has changed greatly. Rather than being guided by TCM theories, acupuncture therapy nowadays mainly includes the nerve trunk stimulation theory, the cerebral function orientation therapy, biological holographic therapy, fascia stimulation therapy and trigger therapy, etc. Except that the medical devices used in these therapies are the same as in traditional acupuncture, these methods cannot be regarded as within the category of acupuncture science when its current definition is considered. Hence, the author proposes to define "acupuncture science" as follows: acupuncture science refers to the science of the methodology and mechanism of therapeutic methods, e.g. acupuncture therapy and moxibustion therapy, for the prevention and treatment of disease by stimulating the body, and its theory includes but is not limited to traditional Chinese medical theory.
Representations of Intervals and Optimal Error Bounds.
1980-07-01
Report documentation page (OCR largely illegible): MRC TSR-2098, unclassified. Keywords include geometric and harmonic means and excess width; Work Unit Number 3 (Numerical Analysis and Computer Science); sponsored by the United States Army. The surviving text introduces an example of an optimal point and error bound, after which the general theory is discussed.
Salvadori, Andrea; Del Frate, Gianluca; Pagliai, Marco; Mancini, Giordano; Barone, Vincenzo
2016-11-15
The role of Virtual Reality (VR) tools in molecular sciences is analyzed in this contribution through the presentation of the Caffeine software to the quantum chemistry community. Caffeine, developed at Scuola Normale Superiore, is specifically tailored for molecular representation and data visualization with VR systems, such as VR theaters and helmets. The usefulness and advantages that can be gained by exploiting VR are reported here, considering a few examples specifically selected to illustrate different levels of theory and molecular representation.
Joint University Program for Air Transportation Research, 1986
NASA Technical Reports Server (NTRS)
Morrell, Frederick R. (Compiler)
1988-01-01
The research conducted under the NASA/FAA sponsored Joint University Program for Air Transportation Research is summarized. The Joint University Program is a coordinated set of three grants sponsored by NASA and the FAA, one each with the Mass. Inst. of Tech., Ohio Univ., and Princeton Univ. Completed works, status reports, and bibliographies are presented for research topics, which include computer science, guidance and control theory and practice, aircraft performance, flight dynamics, and applied experimental psychology. An overview of activities is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bansil, Arun
2016-12-01
Basic Energy Sciences of the Department of Energy (BES/DOE) has made large investments in x-ray sources in the U.S. (NSLS-II, LCLS, NGLS, ALS, APS) as powerful enabling tools for opening up unprecedented new opportunities for exploring properties of matter at various length and time scales. The coming online of the pulsed photon source literally allows us to see and follow the dynamics of processes in materials at their natural timescales. There is an urgent need therefore to develop theoretical methodologies and computational models for understanding how x-rays interact with matter and the related spectroscopies of materials. The present project addressed aspects of this grand challenge of X-ray science. In particular, our Collaborative Research Team (CRT) focused on understanding and modeling of elastic and inelastic resonant X-ray scattering processes. We worked to unify the three different computational approaches currently used for modeling X-ray scattering—density functional theory, dynamical mean-field theory, and small-cluster exact diagonalization—to achieve a more realistic material-specific picture of the interaction between X-rays and complex matter. To achieve a convergence in the interpretation and to maximize complementary aspects of different theoretical methods, we concentrated on the cuprates, where most experiments have been performed. Our team included both US and international researchers, and it fostered new collaborations between researchers currently working with different approaches. In addition, we developed close relationships with experimental groups working in the area at various synchrotron facilities in the US. Our CRT thus helped toward enabling the US to assume a leadership role in the theoretical development of the field, and to create a global network and community of scholars dedicated to X-ray scattering research.
ERIC Educational Resources Information Center
Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2013-01-01
Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…
Hong, Felix T
2013-09-01
Rosen classified sciences into two categories: formalizable and unformalizable. Whereas formalizable sciences expressed in terms of mathematical theories were highly valued by Rutherford, Hutchins pointed out that unformalizable parts of soft sciences are of genuine interest and importance. Attempts to build mathematical theories for biology in the past century were met with modest and sporadic successes, and only in simple systems. In this article, a qualitative model of humans' high creativity is presented as a starting point to consider whether the gap between soft and hard sciences is bridgeable. Simonton's chance-configuration theory, which mimics the process of evolution, was modified and improved. By treating problem solving as a process of pattern recognition, the known dichotomy of visual thinking vs. verbal thinking can be recast in terms of analog pattern recognition (non-algorithmic process) and digital pattern recognition (algorithmic process), respectively. Additional concepts commonly encountered in computer science, operations research and artificial intelligence were also invoked: heuristic searching, parallel and sequential processing. The refurbished chance-configuration model is now capable of explaining several long-standing puzzles in human cognition: a) why novel discoveries often came without prior warning, b) why some creators had no ideas about the source of inspiration even after the fact, c) why some creators were consistently luckier than others, and, last but not least, d) why it was so difficult to explain what intuition, inspiration, insight, hunch, serendipity, etc. are all about. The predictive power of the present model was tested by means of resolving Zeno's paradox of Achilles and the Tortoise after one deliberately invoked visual thinking. Additional evidence of its predictive power must await future large-scale field studies. The analysis was further generalized to constructions of scientific theories in general. This approach is in line with Campbell's evolutionary epistemology. Instead of treating science as immutable Natural Laws, which already existed and which were just waiting to be discovered, scientific theories are regarded as humans' mental constructs, which must be invented to reconcile with observed natural phenomena. In this way, the pursuit of science is shifted from diligent and systematic (or random) searching for existing Natural Laws to firing up humans' imagination to comprehend Nature's behavioral pattern. The insights gained in understanding human creativity indicated that new mathematics capable of effectively handling parallel processing and human subjectivity is sorely needed. The past classification of formalizability vs. non-formalizability was made in reference to contemporary mathematics. Rosen's conclusion did not preclude future inventions of new biology-friendly mathematics. Copyright © 2013 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stinson, Jake L.; Kathmann, Shawn M.; Ford, Ian J.
2014-01-14
The nucleation of particles from trace gases in the atmosphere is an important source of cloud condensation nuclei (CCN), and these are vital for the formation of clouds in view of the high supersaturations required for homogeneous water droplet nucleation. The methods of quantum chemistry have increasingly been employed to model nucleation due to their high accuracy and efficiency in calculating configurational energies; and nucleation rates can be obtained from the associated free energies of particle formation. However, even in such advanced approaches, it is typically assumed that the nuclei have a classical nature, which is questionable for some systems. The importance of zero-point motion (also known as quantum nuclear dynamics) in modelling small clusters of sulphuric acid and water is tested here using the path integral molecular dynamics (PIMD) method at the density functional theory (DFT) level of theory. We observe a small zero-point effect on the equilibrium structures of certain clusters. One configuration is found to display a bimodal behaviour at 300 K in contrast to the stable ionised state suggested from a zero temperature classical geometry optimisation. The general effect of zero-point motion is to promote the extent of proton transfer with respect to classical behaviour. We thank Prof. Angelos Michaelides and his group in University College London (UCL) for practical advice and helpful discussions. This work benefited from interactions with the Thomas Young Centre through seminars and discussions involving the PIMD method. SMK was supported by the U.S. Department of Energy, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. JLS and IJF were supported by the IMPACT scheme at UCL and by the U.S. Department of Energy, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. We are grateful for use of the UCL Legion High Performance Computing Facility and the resources of the National Energy Research Scientific Computing Center (NERSC), which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.
Naive Probability: Model-Based Estimates of Unique Events.
Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N
2015-08-01
We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.
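As an illustrative sketch of the predicted violation (our toy example, not the authors' implementation), a system-1-style primitive average of component beliefs yields a conjunction estimate larger than one of its components, which the complete joint probability distribution forbids:

```python
def primitive_average(p_a, p_b):
    """System-1-style combination of two degrees of belief by averaging."""
    return (p_a + p_b) / 2.0

p_a, p_b = 0.9, 0.2
p_and = primitive_average(p_a, p_b)   # 0.55
# Probability theory requires P(A and B) <= min(P(A), P(B)) = 0.2, so an
# averaging heuristic produces exactly the kind of violation the theory predicts.
print(p_and, p_and > min(p_a, p_b))
```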
NASA Astrophysics Data System (ADS)
Blancke, Stefaan; De Smedt, Johan; De Cruz, Helen; Boudry, Maarten; Braeckman, Johan
2012-08-01
This paper discusses the relationship between religion and science education in the light of the cognitive sciences. We challenge the popular view that science and religion are compatible, a view that suggests that learning and understanding evolutionary theory has no effect on students' religious beliefs and vice versa. We develop a cognitive perspective on how students manage to reconcile evolutionary theory with their religious beliefs. We underwrite the claim developed by cognitive scientists and anthropologists that religion is natural because it taps into people's intuitive understanding of the natural world which is constrained by essentialist, teleological and intentional biases. After contrasting the naturalness of religion with the unnaturalness of science, we discuss the difficulties cognitive and developmental scientists have identified in learning and accepting evolutionary theory. We indicate how religious beliefs impede students' understanding and acceptance of evolutionary theory. We explore a number of options available to students for reconciling an informed understanding of evolutionary theory with their religious beliefs. To conclude, we discuss the implications of our account for science and biology teachers.
Science preparedness and science response: perspectives on the dynamics of preparedness conference.
Lant, Timothy; Lurie, Nicole
2013-01-01
The ability of the scientific modeling community to meaningfully contribute to postevent response activities during public health emergencies was the direct result of a discrete set of preparedness activities as well as advances in theory and technology. Scientists and decision-makers have recognized the value of developing scientific tools (e.g. models, data sets, communities of practice) to prepare them to be able to respond quickly--in a manner similar to preparedness activities by first-responders and emergency managers. Computational models have matured in their ability to better inform response plans by modeling human behaviors and complex systems. We advocate for further development of science preparedness activities as deliberate actions taken in advance of an unpredicted event (or an event with unknown consequences) to increase the scientific tools and evidence-base available to decision makers and the whole-of-community to limit adverse outcomes.
Science and technology in the stockpile stewardship program, S & TR reprints
DOE Office of Scientific and Technical Information (OSTI.GOV)
Storm, E
This document reports on these topics: Computer Simulations in Support of National Security; Enhanced Surveillance of Aging Weapons; A New Precision Cutting Tool: The Femtosecond Laser; Superlasers as a Tool of Stockpile Stewardship; Nova Laser Experiments and Stockpile Stewardship; Transforming Explosive Art into Science; Better Flash Radiography Using the FXR; Preserving Nuclear Weapons Information; Site 300's New Contained Firing Facility; The Linear Electric Motor: Instability at 1,000 g's; A Powerful New Tool to Detect Clandestine Nuclear Tests; High Explosives in Stockpile Surveillance Indicate Constancy; Addressing a Cold War Legacy with a New Way to Produce TATB; Jumpin' Jupiter! Metallic Hydrogen; Keeping the Nuclear Stockpile Safe, Secure, and Reliable; The Multibeam Fabry–Perot Velocimeter: Efficient Measurements of High Velocities; Theory and Modeling in Material Science; The Diamond Anvil Cell; Gamma-Ray Imaging Spectrometry; X-Ray Lasers and High-Density Plasma
Alien Mindscapes—A Perspective on the Search for Extraterrestrial Intelligence
NASA Astrophysics Data System (ADS)
Cabrol, Nathalie A.
2016-09-01
Advances in planetary and space sciences, astrobiology, and life and cognitive sciences, combined with developments in communication theory, bioneural computing, machine learning, and big data analysis, create new opportunities to explore the probabilistic nature of alien life. Brought together in a multidisciplinary approach, they have the potential to support an integrated and expanded Search for Extraterrestrial Intelligence (SETI1), a search that includes looking for life as we do not know it. This approach will augment the odds of detecting a signal by broadening our understanding of the evolutionary and systemic components in the search for extraterrestrial intelligence (ETI), provide more targets for radio and optical SETI, and identify new ways of decoding and coding messages using universal markers.
New Sociotechnical Insights in Interaction Design
NASA Astrophysics Data System (ADS)
Abdelnour-Nocera, José; Mørch, Anders I.
New challenges are facing interaction design. On the one hand, advances in technology - pervasive, ubiquitous, multimodal and adaptive computing - are changing the nature of interaction. On the other, web 2.0, massively multiplayer games and collaboration software extend the boundaries of HCI to deal with interaction in settings of remote communication and collaboration. The aim of this workshop is to provide a forum for HCI practitioners and researchers interested in knowledge from the social sciences to discuss how sociotechnical insights can be used to inform interaction design, and more generally how social science methods and theories can help to enrich the conceptual framework of systems development and participatory design. Position paper submissions are invited to address key aspects of current research and practical case studies.
Reasons for 2011 Release of the Evaluated Nuclear Data Library (ENDL2011.0)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D.; Escher, J.; Hoffman, R.
LLNL's Computational Nuclear Physics Group and Nuclear Theory and Modeling Group have collaborated to create the 2011 release of the Evaluated Nuclear Data Library (ENDL2011). ENDL2011 is designed to support LLNL's current and future nuclear data needs. This database is currently the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles, surpassing ENDL2009.0 [1]. The ENDL2011 release [2] contains 918 transport-ready evaluations in the neutron sub-library alone. ENDL2011 was assembled with strong support from the ASC program, leveraged with support from NNSA science campaigns and the DOE/Office of Science US Nuclear Data Program.
Quantum Monte Carlo simulations of Ti4O7 Magnéli phase
NASA Astrophysics Data System (ADS)
Benali, Anouar; Shulenburger, Luke; Krogel, Jaron; Zhong, Xiaoliang; Kent, Paul; Heinonen, Olle
2015-03-01
Ti4O7 is ubiquitous in Ti-oxides. It has been extensively studied, both experimentally and theoretically, in the past decades using multiple levels of theory, resulting in multiple diverse results. The latest DFT+SIC methods and state-of-the-art HSE06 hybrid functionals even propose a new anti-ferromagnetic state at low temperature. Using Quantum Monte Carlo (QMC), as implemented in the QMCPACK simulation package, we investigated the electronic and magnetic properties of Ti4O7 at low (120 K) and high (298 K) temperatures and at different magnetic states. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357. L.S., J.K. and P.K. were supported through the Predictive Theory and Modeling for Materials and Chemical Science program by the Office of Basic Energy Sciences (BES), Department of Energy (DOE). Sandia National Laboratories is a multiprogram laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract No. DE-AC04-94AL85000.
Theories of Levels in Organizational Science.
ERIC Educational Resources Information Center
Rousseau, Denise M.
This paper presents concepts and principles pertinent to the development of cross-level and multilevel theory in organizational science by addressing a number of fundamental theoretical issues. It describes hierarchy theory, systems theory, and mixed-level models of organization developed by organizational scientists. Hierarchy theory derives from…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miliordos, Evangelos; Xantheas, Sotiris S.
2015-06-21
We report MP2 and CCSD(T) binding energies with basis sets up to pentuple zeta quality for the m = 2-6, 8 clusters. Our best CCSD(T)/CBS estimates are -4.99 kcal/mol (dimer), -15.77 kcal/mol (trimer), -27.39 kcal/mol (tetramer), -35.9 ± 0.3 kcal/mol (pentamer), -46.2 ± 0.3 kcal/mol (prism hexamer), -45.9 ± 0.3 kcal/mol (cage hexamer), -45.4 ± 0.3 kcal/mol (book hexamer), -44.3 ± 0.3 kcal/mol (ring hexamer), -73.0 ± 0.5 kcal/mol (D2d octamer) and -72.9 ± 0.5 kcal/mol (S4 octamer). We have found that the percentage of both the uncorrected (dimer e) and BSSE-corrected (dimer CPe) binding energies recovered with respect to the CBS limit falls into a narrow range for each basis set for all clusters, and in addition this range was found to decrease upon increasing the basis set. Relatively accurate estimates (within < 0.5%) of the CBS limits can be obtained when using the “2/3, 1/3” (for the AVDZ set) or the “½, ½” (for the AVTZ, AVQZ and AV5Z sets) mixing ratio between dimer e and dimer CPe. Based on those findings we propose an accurate and efficient computational protocol that can be used to estimate accurate binding energies of clusters at the MP2 (for up to 100 molecules) and CCSD(T) (for up to 30 molecules) levels of theory. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is a multi program national laboratory operated for DOE by Battelle. This research also used resources of the National Energy Research Scientific Computing Center, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. AC02-05CH11231.
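Written out explicitly (our notation, not the paper's), the quoted mixing ratios correspond to a weighted average of the uncorrected and counterpoise-corrected binding energies used to estimate the CBS limit:

```latex
E_{\mathrm{CBS}} \;\approx\; w\, E_{e} \;+\; (1 - w)\, E_{e}^{\mathrm{CP}},
\qquad w = \tfrac{2}{3}\ \text{(AVDZ)}, \quad w = \tfrac{1}{2}\ \text{(AVTZ, AVQZ, AV5Z)},
```

with E_e the uncorrected and E_e^CP the BSSE-corrected binding energy computed in the given basis set.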
Computational structures technology and UVA Center for CST
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1992-01-01
Rapid advances in computer hardware have had a profound effect on various engineering and mechanics disciplines, including the materials, structures, and dynamics disciplines. A new technology, computational structures technology (CST), has recently emerged as an insightful blend between material modeling, structural and dynamic analysis and synthesis on the one hand, and other disciplines such as computer science, numerical analysis, and approximation theory, on the other hand. CST is an outgrowth of finite element methods developed over the last three decades. The focus of this presentation is on some aspects of CST which can impact future airframes and propulsion systems, as well as on the newly established University of Virginia (UVA) Center for CST. The background and goals for CST are described along with the motivations for developing CST, and a brief discussion is made on computational material modeling. We look at the future in terms of technical needs, computing environment, and research directions. The newly established UVA Center for CST is described. One of the research projects of the Center is described, and a brief summary of the presentation is given.
The melting temperature of liquid water with the effective fragment potential
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brorsen, Kurt R.; Willow, Soohaeng Y.; Xantheas, Sotiris S.
2015-09-17
Direct simulation of the solid-liquid water interface with the effective fragment potential (EFP) via the constant enthalpy and pressure (NPH) ensemble was used to estimate the melting temperature (Tm) of ice-Ih. Initial configurations and velocities, taken from equilibrated constant pressure and temperature (NPT) simulations at T = 300 K, 350 K and 400 K, respectively, yielded corresponding Tm values of 378±16 K, 382±14 K and 384±15 K. These estimates are consistently higher than experiment, albeit to the same degree as previously reported estimates using density functional theory (DFT)-based Born-Oppenheimer simulations with the Becke-Lee-Yang-Parr functional plus dispersion corrections (BLYP-D). KRB was supported by a Computational Science Graduate Fellowship from the Department of Energy. MSG was supported by a U.S. National Science Foundation Software Infrastructure (SI2) grant (ACI – 1047772). SSX acknowledges support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle.
Computer-Game Construction: A Gender-Neutral Attractor to Computing Science
ERIC Educational Resources Information Center
Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan
2010-01-01
Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…
Geomorphology as science: the role of theory
NASA Astrophysics Data System (ADS)
Rhoads, Bruce L.; Thorn, Colin E.
1993-04-01
Because geomorphology is a science, it is permeated by theory. Overt recognition of this actuality is frequently resisted by geomorphologists. Earth history does not represent an alternative to earth science, it is an essential component of earth science. In its broadest sense science seeks to discover new knowledge through a two-stage activity involving the creation and justification of ideas (theory). Deduction is generally regarded as the only logically-consistent method of justifying ideas. The creation of ideas is a much more controversial topic. Some methodologists deem it beyond logic; certainly deductive arguments, which are nothing more than formal, logical expressions of theory, play no role in the conception of new ideas. Many earth scientists generate possible explanations of observed phenomena based on abductive reasoning. Others advocate reliance on purported forms of "pure" induction, such as serendipity and intuition, in which observations assume primacy over theory. Besides lacking consistency and educability, the latter posture is flawed because it mistakenly implies that becoming well-versed in theory is irrelevant to or impedes scientific discovery. Irrational or subjective factors play a role in the creation of ideas, but it is erroneous to claim that these factors are divorced from theory. Science is first and foremost a cognitive activity; thus, the primacy of observations in science is a myth. All observations are theory-laden in the sense that the act of observation inherently involves interpretation and classification, both of which can only occur within the context of theoretical preconceptions. Even discoveries based on unexpected observations require the fortunate investigator to recognize the theoretical importance of what is seen or measured. The most useful view of geomorphology as a science is one in which theory is seen as central, but fragile, and in which theory and observation are viewed symbiotically with theory providing the generative force and observation providing a vital policing role. Much of the current debate in geomorphology centers around differences in characteristics of theory, type of scientific arguments, and metaphysical perspectives among investigators working at different temporal scales. Full recognition and understanding of these differences are essential for developing a unified approach to the science of geomorphology.
Laboratory Directed Research and Development FY2011 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craig, W; Sketchley, J; Kotta, P
2012-03-22
A premier applied-science laboratory, Lawrence Livermore National Laboratory (LLNL) has earned the reputation as a leader in providing science and technology solutions to the most pressing national and global security problems. The LDRD Program, established by Congress at all DOE national laboratories in 1991, is LLNL's most important single resource for fostering excellent science and technology for today's needs and tomorrow's challenges. The LDRD internally directed research and development funding at LLNL enables high-risk, potentially high-payoff projects at the forefront of science and technology. The LDRD Program at Livermore serves to: (1) Support the Laboratory's missions, strategic plan, and foundational science; (2) Maintain the Laboratory's science and technology vitality; (3) Promote recruiting and retention; (4) Pursue collaborations; (5) Generate intellectual property; and (6) Strengthen the U.S. economy. Myriad LDRD projects over the years have made important contributions to every facet of the Laboratory's mission and strategic plan, including its commitment to nuclear, global, and energy and environmental security, as well as cutting-edge science and technology and engineering in high-energy-density matter, high-performance computing and simulation, materials and chemistry at the extremes, information systems, measurements and experimental science, and energy manipulation. A summary of each project was submitted by the principal investigator. Project summaries include the scope, motivation, goals, relevance to DOE/NNSA and LLNL mission areas, the technical progress achieved in FY11, and a list of publications that resulted from the research. The projects are: (1) Nuclear Threat Reduction; (2) Biosecurity; (3) High-Performance Computing and Simulation; (4) Intelligence; (5) Cybersecurity; (6) Energy Security; (7) Carbon Capture; (8) Material Properties, Theory, and Design; (9) Radiochemistry; (10) High-Energy-Density Science; (11) Laser Inertial-Fusion Energy; (12) Advanced Laser Optical Systems and Applications; (13) Space Security; (14) Stockpile Stewardship Science; (15) National Security; (16) Alternative Energy; and (17) Climatic Change.
Richard Feynman and computation
NASA Astrophysics Data System (ADS)
Hey, Tony
1999-04-01
The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman re-visited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.
Can mathematics explain the evolution of human language?
Witzany, Guenther
2011-09-01
Investigation into the sequence structure of the genetic code by means of an informatic approach is a real success story. The features of human language are also the object of investigation within the realm of formal language theories. They focus on the common rules of a universal grammar that lies behind all languages and determine generation of syntactic structures. This universal grammar is a depiction of material reality, i.e., the hidden logical order of things and its relations determined by natural laws. Therefore mathematics is viewed not only as an appropriate tool to investigate human language and genetic code structures through computer science-based formal language theory but is itself a depiction of material reality. This confusion between language as a scientific tool to describe observations/experiences within cognitive constructed models and formal language as a direct depiction of material reality occurs not only in current approaches but was the central focus of the philosophy of science debate in the twentieth century, with rather unexpected results. This article recalls these results and their implications for more recent mathematical approaches that also attempt to explain the evolution of human language.
The birth and evolution of surface science: Child of the union of science and technology
Duke, C. B.
2003-01-01
This article is an account of the birth and evolution of surface science as an interdisciplinary research area. Surface science emanated from the confluence of concepts and tools in physics and chemistry with technological innovations that made it possible to determine the structure and properties of surfaces and interfaces and the dynamics of chemical reactions at surfaces. The combination in the 1960s and 1970s of ultra-high-vacuum (i.e., P < 10⁻⁷ Pascal or 10⁻⁹ Torr) technology with the recognition that electrons in the energy range from 50 to 500 eV exhibited inelastic collision mean free paths of the order of a few angstroms fostered an explosion of activity. The results were a reformulation of the theory of electron-solid scattering, the nearly universal use of electron spectroscopies for surface characterization, the rise of surface science as an independent interdisciplinary research area, and the emergence of the American Vacuum Society (AVS) as a major international scientific society. The rise of microelectronics in the 1970s and 1980s resulted in huge increases in computational power. These increases enabled more complex experiments and the utilization of density functional theory for the quantitative prediction of surface structure and dynamics. Development of scanning-probe microscopies in the 1990s led to atomic-resolution images of macroscopic surfaces and interfaces as well as videos of atoms moving about on surfaces during growth and diffusion. Scanning probes have since brought solid–liquid interfaces into the realm of atomic-level surface science, expanding its scope to more complex systems, including fragile biological materials and processes. PMID:12651946
NASA Astrophysics Data System (ADS)
Shi, X.
2015-12-01
As NSF indicated, "Theory and experimentation have for centuries been regarded as two fundamental pillars of science. It is now widely recognized that computational and data-enabled science forms a critical third pillar." Geocomputation is the third pillar of GIScience and geosciences. With the exponential growth of geodata, the challenge of scalable and high-performance computing for big data analytics has become urgent, because many research activities are constrained by software and tools that cannot complete the computation process. Heterogeneous geodata integration and analytics further magnify the complexity and the operational time frame. Many large-scale geospatial problems may not be processable at all if the computer system does not have sufficient memory or computational power. Emerging computer architectures, such as Intel's Many Integrated Core (MIC) Architecture and the Graphics Processing Unit (GPU), and advanced computing technologies provide promising solutions that employ massive parallelism and hardware resources to achieve scalability and high performance for data-intensive computing over large spatiotemporal and social media data. Exploring novel algorithms and deploying the solutions in massively parallel computing environments to achieve scalable data processing and analytics over large-scale, complex, and heterogeneous geodata with consistent quality and high performance has been the central theme of our research team in the Department of Geosciences at the University of Arkansas (UARK). New multi-core architectures combined with application accelerators hold the promise of achieving scalability and high performance by exploiting task- and data-level parallelism that is not supported by conventional computing systems. Such a parallel or distributed computing environment is particularly suitable for large-scale geocomputation over big data, as demonstrated by our prior work, while the potential of such advanced infrastructure remains underexplored in this domain. In this presentation, our prior and ongoing initiatives are summarized to exemplify how we exploit multicore CPUs, GPUs, and MICs, and clusters of CPUs, GPUs, and MICs, to accelerate geocomputation in different applications.
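As a concrete illustration of the data-parallel style of geocomputation the abstract describes, the following minimal Python sketch splits a synthetic point cloud into spatial tiles and aggregates them across worker processes. The tiling scheme, function names, and statistic are hypothetical stand-ins for the far richer decompositions used on GPU/MIC clusters.

```python
# Minimal sketch (not the authors' code): data-parallel geocomputation over
# spatial tiles using Python's multiprocessing. Each worker aggregates one
# tile of (x, y, value) points; results are merged on the host. The same
# decomposition maps naturally onto GPU/MIC accelerators.
import numpy as np
from multiprocessing import Pool

def tile_statistic(tile):
    """Hypothetical per-tile kernel: mean of the value column."""
    return float(np.mean(tile[:, 2]))

def parallel_mean(points, n_tiles=8, n_workers=4):
    # Split the point cloud into spatial tiles along x (a toy domain decomposition).
    order = np.argsort(points[:, 0])
    tiles = np.array_split(points[order], n_tiles)
    with Pool(n_workers) as pool:
        partial = pool.map(tile_statistic, tiles)
    # Weight partial means by tile size when merging.
    sizes = [len(t) for t in tiles]
    return float(np.average(partial, weights=sizes))

if __name__ == "__main__":
    pts = np.random.rand(1_000_000, 3)   # synthetic (x, y, value) points
    print(parallel_mean(pts))
```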
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter
2012-01-01
The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…
High School Students' Implicit Theories of What Facilitates Science Learning
ERIC Educational Resources Information Center
Parsons, Eileen Carlton; Miles, Rhea; Petersen, Michael
2011-01-01
Background: Research has primarily concentrated on adults' implicit theories about high quality science education for all students. Little work has considered the students' perspective. This study investigated high school students' implicit theories about what helped them learn science. Purpose: This study addressed (1) What characterizes high…
Zhang, Qiang; Bhattacharya, Sudin; Andersen, Melvin E; Conolly, Rory B
2010-02-01
The new paradigm envisioned for toxicity testing in the 21st century advocates shifting from the current animal-based testing process to a combination of in vitro cell-based studies, high-throughput techniques, and in silico modeling. A strategic component of the vision is the adoption of the systems biology approach to acquire, analyze, and interpret toxicity pathway data. As key toxicity pathways are identified and their wiring details elucidated using traditional and high-throughput techniques, there is a pressing need to understand their qualitative and quantitative behaviors in response to perturbation by both physiological signals and exogenous stressors. The complexity of these molecular networks makes the task of understanding cellular responses merely by human intuition challenging, if not impossible. This process can be aided by mathematical modeling and computer simulation of the networks and their dynamic behaviors. A number of theoretical frameworks were developed in the last century for understanding dynamical systems in science and engineering disciplines. These frameworks, which include metabolic control analysis, biochemical systems theory, nonlinear dynamics, and control theory, can greatly facilitate the process of organizing, analyzing, and understanding toxicity pathways. Such analysis will require a comprehensive examination of the dynamic properties of "network motifs"--the basic building blocks of molecular circuits. Network motifs like feedback and feedforward loops appear repeatedly in various molecular circuits across cell types and enable vital cellular functions like homeostasis, all-or-none response, memory, and biological rhythm. These functional motifs and associated qualitative and quantitative properties are the predominant source of nonlinearities observed in cellular dose response data. Complex response behaviors can arise from toxicity pathways built upon combinations of network motifs. While the field of computational cell biology has advanced rapidly with increasing availability of new data and powerful simulation techniques, a quantitative orientation is still lacking in life sciences education to make efficient use of these new tools to implement the new toxicity testing paradigm. A revamped undergraduate curriculum in the biological sciences including compulsory courses in mathematics and analysis of dynamical systems is required to address this gap. In parallel, dissemination of computational systems biology techniques and other analytical tools among practicing toxicologists and risk assessment professionals will help accelerate implementation of the new toxicity testing vision.
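To make the notion of a network motif concrete, here is a hedged toy simulation of one such motif, a negative feedback loop, showing how it buffers a stressor (homeostasis). The equations and parameters are illustrative assumptions, not a model of any actual toxicity pathway.

```python
# Minimal sketch of one "network motif": a negative feedback loop in which a
# stressed species x induces its own repressor r, so the steady state of x
# rises less than proportionally with the input. Parameters are illustrative.
import numpy as np
from scipy.integrate import odeint

def feedback(y, t, stress):
    x, r = y                              # x: stressed species, r: induced repressor
    dx = stress - 0.5 * x - 2.0 * r * x   # repressor removes x
    dr = 0.8 * x - 0.3 * r                # x induces its own repressor
    return [dx, dr]

t = np.linspace(0, 50, 500)
baseline = odeint(feedback, [1.0, 0.0], t, args=(1.0,))
stressed = odeint(feedback, [1.0, 0.0], t, args=(3.0,))
# Despite a 3x larger input, the steady state of x rises much less than 3x:
print(baseline[-1, 0], stressed[-1, 0])
```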
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maitra, Neepa
The first US-based summer school and workshop on Time-Dependent Density Functional Theory (TDDFT) was held July 11-21, 2017 in Telluride, CO. This grant provided funding to enable 33 students to attend the school, specifically with lodging and registration fee reductions. TDDFT is increasingly used in computational molecular and materials science to calculate electronic-excitation spectra and dynamics in a wide variety of applications, including photocatalysis, photo-controlled bond dissociation, and light-induced charge transfer. Software development in this community targets multiple software packages, many of which are open source, such as octopus, NWchem and Qb@ll, which are the ones our school focused on. The goal of this first iteration was to create a home for a national community of scholars, including users and developers, with a deep understanding of TDDFT, its capabilities, limitations, and high-performance computing context. We used this opportunity to explore interest in such an event in the future and, based on overwhelmingly positive feedback from students and teachers, we intend to hold a similar school+workshop every two years in the US, in order to maintain the high level of interest that we witnessed and the enthusiasm amongst participants.
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
NASA Astrophysics Data System (ADS)
Aourag, H.
2008-09-01
In the past, the search for new and improved materials was characterized mostly by the use of empirical, trial-and-error methods. This picture of materials science has been changing as the knowledge and understanding of the fundamental processes governing a material's properties and performance (namely, composition, structure, history, and environment) have increased. In a number of cases, it is now possible to predict a material's properties before it has even been manufactured, thus greatly reducing the time spent on testing and development. The objective of modern materials science is to tailor a material (starting with its chemical composition, constituent phases, and microstructure) in order to obtain a desired set of properties suitable for a given application. In the short term, the traditional "empirical" methods for developing new materials will be complemented to a greater degree by theoretical predictions. In some areas, computer simulation is already used by industry to weed out costly or improbable synthesis routes. Can novel materials with optimized properties be designed by computers? Advances in modelling methods at the atomic level, coupled with rapid increases in computer capabilities over the last decade, have led scientists to answer this question with a resounding "yes". The ability to design new materials from quantum mechanical principles with computers is currently one of the fastest growing and most exciting areas of theoretical research in the world. The methods allow scientists to evaluate and prescreen new materials "in silico", rather than through time-consuming experimentation. The Materials Genome Project pursues the theory of large-scale modeling as well as powerful methods to construct new materials with optimized properties. Indeed, it is the intimate synergy between our ability to predict accurately from quantum theory how atoms can be assembled to form new materials and our capacity to synthesize novel materials atom-by-atom that gives the Materials Genome Project its extraordinary intellectual vitality. Consequently, in designing new materials through computer simulation, our primary objective is to rapidly screen possible designs to find those few that will enhance the competitiveness of industries or have positive benefits to society. Examples include screening of cancer drugs, advances in catalysis for energy production, design of new alloys and multilayers, and processing of semiconductors.
Simulations of defect spin qubits in piezoelectric semiconductors
NASA Astrophysics Data System (ADS)
Seo, Hosung
In recent years, remarkable advances have been reported in the development of defect spin qubits in semiconductors for solid-state quantum information science and quantum metrology. Promising spin qubits include the nitrogen-vacancy center in diamond, dopants in silicon, and the silicon vacancy and divacancy spins in silicon carbide. In this talk, I will highlight some of our recent efforts devoted to defect spin qubits in piezoelectric wide-gap semiconductors for potential applications in mechanical hybrid quantum systems. In particular, I will describe our recent combined theoretical and experimental study on the remarkably robust quantum coherence found in the divacancy qubits in silicon carbide. We used a quantum bath model combined with a cluster expansion method to identify the microscopic mechanisms behind the unusually long coherence times of the divacancy spins in SiC. Our study indicates that developing spin qubits in complex crystals with multiple types of atoms is a promising route to realizing strongly coherent hybrid quantum systems. I will also discuss progress and challenges in the computational design of new spin defects for use as qubits in piezoelectric crystals such as AlN and SiC, including a new defect design concept using large metal-ion-vacancy complexes. Our first-principles calculations include DFT computations using recently developed self-consistent hybrid density functional theory and large-scale many-body GW theory. This work was supported by the National Science Foundation (NSF) through the University of Chicago MRSEC under Award Number DMR-1420709.
Higher Rank ABJM Wilson Loops from Matrix Models
NASA Astrophysics Data System (ADS)
Cookmeyer, Jonathan; Liu, James; Zayas, Leopoldo
2017-01-01
We compute the expectation values of 1/6-supersymmetric Wilson loops in ABJM theory in higher rank representations. Using standard matrix model techniques, we calculate the expectation value in the rank-m fully symmetric and fully antisymmetric representations, where m is scaled with N. To leading order, we find agreement with the classical action of D6- and D2-branes in AdS4 × CP3, respectively. Further, we compute the first subleading-order term, which, on the AdS side, makes a prediction for the one-loop effective action of the corresponding D6- and D2-branes. Supported by the National Science Foundation under Grant No. PHY 1559988 and the US Department of Energy under Grant No. DE-SC0007859.
Extended Lagrangian Density Functional Tight-Binding Molecular Dynamics for Molecules and Solids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aradi, Bálint; Niklasson, Anders M. N.; Frauenheim, Thomas
A computationally fast quantum mechanical molecular dynamics scheme using an extended Lagrangian density functional tight-binding formulation has been developed and implemented in the DFTB+ electronic structure program package for simulations of solids and molecular systems. The scheme combines the computational speed of self-consistent density functional tight-binding theory with the efficiency and long-term accuracy of extended Lagrangian Born–Oppenheimer molecular dynamics. Furthermore, for systems without self-consistent charge instabilities, only a single diagonalization or construction of the single-particle density matrix is required in each time step. The molecular dynamics simulation scheme can also be applied to a broad range of problems in materials science, chemistry, and biology.
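The core extended-Lagrangian idea, propagating an auxiliary electronic degree of freedom alongside the nuclei so that it shadows the self-consistent ground state without re-solving for it at every step, can be sketched in a toy one-dimensional model. Everything below (functional forms, parameters, variable names) is a hypothetical illustration, not the DFTB+ implementation.

```python
# Toy illustration (not DFTB+): an auxiliary electronic variable p is
# propagated with a Verlet-style integrator alongside the nuclear coordinate
# R, so that p stays close to the self-consistent ground state q0(R) without
# recomputing it at every step. All forms and parameters are hypothetical.
import numpy as np

def q0(R):                 # "exact" ground-state electronic variable for this toy
    return 0.5 * R

def force(R, q):           # nuclear force evaluated with electronic variable q
    return -R + 0.2 * (q - q0(R))   # harmonic well plus a small electronic correction

dt, omega = 0.01, 20.0     # omega sets how tightly p tracks q0(R)
R, V, p, pdot = 1.0, 0.0, q0(1.0), 0.0

for step in range(5000):   # velocity-Verlet propagation of both R and p
    a = force(R, p)
    pddot = omega**2 * (q0(R) - p)
    R += V * dt + 0.5 * a * dt**2
    p += pdot * dt + 0.5 * pddot * dt**2
    a_new = force(R, p)
    pddot_new = omega**2 * (q0(R) - p)
    V += 0.5 * (a + a_new) * dt
    pdot += 0.5 * (pddot + pddot_new) * dt

print(abs(p - q0(R)))      # the auxiliary variable shadows the ground state
```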
A Financial Technology Entrepreneurship Program for Computer Science Students
ERIC Educational Resources Information Center
Lawler, James P.; Joseph, Anthony
2011-01-01
Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…
ERIC Educational Resources Information Center
Menekse, Muhsin
2015-01-01
While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…
NASA Astrophysics Data System (ADS)
Elishakoff, I.; Sarlin, N.
2016-06-01
In this paper we provide a general methodology for the analysis and design of systems involving uncertainties. Available experimental data are enclosed by one of several geometric figures (triangle, rectangle, ellipse, parallelogram, super ellipse) of minimum area. These areas are then inflated, resorting to the Chebyshev inequality, in order to take into account forecasted data. The next step consists of evaluating the response of the system when uncertainties are confined to one of the above five suitably inflated geometric figures. This step involves a combined theoretical and computational analysis. We evaluate the maximum response of the system subjected to variation of uncertain parameters in each hypothesized region. The results of triangular, interval, ellipsoidal, parallelogram, and super ellipsoidal calculi are compared with a view to identifying the region that leads to the minimum of the maximum response. That response is identified as a result of the suggested predictive inference. The methodology thus synthesizes the probabilistic notion with each of the five calculi. Using the term "pillar" in the title was inspired by the News Release (2013) on awarding the Honda Prize to J. Tinsley Oden, stating, among others, that "Dr. Oden refers to computational science as the 'third pillar' of scientific inquiry, standing beside theoretical and experimental science. Computational science serves as a new paradigm for acquiring knowledge and informing decisions important to humankind". Analysis of systems with uncertainties necessitates employment of all three pillars. The analysis is based on the assumption that the five shapes are each different conservative estimates of the true bounding region. The smallest of the maximal displacements in the x and y directions (for a 2D system) therefore provides the closest estimate of the true displacements based on the above assumption.
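A hedged sketch of the workflow just described: enclose 2D data in a rectangle and an ellipse, inflate both using a Chebyshev-type bound, and compare the maximum of a simple linear response over each region. The inflation rule, the response model, and the numbers are illustrative assumptions, not the paper's calculi.

```python
# Sketch: enclose 2D data in a rectangle and an axis-aligned ellipse, inflate
# each region via a Chebyshev-type bound so that future data are covered with
# probability >= 1 - eps, then compare the maximum of a linear response over
# each region. The smaller maximum is the sharper (less conservative) estimate.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal([1.0, 2.0], [0.2, 0.5], size=(200, 2))   # uncertain parameters
eps = 0.05
k = 1.0 / np.sqrt(eps)          # Chebyshev: P(|X - mu| >= k*sigma) <= 1/k**2
mu, sigma = data.mean(axis=0), data.std(axis=0)

c = np.array([3.0, -1.0])       # toy linear response u(x) = c . x

# Rectangle (interval) calculus: the maximum of a linear function sits at a corner.
half = k * sigma
u_rect = c @ mu + np.abs(c) @ half

# Ellipse calculus: semi-axes k*sigma; max of c.x over it is c.mu + k*||diag(sigma) c||.
u_ell = c @ mu + k * np.linalg.norm(c * sigma)

print(u_rect, u_ell)            # compare maxima; pick the region giving the smaller one
```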
Acid/base equilibria in clusters and their role in proton exchange membranes: Computational insight
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glezakou, Vanda A; Dupuis, Michel; Mundy, Christopher J
2007-10-24
We describe molecular orbital theory and ab initio molecular dynamics studies of acid/base equilibria of clusters AH:(H2O)n ↔ A-:H+(H2O)n in the low hydration regime (n = 1-4), where AH is a model of perfluorinated sulfonic acids, RSO3H (R = CF3CF2), encountered in polymeric electrolyte membranes of fuel cells. Free energy calculations on the neutral and ion pair structures for n = 3 indicate that the two configurations are close in energy and are accessible in the fluctuation dynamics of proton transport. For n = 1, 2 the only relevant configuration is the neutral form. This was verified through ab initio metadynamics simulations. These findings suggest that bases are directly involved in the proton transport at low hydration levels. In addition, the gas phase proton affinity of the model sulfonic acid RSO3H was found to be comparable to the proton affinity of water. Thus, protonated acids can also play a role in proton transport under low hydration conditions and under high concentration of protons. This work was supported by the Division of Chemical Science, Office of Basic Energy Sciences, US Department of Energy (DOE) under Contract DE-AC05-76RL01830. Computations were performed on computers of the Molecular Interactions and Transformations (MI&T) group and the MSCF facility of EMSL, sponsored by US DOE and OBER and located at PNNL. This work also benefited from resources of the National Energy Research Scientific Computing Center, supported by the Office of Science of the US DOE under Contract No. DE-AC03-76SF00098.
Scientists and artists: "Hey! You got art in my science! You got science on my art!"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elfman, Mary E; Hayes, Birchard P; Michel, Kelly D
The pairing of science and art has proven to be a powerful combination since the Renaissance. The combination of these two seemingly disparate disciplines ensured that even complex scientific theories could be explored and effectively communicated to both the subject matter expert and the layman. In modern times, science and art have frequently been considered disjoint, with objectives, philosophies, and perspectives often in direct opposition to each other. However, given the technological advances in computer science and high fidelity 3-D graphics development tools, this marriage of art and science is once again logically complementary. Art, in the form of computer graphics and animation created on supercomputers, has already proven to be a powerful tool for improving scientific research and providing insight into nuclear phenomena. This paper discusses the power of pairing artists with scientists and engineers in order to pursue the possibilities of a widely accessible, lightweight, interactive approach. We will use a discussion of photo-realism versus stylization to illuminate the expected beneficial outcome of such collaborations and the societal advantages gained by a non-traditional partnering of these two fields.
Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?
ERIC Educational Resources Information Center
Schrock, John Richard
1984-01-01
Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" related to laboratory computer and "extended…
How "Scientific" Is Science Education Research?
ERIC Educational Resources Information Center
Lawson, Anton E.
2010-01-01
The research articles published in the "Journal of Research in Science Teaching" in 1965, 1975, 1985, 1995, and in 2005 were surveyed to discover the extent to which they were theory driven. Carey and Smith's theory of the development of science epistemologies was used to frame the study. Specifically their theory posits that science…
Educational Theory and the Social Vision of the Scottish Enlightenment
ERIC Educational Resources Information Center
Hanley, Ryan Patrick
2011-01-01
The Scottish Enlightenment is celebrated for its many contributions to the natural sciences, the social sciences and the moral sciences. But for all this attention, one aspect of the Scottish Enlightenment has been almost entirely neglected: its educational theory. This paper aims to illuminate the relationship between the educational theory of…
Kelemen, Arpad; Vasilakos, Athanasios V; Liang, Yulan
2009-09-01
Comprehensive evaluation of common genetic variations through association of single-nucleotide polymorphism (SNP) structure with common complex disease in the genome-wide scale is currently a hot area in human genome research due to the recent development of the Human Genome Project and HapMap Project. Computational science, which includes computational intelligence (CI), has recently become the third method of scientific enquiry besides theory and experimentation. There have been fast growing interests in developing and applying CI in disease mapping using SNP and haplotype data. Some of the recent studies have demonstrated the promise and importance of CI for common complex diseases in genomic association study using SNP/haplotype data, especially for tackling challenges, such as gene-gene and gene-environment interactions, and the notorious "curse of dimensionality" problem. This review provides coverage of recent developments of CI approaches for complex diseases in genetic association study with SNP/haplotype data.
Identification and addressing reduction-related misconceptions
NASA Astrophysics Data System (ADS)
Gal-Ezer, Judith; Trakhtenbrot, Mark
2016-07-01
Reduction is one of the key techniques used for problem-solving in computer science. In particular, in the theory of computation and complexity (TCC), mapping and polynomial reductions are used for analysis of decidability and computational complexity of problems, including the core concept of NP-completeness. Reduction is a highly abstract technique that involves revealing close non-trivial connections between problems that often seem to have nothing in common. As a result, proper understanding and application of reduction is a serious challenge for students and a source of numerous misconceptions. The main contribution of this paper is the detection of such misconceptions, analysis of their roots, and a proposed way to address them in an undergraduate TCC course. Our observations suggest that the main source of the misconceptions is the false intuitive rule "the bigger a set/problem is, the harder it is to solve". Accordingly, we developed a series of exercises for proactive prevention of these misconceptions.
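One standard example of the kind of mapping reduction discussed here (not taken from the paper) is the polynomial-time reduction from INDEPENDENT-SET to VERTEX-COVER. The sketch below shows that the reduction merely rewrites the instance; the brute-force deciders are included only to verify that yes/no answers are preserved.

```python
# Illustrative mapping reduction: INDEPENDENT-SET <=p VERTEX-COVER.
# A graph G has an independent set of size >= k iff it has a vertex cover of
# size <= |V| - k, so the reduction just rewrites the instance.
from itertools import combinations

def reduce_is_to_vc(n, edges, k):
    """Map an IS instance (G, k) to a VC instance (G, n - k)."""
    return n, edges, n - k

def has_vertex_cover(n, edges, size):      # brute-force decider, for checking only
    for cover in combinations(range(n), size):
        s = set(cover)
        if all(u in s or v in s for u, v in edges):
            return True
    return False

def has_independent_set(n, edges, k):      # brute-force decider, for checking only
    for cand in combinations(range(n), k):
        s = set(cand)
        if all(not (u in s and v in s) for u, v in edges):
            return True
    return False

# Tiny check that the reduction preserves yes/no answers on a 4-cycle.
n, edges = 4, [(0, 1), (1, 2), (2, 3), (3, 0)]
for k in range(1, 5):
    assert has_independent_set(n, edges, k) == has_vertex_cover(*reduce_is_to_vc(n, edges, k))
```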
Parra-Cabrera, Cesar; Achille, Clement; Kuhn, Simon; Ameloot, Rob
2018-01-02
Computer-aided fabrication technologies combined with simulation and data processing approaches are changing our way of manufacturing and designing functional objects. Also in the field of catalytic technology and chemical engineering the impact of additive manufacturing, also referred to as 3D printing, is steadily increasing thanks to a rapidly decreasing equipment threshold. Although still in an early stage, the rapid and seamless transition between digital data and physical objects enabled by these fabrication tools will benefit both research and manufacture of reactors and structured catalysts. Additive manufacturing closes the gap between theory and experiment, by enabling accurate fabrication of geometries optimized through computational fluid dynamics and the experimental evaluation of their properties. This review highlights the research using 3D printing and computational modeling as digital tools for the design and fabrication of reactors and structured catalysts. The goal of this contribution is to stimulate interactions at the crossroads of chemistry and materials science on the one hand and digital fabrication and computational modeling on the other.
Helfer, Peter; Shultz, Thomas R
2014-12-01
The widespread availability of calorie-dense food is believed to be a contributing cause of an epidemic of obesity and associated diseases throughout the world. One possible countermeasure is to empower consumers to make healthier food choices with useful nutrition labeling. An important part of this endeavor is to determine the usability of existing and proposed labeling schemes. Here, we report an experiment on how four different labeling schemes affect the speed and nutritional value of food choices. We then apply decision field theory, a leading computational model of human decision making, to simulate the experimental results. The psychology experiment shows that quantitative, single-attribute labeling schemes have greater usability than multiattribute and binary ones, and that they remain effective under moderate time pressure. The computational model simulates these psychological results and provides explanatory insights into them. This work shows how experimental psychology and computational modeling can contribute to the evaluation and improvement of nutrition-labeling schemes. © 2014 New York Academy of Sciences.
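The accumulation mechanism at the heart of decision field theory can be sketched in a few lines: attention switches stochastically between attributes, momentary valences accumulate into preferences, and the first option to reach a threshold is chosen. The attribute values, attention probabilities, and threshold below are hypothetical illustrations rather than the parameters fitted in the reported experiment.

```python
# Minimal sketch of a decision-field-theory-style choice process between two
# foods rated on (taste, healthiness). Parameters are illustrative only.
import numpy as np

def dft_choice(values, attention_probs, threshold=2.0, noise=0.1, max_steps=10_000,
               rng=np.random.default_rng(0)):
    """values[i, j]: subjective value of option i on attribute j."""
    n_options, n_attrs = values.shape
    pref = np.zeros(n_options)
    for t in range(1, max_steps + 1):
        attr = rng.choice(n_attrs, p=attention_probs)        # momentary attention
        valence = values[:, attr] - values[:, attr].mean()   # contrast with rivals
        pref += valence + noise * rng.standard_normal(n_options)
        if pref.max() >= threshold:
            return int(pref.argmax()), t                     # choice and decision time
    return int(pref.argmax()), max_steps

values = np.array([[0.9, 0.2],    # tasty but unhealthy
                   [0.5, 0.8]])   # moderate taste, healthy
choice, steps = dft_choice(values, attention_probs=[0.7, 0.3])
print(choice, steps)
```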
A review of predictive nonlinear theories for multiscale modeling of heterogeneous materials
NASA Astrophysics Data System (ADS)
Matouš, Karel; Geers, Marc G. D.; Kouznetsova, Varvara G.; Gillman, Andrew
2017-02-01
Since the beginning of the industrial age, material performance and design have been in the midst of innovation of many disruptive technologies. Today's electronics, space, medical, transportation, and other industries are enriched by development, design and deployment of composite, heterogeneous and multifunctional materials. As a result, materials innovation is now considerably outpaced by other aspects from component design to product cycle. In this article, we review predictive nonlinear theories for multiscale modeling of heterogeneous materials. Deeper attention is given to multiscale modeling in space and to computational homogenization in addressing challenging materials science questions. Moreover, we discuss a state-of-the-art platform in predictive image-based, multiscale modeling with co-designed simulations and experiments that executes on the world's largest supercomputers. Such a modeling framework consists of experimental tools, computational methods, and digital data strategies. Once fully completed, this collaborative and interdisciplinary framework can be the basis of Virtual Materials Testing standards and aids in the development of new material formulations. Moreover, it will decrease the time to market of innovative products.
Lexical Predictability During Natural Reading: Effects of Surprisal and Entropy Reduction.
Lowder, Matthew W; Choi, Wonil; Ferreira, Fernanda; Henderson, John M
2018-06-01
What are the effects of word-by-word predictability on sentence processing times during the natural reading of a text? Although information complexity metrics such as surprisal and entropy reduction have been useful in addressing this question, these metrics tend to be estimated using computational language models, which require some degree of commitment to a particular theory of language processing. Taking a different approach, this study implemented a large-scale cumulative cloze task to collect word-by-word predictability data for 40 passages and compute surprisal and entropy reduction values in a theory-neutral manner. A separate group of participants read the same texts while their eye movements were recorded. Results showed that increases in surprisal and entropy reduction were both associated with increases in reading times. Furthermore, these effects did not depend on the global difficulty of the text. The findings suggest that surprisal and entropy reduction independently contribute to variation in reading times, as these metrics seem to capture different aspects of lexical predictability. Copyright © 2018 Cognitive Science Society, Inc.
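The following is a minimal, theory-neutral illustration of how cloze counts can yield the two metrics: surprisal computed from the cloze probability of the word actually read, and entropy reduction as the drop in cloze-distribution entropy between positions. The counts and the smoothing rule are invented for illustration and are not the paper's data or procedure.

```python
# Cloze-based surprisal and entropy reduction, with made-up cloze counts.
import math
from collections import Counter

def distribution(cloze_counts):
    total = sum(cloze_counts.values())
    return {w: c / total for w, c in cloze_counts.items()}

def surprisal(word, cloze_counts, floor=0.5):
    p = distribution(cloze_counts).get(word)
    if p is None:                        # smooth unseen completions with a small count
        p = floor / (sum(cloze_counts.values()) + floor)
    return -math.log2(p)

def entropy(cloze_counts):
    return -sum(p * math.log2(p) for p in distribution(cloze_counts).values())

def entropy_reduction(prev_counts, curr_counts):
    return max(0.0, entropy(prev_counts) - entropy(curr_counts))

prev = Counter({"coffee": 40, "tea": 30, "water": 20, "juice": 10})   # position i-1
curr = Counter({"cup": 85, "mug": 10, "sip": 5})                      # position i
print(surprisal("mug", curr))            # higher for less predictable words
print(entropy_reduction(prev, curr))     # how much the context narrowed expectations
```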
Using Gender Schema Theory to Examine Gender Equity in Computing: a Preliminary Study
NASA Astrophysics Data System (ADS)
Agosto, Denise E.
Women continue to constitute a minority of computer science majors in the United States and Canada. One possible contributing factor is that most Web sites, CD-ROMs, and other digital resources do not reflect girls' design and content preferences. This article describes a pilot study that considered whether gender schema theory can serve as a framework for investigating girls' Web site design and content preferences. Eleven 14- and 15-year-old girls participated in the study. The methodology included the administration of the Children's Sex-Role Inventory (CSRI), Web-surfing sessions, interviews, and data analysis using iterative pattern coding. On the basis of their CSRI scores, the participants were divided into feminine-high (FH) and masculine-high (MH) groups. Data analysis uncovered significant differences in the criteria the groups used to evaluate Web sites. The FH group favored evaluation criteria relating to graphic and multimedia design, whereas the MH group favored evaluation criteria relating to subject content. Models of the two groups' evaluation criteria are presented, and the implications of the findings are discussed.
Construction of mutually unbiased bases with cyclic symmetry for qubit systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seyfarth, Ulrich; Ranade, Kedar S.
2011-10-15
For the complete estimation of arbitrary unknown quantum states by measurements, the use of mutually unbiased bases has been well established in theory and experiment for the past 20 years. However, most constructions of these bases make heavy use of abstract algebra and the mathematical theory of finite rings and fields, and no simple and generally accessible construction is available. This is particularly true in the case of a system composed of several qubits, which is arguably the most important case in quantum information science and quantum computation. In this paper, we close this gap by providing a simple and straightforward method for the construction of mutually unbiased bases in the case of a qubit register. We show that our construction is also accessible to experiments, since only Hadamard and controlled-phase gates are needed, which are available in most practical realizations of a quantum computer. Moreover, our scheme possesses the optimal scaling possible, i.e., the number of gates scales only linearly in the number of qubits.
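For the single-qubit case the defining property is easy to verify numerically: the computational basis, the Hadamard-rotated basis, and the basis obtained by a Hadamard followed by a phase gate are pairwise mutually unbiased. The sketch below checks this; it only illustrates the property and is not the paper's multi-qubit construction.

```python
# Numerical sanity check (single qubit, d = 2): the Z basis, the Hadamard-
# rotated (X) basis, and the basis from a Hadamard followed by the phase gate
# S = diag(1, i) (the Y basis) satisfy |<e_i|f_j>|^2 = 1/d for every pair of
# vectors drawn from two different bases.
import numpy as np
from itertools import combinations

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
S = np.diag([1, 1j])
I = np.eye(2)

bases = [I, H, S @ H]                  # columns of each matrix form one basis

for A, B in combinations(bases, 2):
    overlaps = np.abs(A.conj().T @ B) ** 2
    assert np.allclose(overlaps, 0.5), overlaps   # 1/d with d = 2

print("three mutually unbiased bases verified")
```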
Advanced computations in plasma physics
NASA Astrophysics Data System (ADS)
Tang, W. M.
2002-05-01
Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.
High-field Transport in Low Symmetry β-Ga2O3 Crystal
NASA Astrophysics Data System (ADS)
Ghosh, Krishnendu; Singisetti, Uttam
High-field carrier transport plays an important role in many disciplines of electronics. Conventional transport theories work well on high-symmetry materials but lack insight as the crystal symmetry goes down. Newly emerging materials, many of which possess low symmetry, demand more rigorous treatment of charge transport. We will present a comprehensive study of high-field transport using ab initio electron-phonon interaction (EPI) elements in a full-band Monte Carlo (FBMC) algorithm. We use monoclinic β-Ga2O3 as a benchmark low-symmetry material, which is also an emerging wide-bandgap semiconductor. β-Ga2O3 has a C2/m space group and a 10-atom primitive cell. In this work the EPIs are calculated under the density-functional perturbation theory framework. We will focus on the computational challenges arising from many phonon modes and low crystal symmetry. Significant insights will be presented on the details of energy relaxation by the hot electrons mediated by different phonon modes. We will also show the velocity-field curves of electrons in different crystal directions. The authors acknowledge the support from the National Science Foundation Grant (ECCS 1607833). The authors also acknowledge the computing support provided by the Center for Computational Research at the University at Buffalo.
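The flavor of a Monte Carlo transport calculation can be conveyed by a deliberately simplified single-valley sketch with a constant elastic scattering rate and a single optical phonon. All material parameters and the scattering model below are illustrative assumptions and bear no relation to the ab initio EPI treatment of β-Ga2O3 described above.

```python
# Toy single-valley Monte Carlo sketch (not the authors' full-band code): one
# electron in a parabolic band is accelerated by the field, scatters
# elastically at a constant rate (sign of momentum randomized), and emits an
# optical phonon (momentum reset) when, at the end of a free flight, its
# kinetic energy exceeds the phonon energy. Parameters are illustrative.
import numpy as np

q, hbar = 1.602e-19, 1.055e-34
m = 0.28 * 9.109e-31          # effective mass (illustrative)
E_op = 0.05 * q               # optical phonon energy (illustrative)
rate_el = 1e13                # elastic scattering rate, 1/s (illustrative)
rng = np.random.default_rng(1)

def drift_velocity(field, t_total=1e-9):
    p, t, s = 0.0, 0.0, 0.0                   # momentum, time, integral of v dt
    while t < t_total:
        dt = rng.exponential(1.0 / rate_el)   # free flight to next elastic event
        dt = min(dt, t_total - t)
        # displacement during the free flight under constant force q*field
        s += (p * dt + 0.5 * q * field * dt**2) / m
        p += q * field * dt
        t += dt
        if p**2 / (2 * m) > E_op:             # emit an optical phonon
            p = 0.0
        elif rng.random() < 0.5:              # elastic, isotropic in 1D: flip sign
            p = -p
    return s / t_total

for F in [1e5, 1e6, 5e6, 1e7]:                # field in V/m
    print(F, drift_velocity(F))               # drift velocity rolls off at high field
```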
Bounds on the power of proofs and advice in general physical theories.
Lee, Ciarán M; Hoban, Matty J
2016-06-01
Quantum theory presents us with the tools for computational and communication advantages over classical theory. One approach to uncovering the source of these advantages is to determine how computation and communication power vary as quantum theory is replaced by other operationally defined theories from a broad framework of such theories. Such investigations may reveal some of the key physical features required for powerful computation and communication. In this paper, we investigate how simple physical principles bound the power of two different computational paradigms which combine computation and communication in a non-trivial fashion: computation with advice and interactive proof systems. We show that the existence of non-trivial dynamics in a theory implies a bound on the power of computation with advice. Moreover, we provide an explicit example of a theory with no non-trivial dynamics in which the power of computation with advice is unbounded. Finally, we show that the power of simple interactive proof systems in theories where local measurements suffice for tomography is non-trivially bounded. This result provides a proof that QMA is contained in PP, one that does not make use of any uniquely quantum structure (such as the fact that observables correspond to self-adjoint operators) and thus may be of independent interest.
ERIC Educational Resources Information Center
Wang, Lin
2013-01-01
Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…
Using Program Theory-Driven Evaluation Science to Crack the Da Vinci Code
ERIC Educational Resources Information Center
Donaldson, Stewart I.
2005-01-01
Program theory-driven evaluation science uses substantive knowledge, as opposed to method proclivities, to guide program evaluations. It aspires to update, clarify, simplify, and make more accessible the evolving theory of evaluation practice commonly referred to as theory-driven or theory-based evaluation. The evaluator in this chapter provides a…
NASA Astrophysics Data System (ADS)
Wang, Qian; Gosik, Kirk; Xing, Sujuan; Jiang, Libo; Sun, Lidan; Chinchilli, Vernon M.; Wu, Rongling
2017-03-01
In its recent issue, Science published breaking global health news regarding the discovery of pronounced differences in fetal growth rate between countries, even when the mothers were all given optimal circumstances [1]. This discovery by a large team of obstetric and gynecological researchers from multiple countries [2] changes a previous view from a large project called INTERGROWTH-21st, which claimed that growth trajectories of unborn babies follow a similar form across countries [3]. This new finding provides evidence of the important role of genetic and epigenetic variants (differing between ethnic groups) in determining fetal growth, a process that is generally believed to be affected predominantly by nutrition and socioeconomic status [4].