Sample records for theoretical computer science

  1. Compact Information Representations

    DTIC Science & Technology

    2016-08-02

    Applied computer science and applied math. Within the scope of this proposal, the focus is primarily on the fundamental, theoretical research which lies in... Tung-Lung Wu, now Assistant Professor, Dept. of Math and Stat, Mississippi State Univ. Papers: in this section, we list the papers...

  2. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    NASA Astrophysics Data System (ADS)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  3. Teaching and Learning Methodologies Supported by ICT Applied in Computer Science

    ERIC Educational Resources Information Center

    Capacho, Jose

    2016-01-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…

  4. Theory-Guided Technology in Computer Science.

    ERIC Educational Resources Information Center

    Ben-Ari, Mordechai

    2001-01-01

    Examines the history of major achievements in computer science as portrayed by winners of the prestigious Turing award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. Discusses reasons why TGT is practical in computer science and…

  5. K-16 Computationally Rich Science Education: A Ten-Year Review of the "Journal of Science Education and Technology" (1998-2008)

    ERIC Educational Resources Information Center

    Wofford, Jennifer

    2009-01-01

    Computing is anticipated to have an increasingly expansive impact on the sciences overall, becoming the third, crucial component of a "golden triangle" that includes mathematics and experimental and theoretical science. However, even more so with computing than with math and science, we are not preparing our students for this new reality. It is…

  6. When Life and Learning Do Not Fit: Challenges of Workload and Communication in Introductory Computer Science Online

    ERIC Educational Resources Information Center

    Benda, Klara; Bruckman, Amy; Guzdial, Mark

    2012-01-01

    We present the results of an interview study investigating student experiences in two online introductory computer science courses. Our theoretical approach is situated at the intersection of two research traditions: "distance and adult education research," which tends to be sociologically oriented, and "computer science education…

  7. Finding and defining the natural automata acting in living plants: Toward the synthetic biology for robotics and informatics in vivo.

    PubMed

    Kawano, Tomonori; Bouteau, François; Mancuso, Stefano

    2012-11-01

    Automata theory is the mathematical study of abstract machines, a core topic in theoretical computer science and in highly interdisciplinary fields that combine the natural sciences with theoretical computer science. In the present review article, as the chemical and biological basis for natural computing or informatics, some plants, plant cells, or plant-derived molecules involved in signaling are listed and classified as natural sequential machines (namely, Mealy machines or Moore machines) or finite state automata. By defining the actions (states and transition functions) of these natural automata, the similarity between computational data processing and plant decision-making processes became obvious. Finally, their putative roles as parts for plant-based computing or robotic systems are discussed.
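
    This record treats plant signaling as Mealy machines, finite-state transducers whose output depends on the current state and the input symbol. As a hedged illustration, here is a minimal Mealy machine in Python; the "guard cell" states, inputs, and outputs are invented for this sketch and are not taken from the paper.

    ```python
    # Minimal Mealy machine: output depends on current state and input symbol.
    # States, inputs, and outputs here are illustrative, not from the paper.

    class MealyMachine:
        def __init__(self, transitions, outputs, start):
            self.transitions = transitions  # (state, input) -> next state
            self.outputs = outputs          # (state, input) -> output symbol
            self.state = start

        def step(self, symbol):
            out = self.outputs[(self.state, symbol)]
            self.state = self.transitions[(self.state, symbol)]
            return out

    # Hypothetical two-state "guard cell" automaton: open or close stomata
    # in response to light and drought signals.
    transitions = {
        ("open", "light"): "open",
        ("open", "drought"): "closed",
        ("closed", "light"): "open",
        ("closed", "drought"): "closed",
    }
    outputs = {
        ("open", "light"): "keep_open",
        ("open", "drought"): "close",
        ("closed", "light"): "reopen",
        ("closed", "drought"): "stay_closed",
    }

    m = MealyMachine(transitions, outputs, start="open")
    print([m.step(s) for s in ["light", "drought", "drought", "light"]])
    # ['keep_open', 'close', 'stay_closed', 'reopen']
    ```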

  8. Finding and defining the natural automata acting in living plants: Toward the synthetic biology for robotics and informatics in vivo

    PubMed Central

    Kawano, Tomonori; Bouteau, François; Mancuso, Stefano

    2012-01-01

    Automata theory is the mathematical study of abstract machines, a core topic in theoretical computer science and in highly interdisciplinary fields that combine the natural sciences with theoretical computer science. In the present review article, as the chemical and biological basis for natural computing or informatics, some plants, plant cells, or plant-derived molecules involved in signaling are listed and classified as natural sequential machines (namely, Mealy machines or Moore machines) or finite state automata. By defining the actions (states and transition functions) of these natural automata, the similarity between computational data processing and plant decision-making processes became obvious. Finally, their putative roles as parts for plant-based computing or robotic systems are discussed. PMID:23336016

  9. Computational Science in Armenia (Invited Talk)

    NASA Astrophysics Data System (ADS)

    Marandjian, H.; Shoukourian, Yu.

    This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general-form recursive equations, methods of coding theory, pattern recognition, and image processing) constitute the theoretical basis for developing problem-solving-oriented environments. Examples include a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, and a grid-aware web interface with advanced service trading for linear algebra calculations. Among scientific problems that require high-performance computing resources, completed projects include physics (parallel computing of complex quantum systems), astrophysics (the Armenian virtual laboratory), biology (a molecular dynamics study of the human red blood cell membrane), and meteorology (implementing and evaluating the Weather Research and Forecasting Model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure that unites the computing clusters of scientific and educational institutions of the country and provides the scientific community with access to local and international computational resources, a strong foundation for computational science in Armenia.

  10. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, and automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing, and scientific collaboration prompted further reflection in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS), and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF.

  11. Pedagogy for the Connected Science Classroom: Computer Supported Collaborative Science and the Next Generation Science Standards

    ERIC Educational Resources Information Center

    Foley, Brian J.; Reveles, John M.

    2014-01-01

    The prevalence of computers in the classroom is compelling teachers to develop new instructional skills. This paper provides a theoretical perspective on an innovative pedagogical approach to science teaching that takes advantage of technology to create a connected classroom. In the connected classroom, students collaborate and share ideas in…

  12. The Ulam Index: Methods of Theoretical Computer Science Help in Identifying Chemical Substances

    NASA Technical Reports Server (NTRS)

    Beltran, Adriana; Salvador, James

    1997-01-01

    In this paper, we show how methods developed for solving the theoretical computer science problem of graph isomorphism are used in structural chemistry. We also discuss potential applications of these methods to exobiology: the search for life outside Earth.
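
    Graph-isomorphism methods of the kind this record applies to structural chemistry typically rest on structural invariants. The sketch below shows a standard color-refinement (Weisfeiler-Lehman-style) invariant as a stand-in; it is not the authors' Ulam index, and equal invariants are necessary but not sufficient for isomorphism.

    ```python
    # Hedged sketch: a Weisfeiler-Lehman-style color-refinement invariant.
    # Equal invariants are necessary (not sufficient) for graph isomorphism;
    # this illustrates the general technique, not the Ulam index itself.

    def wl_invariant(adj, rounds=3):
        """adj: dict mapping node -> set of neighbors."""
        colors = {v: 1 for v in adj}  # start with a uniform coloring
        for _ in range(rounds):
            new = {}
            for v in adj:
                # A node's new color summarizes its old color and the
                # multiset of its neighbors' old colors.
                new[v] = hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
            colors = new
        return sorted(colors.values())

    # Path on 4 nodes vs. star on 4 nodes: same size, different structure.
    path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
    star = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
    print(wl_invariant(path) == wl_invariant(star))  # False
    ```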

  13. Theoretical computer science and the natural sciences

    NASA Astrophysics Data System (ADS)

    Marchal, Bruno

    2005-12-01

    I present some fundamental theorems in computer science and illustrate their relevance in biology and physics. I do not assume prerequisites in mathematics or computer science beyond the set N of natural numbers, functions from N to N, the use of some notational conveniences to describe functions, and, at some point, a minimal amount of linear algebra and logic. I start with Cantor's transcendental proof, by diagonalization, of the non-enumerability of the collection of functions from the natural numbers to the natural numbers. I explain why this proof is not entirely convincing and show how, by restricting the notion of function in terms of discrete, well-defined processes, we are led to the non-algorithmic enumerability of the computable functions, but also, through Church's thesis, to the algorithmic enumerability of the partial computable functions. Such a notion of function constitutes, with respect to our purpose, a crucial generalization of the concept. This makes it easy to justify deep and astonishing (counter-intuitive) incompleteness results about computers and similar machines. The modified Cantor diagonalization provides a theory of concrete self-reference, which I illustrate by pointing toward an elementary theory of self-reproduction (in the amoeba's way) and cellular self-regeneration (in the flatworm Planaria's way). To make this easier, I introduce a very simple and powerful formal system known as the Schoenfinkel-Curry combinators, and I use the combinators to illustrate the notions introduced above in a more concrete way. The combinators, thanks to their low-level, fine-grained design, also make possible a rough but hopefully illuminating description of the main lessons gained by the careful observation of nature, and of some new relations that should exist between computer science, the science of life, and the science of inert matter, once certain philosophical, if not theological, hypotheses are made in the cognitive sciences. In the last section, I come back to self-reference and give an exposition of its modal logics, which is used to show that theoretical computer science makes those “philosophical hypotheses” in theoretical cognitive science experimentally and mathematically testable.
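
    The diagonal argument at the heart of this abstract can be made concrete in a few lines. Assuming a toy enumeration f_i(n) = i * n standing in for any claimed complete list of total functions, the diagonal g(n) = f_n(n) + 1 escapes the enumeration, since it differs from f_n at input n.

    ```python
    # Cantor-style diagonalization: given any enumeration of total functions
    # N -> N, the diagonal function g(n) = f_n(n) + 1 escapes the list,
    # because g differs from f_n at input n.

    def enumeration(i):
        """A toy enumeration: f_i(n) = i * n (stands in for any claimed
        complete list of total functions)."""
        return lambda n: i * n

    def diagonal(n):
        return enumeration(n)(n) + 1

    # g differs from each f_i at the diagonal point i:
    for i in range(5):
        assert diagonal(i) != enumeration(i)(i)
    print([diagonal(n) for n in range(5)])  # [1, 2, 5, 10, 17]
    ```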

  14. Research in progress and other activities of the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  15. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  16. Computing and the social organization of academic work

    NASA Astrophysics Data System (ADS)

    Shields, Mark A.; Graves, William; Nyce, James M.

    1992-12-01

    This article discusses the academic computing movement during the 1980s. We focus on the Faculty Workstations Project at Brown University, where major computing initiatives were undertaken during the 1980s. Six departments are compared: chemistry, cognitive and linguistic sciences, geology, music, neural science, and sociology. We discuss the theoretical implications of our study for conceptualizing the relationship of computing to academic work.

  17. ICASE

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in the areas of (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving Langley facilities and scientists; and (4) computer science.

  18. Slime mould biotechnology

    NASA Astrophysics Data System (ADS)

    Mayne, Richard

    2015-03-01

    Slime mould computing is an inherently multi-disciplinary subfield of unconventional computing that draws upon aspects of not only theoretical computer science and electronics, but also the natural sciences. This chapter focuses on the biology of slime moulds and expounds the viewpoint that a deep, intuitive understanding of slime mould life processes is a fundamental requirement for understanding, and hence harnessing, the incredible behaviour patterns we may characterise as "computation"...

  19. Promoting Technology-Assisted Active Learning in Computer Science Education

    ERIC Educational Resources Information Center

    Gao, Jinzhu; Hargis, Jace

    2010-01-01

    This paper describes specific active learning strategies for teaching computer science, integrating both instructional technologies and non-technology-based strategies shown to be effective in the literature. The theoretical learning components addressed include an intentional method to help students build metacognitive abilities, as well as…

  20. Quantum Computation

    NASA Astrophysics Data System (ADS)

    Aharonov, Dorit

    In the last few years, theoretical study of quantum systems serving as computational devices has achieved tremendous progress. We now have strong theoretical evidence that quantum computers, if built, might be used as a dramatically powerful computational tool, capable of performing tasks which seem intractable for classical computers. This review tells the story of theoretical quantum computation. I left out the developing topic of experimental realizations of the model, and neglected other closely related topics, namely quantum information and quantum communication. As a result of narrowing the scope of this paper, I hope it has gained the benefit of being an almost self-contained introduction to the exciting field of quantum computation. The review begins with background on theoretical computer science, Turing machines and Boolean circuits. In light of these models, I define quantum computers, and discuss the issue of universal quantum gates. Quantum algorithms, including Shor's factorization algorithm and Grover's algorithm for searching databases, are explained. I will devote much attention to understanding what the origins of the quantum computational power are, and what the limits of this power are. Finally, I describe the recent theoretical results which show that quantum computers maintain their complexity power even in the presence of noise, inaccuracies and finite precision. This question cannot be separated from that of quantum complexity because any realistic model will inevitably be subjected to such inaccuracies. I tried to put all results in their context, asking what the implications to other issues in computer science and physics are. At the end of this review, I make these connections explicit by discussing the possible implications of quantum computation on fundamental physical questions such as the transition from quantum to classical physics.
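
    As a hedged companion to the circuit model the review builds on, here is a minimal state-vector simulation in Python/NumPy: a Hadamard gate creating a superposition, then a CNOT producing a Bell state. This illustrates gate composition only; it is not Shor's or Grover's algorithm.

    ```python
    # Minimal state-vector sketch of the quantum circuit model: apply a
    # Hadamard gate to |0> and read off measurement probabilities.
    import numpy as np

    ket0 = np.array([1.0, 0.0])                     # |0>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

    state = H @ ket0                                # (|0> + |1>) / sqrt(2)
    probs = np.abs(state) ** 2                      # Born rule
    print(probs)                                    # [0.5 0.5]

    # Two-qubit entanglement: H on qubit 0, then CNOT, yields a Bell state.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])
    bell = CNOT @ np.kron(state, ket0)
    print(np.round(bell, 3))                        # [0.707 0.    0.    0.707]
    ```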

  1. Statistical mechanics of complex neural systems and high dimensional data

    NASA Astrophysics Data System (ADS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-03-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
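
    Message passing in graphical models, one of the techniques this review connects to the cavity method, can be sketched in a few lines. The example below passes sum-product (forward-backward) messages along a chain of binary variables with an arbitrary illustrative coupling; by symmetry every marginal comes out uniform, which serves as a built-in correctness check.

    ```python
    # Hedged sketch of message passing (sum-product) on a chain of binary
    # variables. The coupling values are arbitrary illustrative numbers.
    import numpy as np

    # Pairwise potential psi(x_i, x_{i+1}), ferromagnetic-like coupling.
    psi = np.array([[2.0, 1.0],
                    [1.0, 2.0]])
    n = 5  # chain length

    # Forward and backward messages (uniform boundary messages).
    fwd = [np.ones(2)]
    for _ in range(n - 1):
        m = psi.T @ fwd[-1]
        fwd.append(m / m.sum())        # normalize for numerical stability
    bwd = [np.ones(2)]
    for _ in range(n - 1):
        m = psi @ bwd[-1]
        bwd.append(m / m.sum())
    bwd.reverse()

    # Single-site marginals: product of incoming messages, normalized.
    marginals = [f * b / (f * b).sum() for f, b in zip(fwd, bwd)]
    print(np.round(marginals, 3))  # uniform [0.5, 0.5] everywhere, by symmetry
    ```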

  2. Evaluating the Theoretic Adequacy and Applied Potential of Computational Models of the Spacing Effect.

    PubMed

    Walsh, Matthew M; Gluck, Kevin A; Gunzelmann, Glenn; Jastrzembski, Tiffany; Krusmark, Michael

    2018-06-01

    The spacing effect is among the most widely replicated empirical phenomena in the learning sciences, and its relevance to education and training is readily apparent. Yet successful applications of spacing effect research to education and training are rare. Computational modeling can provide the crucial link between a century of accumulated experimental data on the spacing effect and the emerging interest in using that research to enable adaptive instruction. In this paper, we review relevant literature and identify 10 criteria for rigorously evaluating computational models of the spacing effect. Five relate to evaluating the theoretic adequacy of a model, and five relate to evaluating its application potential. We use these criteria to evaluate a novel computational model of the spacing effect called the Predictive Performance Equation (PPE). PPE combines elements of earlier models of learning and memory, including the General Performance Equation, Adaptive Control of Thought-Rational, and the New Theory of Disuse, giving rise to a novel computational account of the spacing effect that performs favorably across the complete sets of theoretic and applied criteria. We implemented two other previously published computational models of the spacing effect and compared them to PPE using the theoretic and applied criteria as guides. Copyright © 2018 Cognitive Science Society, Inc.
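
    The model family evaluated here descends from power-law learning-and-forgetting equations such as the General Performance Equation. The sketch below is a deliberately simplified member of that family; its form and constants are assumptions for illustration, not the paper's PPE, but it reproduces the qualitative spacing effect at a long retention interval.

    ```python
    # Hedged sketch in the spirit of the General Performance Equation family:
    # memory strength is a sum of practice traces that decay as a power law
    # of their age. The form and constants are illustrative assumptions.

    def activation(practice_times, now, decay=0.5):
        """Sum of power-law-decayed traces, one per practice event."""
        return sum((now - t) ** (-decay) for t in practice_times if t < now)

    # Spaced vs. massed practice: three study events, recall tested at t=100.
    massed = [10, 11, 12]
    spaced = [10, 40, 70]
    print(activation(massed, now=100))  # ~0.32
    print(activation(spaced, now=100))  # ~0.42 (spacing wins at a long delay)
    ```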

  3. An Investigation of the Artifacts and Process of Constructing Computers Games about Environmental Science in a Fifth Grade Classroom

    ERIC Educational Resources Information Center

    Baytak, Ahmet; Land, Susan M.

    2011-01-01

    This study employed a case study design (Yin, "Case study research, design and methods," 2009) to investigate the processes used by 5th graders to design and develop computer games within the context of their environmental science unit, using the theoretical framework of "constructionism." Ten fifth graders designed computer games using "Scratch"…

  4. BIOCOMPUTATION: some history and prospects.

    PubMed

    Cull, Paul

    2013-06-01

    At first glance, biology and computer science are diametrically opposed sciences. Biology deals with carbon based life forms shaped by evolution and natural selection. Computer Science deals with electronic machines designed by engineers and guided by mathematical algorithms. In this brief paper, we review biologically inspired computing. We discuss several models of computation which have arisen from various biological studies. We show what these have in common, and conjecture how biology can still suggest answers and models for the next generation of computing problems. We discuss computation and argue that these biologically inspired models do not extend the theoretical limits on computation. We suggest that, in practice, biological models may give more succinct representations of various problems, and we mention a few cases in which biological models have proved useful. We also discuss the reciprocal impact of computer science on biology and cite a few significant contributions to biological science. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  5. Students' Explanations in Complex Learning of Disciplinary Programming

    ERIC Educational Resources Information Center

    Vieira, Camilo

    2016-01-01

    Computational Science and Engineering (CSE) has been called the third pillar of science and a set of important skills for solving the problems of a global society. Along with the theoretical and the experimental approaches, computation offers a third alternative to solve complex problems that require processing large amounts of data, or…

  6. In Praise of Numerical Computation

    NASA Astrophysics Data System (ADS)

    Yap, Chee K.

    Theoretical Computer Science has developed an almost exclusively discrete/algebraic persona. We have effectively shut ourselves off from half of the world of computing: a host of problems in Computational Science & Engineering (CS&E) are defined on the continuum, and, for them, the discrete viewpoint is inadequate. The computational techniques in such problems are well known in numerical analysis and applied mathematics, but are rarely discussed in theoretical algorithms: iteration, subdivision and approximation. By various case studies, I will indicate how our discrete/algebraic view of computing has many shortcomings in CS&E. We want to embrace the continuous/analytic view, but in a new synthesis with the discrete/algebraic view. I will suggest a pathway, by way of an exact numerical model of computation, that allows us to incorporate iteration and approximation into our algorithms' design. Some recent results give a peek into what this view of algorithmic development might look like, and its distinctive form suggests the name “numerical computational geometry” for such activities.
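
    A minimal instance of the iteration-and-subdivision paradigm advocated here is interval bisection, which returns a guaranteed enclosure of a root rather than a bare floating-point guess. A sketch, with an illustrative function and tolerance:

    ```python
    # Hedged sketch of subdivision + iteration: isolate a root of a
    # continuous function by interval bisection, an "exact numerical"
    # computation with a guaranteed error bound.

    def bisect_root(f, lo, hi, tol=1e-12):
        """Assumes f(lo) and f(hi) have opposite signs."""
        assert f(lo) * f(hi) < 0
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        return lo, hi  # an interval of width <= tol containing a root

    lo, hi = bisect_root(lambda x: x * x - 2, 1.0, 2.0)
    print(lo, hi)  # brackets sqrt(2) ~ 1.41421356...
    ```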

  7. Reducing Abstraction in High School Computer Science Education: The Case of Definition, Implementation, and Use of Abstract Data Types

    ERIC Educational Resources Information Center

    Sakhnini, Victoria; Hazzan, Orit

    2008-01-01

    The research presented in this article deals with the difficulties and mental processes involved in the definition, implementation, and use of abstract data types encountered by 12th grade advanced-level computer science students. Research findings are interpreted within the theoretical framework of "reducing abstraction" [Hazzan 1999]. The…

  8. Computational Materials Science | Materials Science | NREL

    Science.gov Websites

    Topics include water splitting and fuel cells, nanoparticles for thermal storage, new materials for high-capacity..., and theoretical methodologies for studying complex materials. Contact: Stephan Lany, Staff Scientist.

  9. Towards systemic theories in biological psychiatry.

    PubMed

    Bender, W; Albus, M; Möller, H-J; Tretter, F

    2006-02-01

    Although still rather controversial, empirical data on the neurobiology of schizophrenia have reached a degree of complexity that makes it hard to obtain a coherent picture of the malfunctions of the brain in schizophrenia. Theoretical neuropsychiatry should therefore use the tools of theoretical sciences like cybernetics, informatics, computational neuroscience or systems science. The methodology of systems science permits the modeling of complex dynamic nonlinear systems. Such procedures might help us to understand brain functions and the disorders and actions of psychiatric drugs better.

  10. Defining Computational Thinking for Mathematics and Science Classrooms

    NASA Astrophysics Data System (ADS)

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-02-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.

  11. Experiments in Computing: A Survey

    PubMed Central

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general. PMID:24688404

  12. Experiments in computing: a survey.

    PubMed

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  13. Topics in Computational Learning Theory and Graph Algorithms.

    ERIC Educational Resources Information Center

    Board, Raymond Acton

    This thesis addresses problems from two areas of theoretical computer science. The first area is that of computational learning theory, which is the study of the phenomenon of concept learning using formal mathematical models. The goal of computational learning theory is to investigate learning in a rigorous manner through the use of techniques…

  14. Linguistics, Cognitive Science and the Undergraduate Curriculum. Linguistics in the Undergraduate Curriculum, Appendix 4-I.

    ERIC Educational Resources Information Center

    Feinstein, Mark; Stillings, Neil

    Cognitive science has recently emerged as a new interdisciplinary field incorporating parts of psychology, computer science, philosophy, neuroscience, and linguistics. Its goal is to bring the theoretical and methodological resources of the contributing disciplines to bear on an integrated investigation of thought, meaning, language, perception,…

  15. How the Theory of Computing Can Help in Space Exploration

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik; Longpre, Luc

    1997-01-01

    The opening of the NASA Pan American Center for Environmental and Earth Sciences (PACES) at the University of Texas at El Paso made it possible to organize the student Center for Theoretical Research and its Applications in Computer Science (TRACS). In this abstract, we briefly describe the main NASA-related research directions of the TRACS center, and give an overview of the preliminary results of student research.

  16. Computational psychiatry

    PubMed Central

    Montague, P. Read; Dolan, Raymond J.; Friston, Karl J.; Dayan, Peter

    2013-01-01

    Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects. PMID:22177032
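
    A reinforcement-learning building block commonly used in this line of work is the delta rule, in which a prediction error drives the update of an expected value. A hedged sketch; the learning rate and reward stream are illustrative:

    ```python
    # Hedged sketch of a delta-rule (Rescorla-Wagner-style) value update,
    # the kind of reinforcement-learning primitive computational psychiatry
    # fits to decision-making data. Parameters are illustrative.

    def delta_rule(rewards, alpha=0.1, v0=0.0):
        """Track expected reward V via prediction errors."""
        v, trace = v0, []
        for r in rewards:
            v += alpha * (r - v)   # prediction error drives learning
            trace.append(v)
        return trace

    # The value estimate climbs toward the true reward rate of 1.0:
    print([round(v, 3) for v in delta_rule([1] * 10)])
    ```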

  17. The Real-World Connection.

    ERIC Educational Resources Information Center

    Estes, Charles R.

    1994-01-01

    Discusses theoretical versus applied science and the use of the scientific method for analysis of social issues. Topics addressed include the use of simulation and modeling; the growth in computer power, including nanotechnology; distributed computing; self-evolving programs; spiritual matters; human engineering, i.e., molding individuals;…

  18. Asking Research Questions: Theoretical Presuppositions

    ERIC Educational Resources Information Center

    Tenenberg, Josh

    2014-01-01

    Asking significant research questions is a crucial aspect of building a research foundation in computer science (CS) education. In this article, I argue that the questions that we ask are shaped by internalized theoretical presuppositions about how the social and behavioral worlds operate. And although such presuppositions are essential in making…

  19. Cumulative reports and publications

    NASA Technical Reports Server (NTRS)

    1993-01-01

    A complete list of Institute for Computer Applications in Science and Engineering (ICASE) reports is given. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when it is available. The major categories of the current ICASE research program are: applied and numerical mathematics, including numerical analysis and algorithm development; theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and computer science.

  20. Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level

    NASA Astrophysics Data System (ADS)

    Christiansen, Henning

    2004-09-01

    Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without one. An effective methodology is described which has been developed for a target group of university students with different backgrounds, such as natural science or the humanities. It was developed for a course that integrates theoretical material on computer languages and abstract machines with practical programming techniques. Prolog, used as a meta-language for describing language issues, is the central instrument in the approach: formal descriptions become running prototypes that are easy and appealing to test and modify, and can be extended into analyzers, interpreters, and tools such as tracers and debuggers. Experience shows a high learning curve, especially when the principles are extended into a learning-by-doing approach in which students develop such descriptions themselves from an informal introduction.

  1. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round-table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA, and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch. Dr Liliana Teodorescu, Brunel University. The PDF also contains details of the workshop's committees and sponsors.

  2. Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostadin, Damevski

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit the unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  3. Michael Bahl | NREL

    Science.gov Websites

    ...theoretical mathematics, Western New England University, Springfield, MA; B.S. in computer science, University... Prior work experience: Founder, Mobile Makes Sense; Senior IT Specialist, IBM; Graduate Teaching Assistant.

  4. An Enduring Dialogue between Computational and Empirical Vision.

    PubMed

    Martinez-Conde, Susana; Macknik, Stephen L; Heeger, David J

    2018-04-01

    In the late 1970s, key discoveries in neurophysiology, psychophysics, computer vision, and image processing had reached a tipping point that would shape visual science for decades to come. David Marr and Ellen Hildreth's 'Theory of edge detection', published in 1980, set out to integrate the newly available wealth of data from behavioral, physiological, and computational approaches in a unifying theory. Although their work had wide and enduring ramifications, their most important contribution may have been to consolidate the foundations of the ongoing dialogue between theoretical and empirical vision science. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. New Theoretical Frameworks for Machine Learning

    DTIC Science & Technology

    2008-09-15

    [52] G.M. Benedek and A. Itai. Learnability by fixed distributions. In Proc. 1st Workshop on Computational Learning Theory, pages 80-90, 1988. [53] G.M. Benedek and A. Itai. Learnability with respect to a fixed distribution. Theoretical Computer Science, 86:377-389, 1991.

  6. Using Intelligent Tutoring Design Principles To Integrate Cognitive Theory into Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Orey, Michael A.; Nelson, Wayne A.

    Arguing that the evolution of intelligent tutoring systems better reflects the recent theoretical developments of cognitive science than traditional computer-based instruction (CBI), this paper describes a general model for an intelligent tutoring system and suggests ways to improve CBI using design principles derived from research in cognitive…

  7. Opportunities for Research on the Organizational Impact of School Computers. Technical-Report-No. 7.

    ERIC Educational Resources Information Center

    Newman, Denis

    As computers are acquired in greater numbers in schools, their impact on the social organization of instruction increasingly becomes an issue for research. Developments in the cognitive science of instruction, drawing on sociohistorical theory, provide researchers with an appropriate theoretical approach to cultural tools and cognitive change,…

  8. Computer Simulation of Compression and Energy Release upon Laser Irradiation of Cylindrically Symmetric Target

    NASA Astrophysics Data System (ADS)

    Kuzenov, V. V.

    2017-12-01

    The paper is devoted to the theoretical and computational study of compression and energy release for magneto-inertial plasma confinement. This approach makes it possible to create new high-density plasma sources, apply them in materials science experiments, and use them in promising areas of power engineering.

  9. Design and Development of a Web-Based Interactive Software Tool for Teaching Operating Systems

    ERIC Educational Resources Information Center

    Garmpis, Aristogiannis

    2011-01-01

    Operating Systems (OS) is an important and mandatory discipline in many Computer Science, Information Systems and Computer Engineering curricula. Some of its topics require a careful and detailed explanation from the instructor as they often involve theoretical concepts and somewhat complex mechanisms, demanding a certain degree of abstraction…

  10. Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level

    ERIC Educational Resources Information Center

    Christiansen, Henning

    2004-01-01

    Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without this. An effective methodology is described which has been developed for a target group of university students with different backgrounds such as natural…

  11. Theory, Modeling, Software and Hardware Development for Analytical and Computational Materials Science

    NASA Technical Reports Server (NTRS)

    Young, Gerald W.; Clemons, Curtis B.

    2004-01-01

    The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. Toward these means, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.

  12. Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition

    NASA Astrophysics Data System (ADS)

    Fitch, W. Tecumseh

    2014-09-01

    Progress in understanding cognition requires a quantitative, theoretical framework, grounded in the other natural sciences and able to bridge between implementational, algorithmic and computational levels of explanation. I review recent results in neuroscience and cognitive biology that, when combined, provide key components of such an improved conceptual framework for contemporary cognitive science. Starting at the neuronal level, I first discuss the contemporary realization that single neurons are powerful tree-shaped computers, which implies a reorientation of computational models of learning and plasticity to a lower, cellular, level. I then turn to predictive systems theory (predictive coding and prediction-based learning) which provides a powerful formal framework for understanding brain function at a more global level. Although most formal models concerning predictive coding are framed in associationist terms, I argue that modern data necessitate a reinterpretation of such models in cognitive terms: as model-based predictive systems. Finally, I review the role of the theory of computation and formal language theory in the recent explosion of comparative biological research attempting to isolate and explore how different species differ in their cognitive capacities. Experiments to date strongly suggest that there is an important difference between humans and most other species, best characterized cognitively as a propensity by our species to infer tree structures from sequential data. Computationally, this capacity entails generative capacities above the regular (finite-state) level; implementationally, it requires some neural equivalent of a push-down stack. I dub this unusual human propensity "dendrophilia", and make a number of concrete suggestions about how such a system may be implemented in the human brain, about how and why it evolved, and what this implies for models of language acquisition. I conclude that, although much remains to be done, a neurally-grounded framework for theoretical cognitive science is within reach that can move beyond polarized debates and provide a more adequate theoretical future for cognitive biology.
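
    The regular versus supra-regular contrast in this abstract is often probed with string sets such as (AB)^n, which a finite-state machine accepts, and A^n B^n, which requires the equivalent of a push-down stack. A hedged sketch of both recognizers, reducing the stack to a counter, which suffices for this language:

    ```python
    # Hedged sketch of the regular vs. supra-regular contrast: (AB)^n is
    # recognizable by a finite-state machine, while A^n B^n needs the
    # equivalent of a push-down stack (here, a counter).

    def accepts_abn(s):
        """(AB)^n: two-state finite automaton, no memory beyond the state."""
        state = 0
        for ch in s:
            if state == 0 and ch == "A":
                state = 1
            elif state == 1 and ch == "B":
                state = 0
            else:
                return False
        return state == 0

    def accepts_anbn(s):
        """A^n B^n: uses an unbounded counter, a stack reduced to its depth."""
        count, seen_b = 0, False
        for ch in s:
            if ch == "A":
                if seen_b:
                    return False
                count += 1
            elif ch == "B":
                seen_b = True
                count -= 1
                if count < 0:
                    return False
            else:
                return False
        return count == 0

    print(accepts_abn("ABAB"), accepts_anbn("ABAB"))  # True False
    print(accepts_abn("AABB"), accepts_anbn("AABB"))  # False True
    ```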

  13. Toward a computational framework for cognitive biology: unifying approaches from cognitive neuroscience and comparative cognition.

    PubMed

    Fitch, W Tecumseh

    2014-09-01

    Progress in understanding cognition requires a quantitative, theoretical framework, grounded in the other natural sciences and able to bridge between implementational, algorithmic and computational levels of explanation. I review recent results in neuroscience and cognitive biology that, when combined, provide key components of such an improved conceptual framework for contemporary cognitive science. Starting at the neuronal level, I first discuss the contemporary realization that single neurons are powerful tree-shaped computers, which implies a reorientation of computational models of learning and plasticity to a lower, cellular, level. I then turn to predictive systems theory (predictive coding and prediction-based learning) which provides a powerful formal framework for understanding brain function at a more global level. Although most formal models concerning predictive coding are framed in associationist terms, I argue that modern data necessitate a reinterpretation of such models in cognitive terms: as model-based predictive systems. Finally, I review the role of the theory of computation and formal language theory in the recent explosion of comparative biological research attempting to isolate and explore how different species differ in their cognitive capacities. Experiments to date strongly suggest that there is an important difference between humans and most other species, best characterized cognitively as a propensity by our species to infer tree structures from sequential data. Computationally, this capacity entails generative capacities above the regular (finite-state) level; implementationally, it requires some neural equivalent of a push-down stack. I dub this unusual human propensity "dendrophilia", and make a number of concrete suggestions about how such a system may be implemented in the human brain, about how and why it evolved, and what this implies for models of language acquisition. I conclude that, although much remains to be done, a neurally-grounded framework for theoretical cognitive science is within reach that can move beyond polarized debates and provide a more adequate theoretical future for cognitive biology. Copyright © 2014. Published by Elsevier B.V.

  14. Non-parallel processing: Gendered attrition in academic computer science

    NASA Astrophysics Data System (ADS)

    Cohoon, Joanne Louise Mcgrath

    2000-10-01

    This dissertation addresses the issue of disproportionate female attrition from computer science as an instance of gender segregation in higher education. By adopting a theoretical framework from organizational sociology, it demonstrates that the characteristics and processes of computer science departments strongly influence female retention. The empirical data identifies conditions under which women are retained in the computer science major at comparable rates to men. The research for this dissertation began with interviews of students, faculty, and chairpersons from five computer science departments. These exploratory interviews led to a survey of faculty and chairpersons at computer science and biology departments in Virginia. The data from these surveys are used in comparisons of the computer science and biology disciplines, and for statistical analyses that identify which departmental characteristics promote equal attrition for male and female undergraduates in computer science. This three-pronged methodological approach of interviews, discipline comparisons, and statistical analyses shows that departmental variation in gendered attrition rates can be explained largely by access to opportunity, relative numbers, and other characteristics of the learning environment. Using these concepts, this research identifies nine factors that affect the differential attrition of women from CS departments. These factors are: (1) The gender composition of enrolled students and faculty; (2) Faculty turnover; (3) Institutional support for the department; (4) Preferential attitudes toward female students; (5) Mentoring and supervising by faculty; (6) The local job market, starting salaries, and competitiveness of graduates; (7) Emphasis on teaching; and (8) Joint efforts for student success. This work contributes to our understanding of the gender segregation process in higher education. In addition, it contributes information that can lead to effective solutions for an economically significant issue in modern American society---gender equality in computer science.

  15. A Decomposition Theorem for Finite Automata.

    ERIC Educational Resources Information Center

    Santa Coloma, Teresa L.; Tucci, Ralph P.

    1990-01-01

    Described is automata theory, which is a branch of theoretical computer science. A decomposition theorem is presented that is easier than the Krohn-Rhodes theorem. Included are the definitions, the theorem, and a proof. (KR)
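
    In the spirit of such decomposition results, though far simpler than the Krohn-Rhodes theorem itself, a mod-4 counter can be realized as a cascade of two mod-2 machines, with the second stage driven by the first stage's carry. A sketch with an invented pulse stream:

    ```python
    # Hedged illustration of automaton decomposition: a mod-4 counter
    # realized as a cascade of two mod-2 counters, where the second stage
    # sees the first stage's carry.

    def mod4_direct(pulses):
        return sum(pulses) % 4

    def mod4_cascade(pulses):
        low, high = 0, 0           # two mod-2 component machines
        for p in pulses:
            carry = low & p        # stage 1's carry drives stage 2
            low ^= p               # stage 1: mod-2 counter on the raw input
            high ^= carry          # stage 2: mod-2 counter on the carries
        return 2 * high + low

    pulses = [1, 1, 1, 0, 1, 1, 1]
    print(mod4_direct(pulses), mod4_cascade(pulses))  # 2 2
    ```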

  16. Solid State Cooling with Advanced Oxide Materials

    DTIC Science & Technology

    2014-06-03

    Department of Materials Science and Engineering, Department of Mechanical Science and Engineering, and Department of Electrical and Computer Engineering, University of Illinois, Urbana-Champaign. Program overview: the focus of this program was to probe electro-(magneto-)caloric materials for... engineering systems by developing theoretical and experimental approaches to study thermodynamic properties and effects in thin film systems. Despite...

  17. Extending Landauer's bound from bit erasure to arbitrary computation

    NASA Astrophysics Data System (ADS)

    Wolpert, David

    The minimal thermodynamic work required to erase a bit, known as Landauer's bound, has been extensively investigated both theoretically and experimentally. However, when viewed as a computation that maps inputs to outputs, bit erasure has a very special property: the output does not depend on the input. Existing analyses of thermodynamics of bit erasure implicitly exploit this property, and thus cannot be directly extended to analyze the computation of arbitrary input-output maps. Here we show how to extend these earlier analyses of bit erasure to analyze the thermodynamics of arbitrary computations. Doing this establishes a formal connection between the thermodynamics of computers and much of theoretical computer science. We use this extension to analyze the thermodynamics of the canonical ``general purpose computer'' considered in computer science theory: a universal Turing machine (UTM). We consider a UTM which maps input programs to output strings, where inputs are drawn from an ensemble of random binary sequences, and prove: i) The minimal work needed by a UTM to run some particular input program X and produce output Y is the Kolmogorov complexity of Y minus the log of the ``algorithmic probability'' of Y. This minimal amount of thermodynamic work has a finite upper bound, which is independent of the output Y, depending only on the details of the UTM. ii) The expected work needed by a UTM to compute some given output Y is infinite. As a corollary, the overall expected work to run a UTM is infinite. iii) The expected work needed by an arbitrary Turing machine T (not necessarily universal) to compute some given output Y can either be infinite or finite, depending on Y and the details of T. To derive these results we must combine ideas from nonequilibrium statistical physics with fundamental results from computer science, such as Levin's coding theorem and other theorems about universal computation. I would like to acknowledge the Santa Fe Institute, Grant No. TWCF0079/AB47 from the Templeton World Charity Foundation, Grant No. FQXi-RHl3-1349 from the FQXi foundation, and Grant No. CHE-1648973 from the U.S. National Science Foundation.
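
    Result (i) can be restated schematically in LaTeX. The prefactor and the suppression of additive constants are assumptions made here for illustration; the precise statement, including its conditioning, is in the paper.

    ```latex
    % Schematic form of result (i): minimal work for a UTM U to map
    % program X to output Y (additive O(1) terms suppressed).
    \[
      W_{\min}(X \to Y) \;\approx\; k_B T \ln 2 \,\bigl[\, K_U(Y) - \log_2 m_U(Y) \,\bigr]
    \]
    % K_U(Y): Kolmogorov complexity of Y on U;
    % m_U(Y): algorithmic probability of Y, i.e. the total weight 2^{-|p|}
    % of programs p that make U output Y.
    ```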

  18. CPE--A New Perspective: The Impact of the Technology Revolution. Proceedings of the Computer Performance Evaluation Users Group Meeting (19th, San Francisco, California, October 25-28, 1983). Final Report. Reports on Computer Science and Technology.

    ERIC Educational Resources Information Center

    Mobray, Deborah, Ed.

    Papers on local area networks (LANs), modelling techniques, software improvement, capacity planning, software engineering, microcomputers and end user computing, cost accounting and chargeback, configuration and performance management, and benchmarking presented at this conference include: (1) "Theoretical Performance Analysis of Virtual…

  19. Higher Inductive Types as Homotopy-Initial Algebras

    DTIC Science & Technology

    2016-08-01

    Higher Inductive Types as Homotopy-Initial Algebras. Kristina Sojakova, CMU-CS-16-125, August 2016, School of Computer Science, Carnegie Mellon University... talk at the Workshop on Logic, Language, Information and Computation (WoLLIC 2011). [38] M. Warren. Homotopy-Theoretic Aspects of Constructive Type Theory. PhD thesis, Carnegie Mellon University, 2008.

  20. Project: semi-autonomous parking for enhanced safety and efficiency.

    DOT National Transportation Integrated Search

    2016-04-01

    Index coding, a coding formulation traditionally analyzed in the theoretical computer science and : information theory communities, has received considerable attention in recent years due to its value in : wireless communications and networking probl...

  1. Need Assessment of Computer Science and Engineering Graduates

    NASA Astrophysics Data System (ADS)

    Surakka, Sami; Malmi, Lauri

    2005-06-01

    This case study considered the syllabus of the first- and second-year studies in computer science. The aim of the study was to reveal which topics covered in the syllabi were really needed during the following years of study or in working life. The program that was assessed in the study was a Master's program in computer science and engineering at a university of technology in Finland. The necessity of different subjects for the advanced studies (years 3-5) and for working life was assessed using four content analyses: (a) the course catalog of the institution where this study was carried out, (b) employment reports that were attached to the applications for internship credits, (c) master's theses, and (d) job advertisements in a newspaper. The results of the study imply that the necessity of physics for the advanced study and work was very low compared to the extent to which it was studied. On the other hand, the necessity for mathematics was moderate, and it had remained quite steady during the period 1989-2002. The most necessary computer science topic was programming. Telecommunications and networking were also needed often, whereas theoretical computer science was needed quite rarely.

  2. Phase Transitions in Combinatorial Optimization Problems: Basics, Algorithms and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Weigt, Martin

    2005-10-01

    A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.
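
    For readers new to the book's running example: vertex cover asks for a smallest set of vertices touching every edge of a graph. The sketch below is the textbook greedy 2-approximation (standard material, not code from the book):

    ```python
    # Greedy 2-approximation for minimum vertex cover: take both endpoints of
    # any uncovered edge. The result is at most twice the optimal cover size;
    # finding the exact minimum is NP-hard, which is what makes the problem a
    # natural testbed for statistical-physics methods.
    def vertex_cover_2approx(edges):
        cover = set()
        for u, v in edges:
            if u not in cover and v not in cover:
                cover.update((u, v))
        return cover

    # A 4-cycle: the greedy cover {1, 2, 3, 4} is twice the optimum {1, 3}.
    print(vertex_cover_2approx([(1, 2), (2, 3), (3, 4), (4, 1)]))
    ```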

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Weitao

    This Special Topic Issue on the Advances in Density Functional Theory, published as a celebration of the fifty years of density functional theory, contains a retrospective article, a perspective article, and a collection of original research articles that showcase recent theoretical advances in the field. It provides a timely discussion reflecting a cross section of our understanding, and the theoretical and computational developments, which have significant implications in broad areas of sciences and engineering.

  4. Aeronautical engineering: A continuing bibliography with indexes (supplement 316)

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This bibliography lists 413 reports, articles, and other documents introduced into the NASA scientific and technical information system in April 1995. Subject coverage includes: aeronautics; mathematical and computer sciences; chemistry and material sciences; geosciences; design, construction and testing of aircraft and aircraft engines; aircraft components, equipment, and systems; ground support systems; and theoretical and applied aspects of aerodynamics and general fluid dynamics.

  5. Mathematical challenges from theoretical/computational chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-12-31

    The committee believes that this report has relevance and potentially valuable suggestions for a wide range of readers. Target audiences include: graduate departments in the mathematical and chemical sciences; federal and private agencies that fund research in the mathematical and chemical sciences; selected industrial and government research and development laboratories; developers of software and hardware for computational chemistry; and selected individual researchers. Chapter 2 of this report covers some history of computational chemistry for the nonspecialist, while Chapter 3 illustrates the fruits of some past successful cross-fertilization between mathematical scientists and computational/theoretical chemists. In Chapter 4 the committee has assembled a representative, but not exhaustive, survey of research opportunities. Most of these are descriptions of important open problems in computational/theoretical chemistry that could gain much from the efforts of innovative mathematical scientists, written so as to be accessible introductions to the nonspecialist. Chapter 5 is an assessment, necessarily subjective, of cultural differences that must be overcome if collaborative work is to be encouraged between the mathematical and the chemical communities. Finally, the report ends with a brief list of conclusions and recommendations that, if followed, could promote accelerated progress at this interface. Recognizing that bothersome language issues can inhibit prospects for collaborative research at the interface between distinctive disciplines, the committee has attempted throughout to maintain an accessible style, in part by using illustrative boxes, and has included at the end of the report a glossary of technical terms that may be familiar to only a subset of the target audiences listed above.

  6. Computational and theoretical modeling of pH and flow effects on the early-stage non-equilibrium self-assembly of optoelectronic peptides

    NASA Astrophysics Data System (ADS)

    Mansbach, Rachael; Ferguson, Andrew

    Self-assembling π-conjugated peptides are attractive candidates for the fabrication of bioelectronic materials possessing optoelectronic properties due to electron delocalization over the conjugated peptide groups. We present a computational and theoretical study of an experimentally realized optoelectronic peptide that displays triggerable assembly at low pH to resolve the microscopic effects of flow and pH on the non-equilibrium morphology and kinetics of assembly. Using a combination of molecular dynamics simulations and hydrodynamic modeling, we quantify the time and length scales at which convective flows employed in directed assembly compete with microscopic diffusion to influence assembly. We also show that there is a critical pH below which aggregation proceeds irreversibly, and quantify the relationship between pH, charge density, and aggregate size. Our work provides new fundamental understanding of pH and flow effects on non-equilibrium π-conjugated peptide assembly, and lays the groundwork for the rational manipulation of environmental conditions and peptide chemistry to control assembly and the attendant emergent optoelectronic properties. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, under Award # DE-SC0011847, and by the Computational Science and Engineering Fellowship from the University of Illinois at Urbana-Champaign.

  7. Quantum computation for solving linear systems

    NASA Astrophysics Data System (ADS)

    Cao, Yudong

    Quantum computation is a subject born out of the combination of physics and computer science. It studies how the laws of quantum mechanics can be exploited to perform computations much more efficiently than current computers (termed classical computers as opposed to quantum computers). The thesis starts by introducing ideas from quantum physics and theoretical computer science and, based on these ideas, introduces the basic concepts in quantum computing. These introductory discussions are intended for non-specialists to obtain the essential knowledge needed for understanding the new results presented in the subsequent chapters. After introducing the basics of quantum computing, we focus on the recently proposed quantum algorithm for linear systems. The new results include i) special instances of quantum circuits that can be implemented using current experimental resources; ii) detailed quantum algorithms that are suitable for a broader class of linear systems. We show that for some particular problems the quantum algorithm is able to achieve exponential speedup over its classical counterparts.
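
    The "recently proposed quantum algorithm for linear systems" is presumably the Harrow-Hassidim-Lloyd (HHL) algorithm; its commonly quoted asymptotic cost is stated below for orientation (as usually cited for the original HHL analysis, not taken from the thesis, whose variants may differ):

    ```latex
    % Often-quoted running time of the original HHL analysis for solving
    % A|x> = |b> with an s-sparse N x N Hermitian A of condition number kappa,
    % to precision epsilon; polylogarithmic in N, unlike classical solvers:
    \tilde{O}\!\left( \log(N)\, s^{2} \kappa^{2} / \epsilon \right)
    ```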

  8. Sandia National Laboratories: Research: Materials Science

    Science.gov Websites

    Our research uses Sandia's experimental, theoretical, and computational capabilities to...

  9. EDITORIAL: Computational materials science Computational materials science

    NASA Astrophysics Data System (ADS)

    Kahl, Gerhard; Kresse, Georg

    2011-10-01

    Special issue in honour of Jürgen Hafner. On 30 September 2010, Jürgen Hafner, one of the most prominent and influential members of the solid state community, retired. His remarkably broad scientific oeuvre has made him one of the founding fathers of modern computational materials science: more than 600 scientific publications, numerous contributions to books, and a highly cited monograph, which has become a standard reference in the theory of metals, witness not only the remarkable productivity of Jürgen Hafner but also his impact in theoretical solid state physics. In an effort to duly acknowledge Jürgen Hafner's lasting impact in this field, a Festsymposium was held on 27-29 September 2010 at the Universität Wien. The organizers of this symposium (and authors of this editorial) are proud to say that a large number of highly renowned scientists in theoretical condensed matter physics (co-workers, friends and students) accepted the invitation to this celebration of Hafner's jubilee. Some of these speakers also followed our invitation to submit their contribution to this Festschrift, published in Journal of Physics: Condensed Matter, a journal which Jürgen Hafner served as a member of the Advisory Editorial Board (2000-2003) and of the Executive Board (2003-2006). In the subsequent article, Volker Heine, friend and co-worker of Jürgen Hafner over many decades, gives an account of Hafner's impact in the field of theoretical condensed matter physics. Computational materials science contents:
    Theoretical study of structural, mechanical and spectroscopic properties of boehmite (γ-AlOOH), by D Tunega, H Pašalić, M H Gerzabek and H Lischka
    Ethylene epoxidation catalyzed by chlorine-promoted silver oxide, by M O Ozbek, I Onal and R A Van Santen
    First-principles study of Cu2ZnSnS4 and the related band offsets for photovoltaic applications, by A Nagoya, R Asahi and G Kresse
    Renormalization group study of random quantum magnets, by István A Kovács and Ferenc Iglói
    Ordering effects in disordered systems: the Au-Si system, by N Jakse, T L T Nguyen and A Pasturel
    On the stability of Archimedean tilings formed by patchy particles, by Moritz Antlanger, Günther Doppelbauer and Gerhard Kahl

  10. Computing exponentially faster: implementing a non-deterministic universal Turing machine using DNA

    PubMed Central

    Currin, Andrew; Korovin, Konstantin; Ababi, Maria; Roper, Katherine; Kell, Douglas B.; Day, Philip J.

    2017-01-01

    The theory of computer science is based around universal Turing machines (UTMs): abstract machines able to execute all possible algorithms. Modern digital computers are physical embodiments of classical UTMs. For the most important class of problems in computer science, non-deterministic polynomial complete problems, non-deterministic UTMs (NUTMs) are theoretically exponentially faster than both classical UTMs and quantum mechanical UTMs (QUTMs). However, no attempt has previously been made to build an NUTM, and their construction has been regarded as impossible. Here, we demonstrate the first physical design of an NUTM. This design is based on Thue string rewriting systems, and thereby avoids the limitations of most previous DNA computing schemes: all the computation is local (simple edits to strings) so there is no need for communication, and there is no need to order operations. The design exploits DNA's ability to replicate to execute an exponential number of computational paths in P time. Each Thue rewriting step is embodied in a DNA edit implemented using a novel combination of polymerase chain reactions and site-directed mutagenesis. We demonstrate that the design works using both computational modelling and in vitro molecular biology experimentation: the design is thermodynamically favourable, microprogramming can be used to encode arbitrary Thue rules, all classes of Thue rule can be implemented, and non-deterministic rule implementation is achieved. In an NUTM, the resource limitation is space, which contrasts with classical UTMs and QUTMs, where it is time. This fundamental difference enables an NUTM to trade space for time, which is significant for both theoretical computer science and physics. It is also of practical importance, for, to quote Richard Feynman, ‘there's plenty of room at the bottom’. This means that a desktop DNA NUTM could potentially utilize more processors than all the electronic computers in the world combined, and thereby outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy. PMID:28250099
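
    For intuition about the branching the design exploits: a Thue system rewrites substrings by symmetric rules, and a non-deterministic machine pursues every applicable rewrite at once. The toy sketch below is our illustration of a single non-deterministic step, not the paper's DNA implementation:

    ```python
    # One non-deterministic step of a Thue (string-rewriting) system: apply
    # every rule at every matching position. An NUTM explores all branches in
    # parallel (in DNA, via replication); a classical simulation enumerates them.
    def thue_step(word, rules):
        successors = set()
        for lhs, rhs in rules:
            start = word.find(lhs)
            while start != -1:
                successors.add(word[:start] + rhs + word[start + len(lhs):])
                start = word.find(lhs, start + 1)
        return successors

    rules = [("ab", "ba"), ("ba", "ab")]       # a toy symmetric rule set
    print(thue_step("abab", rules))            # {'baab', 'abba', 'aabb'}
    ```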

  11. XXV IUPAP Conference on Computational Physics (CCP2013): Preface

    NASA Astrophysics Data System (ADS)

    2014-05-01

    XXV IUPAP Conference on Computational Physics (CCP2013) was held from 20-24 August 2013 at the Russian Academy of Sciences in Moscow, Russia. The annual Conferences on Computational Physics (CCP) present an overview of the most recent developments and opportunities in computational physics across a broad range of topical areas. The CCP series aims to draw computational scientists from around the world and to stimulate interdisciplinary discussion and collaboration by putting together researchers interested in various fields of computational science. It is organized under the auspices of the International Union of Pure and Applied Physics and has been in existence since 1989. The CCP series alternates between Europe, America and Asia-Pacific. The conferences are traditionally supported by the European Physical Society and the American Physical Society. This year the Conference host was the Landau Institute for Theoretical Physics. The Conference contained 142 presentations, and, in particular, 11 plenary talks with comprehensive reviews on topics ranging from airbursts to many-electron systems. We would like to take this opportunity to thank our sponsors: International Union of Pure and Applied Physics (IUPAP), European Physical Society (EPS), Division of Computational Physics of American Physical Society (DCOMP/APS), Russian Foundation for Basic Research, Department of Physical Sciences of Russian Academy of Sciences, RSC Group company. Further conference information and images from the conference are available in the pdf.

  12. Aeronautical engineering: A continuing bibliography with indexes (supplement 267)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This bibliography lists 661 reports, articles, and other documents introduced into the NASA scientific and technical information system in June, 1991. Subject coverage includes design, construction and testing of aircraft and aircraft engines; aircraft components, equipment and systems; ground support systems; theoretical and applied aspects of aerodynamics and general fluid dynamics; electrical engineering; aircraft control; remote sensing; computer sciences; nuclear physics; and social sciences.

  13. Instructional support and implementation structure during elementary teachers' science education simulation use

    NASA Astrophysics Data System (ADS)

    Gonczi, Amanda L.; Chiu, Jennifer L.; Maeng, Jennifer L.; Bell, Randy L.

    2016-07-01

    This investigation sought to identify patterns in elementary science teachers' computer simulation use, particularly implementation structures and instructional supports commonly employed by teachers. Data included video-recorded science lessons of 96 elementary teachers who used computer simulations in one or more science lessons. Results indicated teachers used a one-to-one student-to-computer ratio most often either during class-wide individual computer use or during a rotating station structure. Worksheets, general support, and peer collaboration were the most common forms of instructional support. The least common instructional support forms included lesson pacing, initial play, and a closure discussion. Students' simulation use was supported in the fewest ways during a rotating station structure. Results suggest that simulation professional development with elementary teachers needs to explicitly focus on implementation structures and instructional support to enhance participants' pedagogical knowledge and improve instructional simulation use. In addition, research is needed to provide theoretical explanations for the observed patterns that should subsequently be addressed in supporting teachers' instructional simulation use during professional development or in teacher preparation programs.

  14. Sandia National Laboratories: Careers: Materials Science

    Science.gov Websites

    Sandia's experimental, theoretical, and computational capabilities establish the state of the art in...

  15. Generalized Information Theory Meets Human Cognition: Introducing a Unified Framework to Model Uncertainty and Information Search.

    PubMed

    Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya

    2018-06-17

    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
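
    The unified formalism in question is the Sharma-Mittal family, which contains the named entropies as limiting cases; a standard parameterisation is given below (our notation, not necessarily the paper's):

    ```latex
    % Sharma-Mittal entropy of a distribution p, with order q and degree r:
    H_{q,r}(p) \;=\; \frac{1}{1-r}\left[ \left( \sum_i p_i^{\,q} \right)^{\frac{1-r}{1-q}} - 1 \right]
    % Limiting cases recover the familiar measures:
    %   r -> 1          Renyi entropy of order q
    %   r -> q          Tsallis entropy of order q
    %   q, r -> 1       Shannon entropy
    ```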

  16. CSM research: Methods and application studies

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    1989-01-01

    Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.

  17. Introduction to the focus issue: fifty years of chaos: applied and theoretical.

    PubMed

    Hikihara, Takashi; Holmes, Philip; Kambe, Tsutomu; Rega, Giuseppe

    2012-12-01

    The discovery of deterministic chaos in the late nineteenth century, its subsequent study, and the development of mathematical and computational methods for its analysis have substantially influenced the sciences. Chaos is, however, only one phenomenon in the larger area of dynamical systems theory. This Focus Issue collects 13 papers, from authors and research groups representing the mathematical, physical, and biological sciences, that were presented at a symposium held at Kyoto University from November 28 to December 2, 2011. The symposium, sponsored by the International Union of Theoretical and Applied Mechanics, was called 50 Years of Chaos: Applied and Theoretical. Following some historical remarks to provide a background for the last 50 years, and for chaos, this Introduction surveys the papers and identifies some common themes that appear in them and in the theory of dynamical systems.

  18. Information Architecture: Notes toward a New Curriculum.

    ERIC Educational Resources Information Center

    Latham, Don

    2002-01-01

    Considers the evolution of information architectures as a field of professional education. Topics include the need for an interdisciplinary approach; balancing practical skills with theoretical concepts; and key content areas, including information organization, graphic design, computer science, user and usability studies, and communication.…

  19. Stress Computations for Nearly Incompressible Materials

    DTIC Science & Technology

    1988-04-01

    Ivo Babuška, Research Professor, Institute for Physical Science and Technology, University of Maryland, College Park; Bidar K. Chayapathy, Research... for Testing and Materials, Philadelphia, pp. 101-124 (1987). [13] Szabó, B. A., PROBE: Theoretical Manual, Release 1.0, Noetic Technologies Corp., St. Louis

  20. Artificial Intelligence and Expert Systems.

    ERIC Educational Resources Information Center

    Lawlor, Joseph

    Artificial intelligence (AI) is the field of scientific inquiry concerned with designing machine systems that can simulate human mental processes. The field draws upon theoretical constructs from a wide variety of disciplines, including mathematics, psychology, linguistics, neurophysiology, computer science, and electronic engineering. Some of the…

  1. Compiling with Types

    DTIC Science & Technology

    1995-12-01

    ogy and Theoretical Computer Science 1993, Bombay, New York, 1993. Springer-Verlag. Extended abstract. [17] E. Biagioni. Sequence types for functional... FOX-95-06. [18] E. Biagioni, R. Harper, P. Lee, and B. Milnes. Signatures for a network protocol stack: A systems application of Standard ML. In ACM

  2. Parallel and Distributed Computing Combinatorial Algorithms

    DTIC Science & Technology

    1993-10-01

    Discrete Math, 1991. In press. [55] L. Finkelstein, D. Kleitman, and T. Leighton. Applying the classification theorem for finite simple groups to minimize... Mathematics (in press). [74] L. Heath, T. Leighton, and A. Rosenberg. Comparing queue and stack layouts. SIAM J. Discrete Math, 5(3):398-412, August 1992... line can meet only a few. DIMACS Series in Discrete Mathematics and Theoretical Computer Science, 9, 1993. Publications, Presentations and Theses Supported

  3. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

    Recent proposals of boson sampling and the corresponding experiments exhibit the possible disproof of the extended Church-Turing thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. Until now, however, only small-scale experiments with a few photons have been successfully performed. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A novel theoretical proposal for large-scale boson sampling using microwave photons is highly promising due to the deterministic photon sources and the scalability. Therefore, a certification protocol for large-scale boson sampling experiments should be presented to complete the exciting story. We propose, in this presentation, a computational protocol towards the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional with its Fourier component can show the fingerprint of large-scale boson sampling. This work was supported by Basic Science Research Program through the National Research Foundation of Korea(NRF) funded by the Ministry of Education, Science and Technology(NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.
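
    As standard background (common to the boson-sampling literature, not specific to this talk): output probabilities are governed by matrix permanents, which is the source of the claimed computational superiority. For n photons in an m-mode interferometer U and a collision-free outcome S,

    ```latex
    % Probability of a collision-free output pattern S in boson sampling:
    % U_S is the n x n submatrix of the interferometer unitary U selected by
    % the occupied input and output modes. Computing Per(U_S) is #P-hard.
    P(S) \;=\; \left| \operatorname{Per}(U_S) \right|^{2}
    ```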

  4. Building Cognition: The Construction of Computational Representations for Scientific Discovery.

    PubMed

    Chandrasekharan, Sanjay; Nersessian, Nancy J

    2015-11-01

    Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a theoretical analysis of the cognitive roles such representations play, based on an ethnographic study of the building of computational models in a systems biology laboratory. Specifically, we focus on a case of model-building by an engineer that led to a remarkable discovery in basic bioscience. Accounting for such discoveries requires a distributed cognition (DC) analysis, as DC focuses on the roles played by external representations in cognitive processes. However, DC analyses by and large have not examined scientific discovery, and they mostly focus on memory offloading, particularly how the use of existing external representations changes the nature of cognitive tasks. In contrast, we study discovery processes and argue that discoveries emerge from the processes of building the computational representation. The building process integrates manipulations in imagination and in the representation, creating a coupled cognitive system of model and modeler, where the model is incorporated into the modeler's imagination. This account extends DC significantly, and we present some of the theoretical and application implications of this extended account. Copyright © 2014 Cognitive Science Society, Inc.

  5. Glitch game testers: The design and study of a learning environment for computational production with young African American males

    NASA Astrophysics Data System (ADS)

    DiSalvo, Elizabeth Betsy

    The implementation of a learning environment for young African American males, called the Glitch Game Testers, was launched in 2009. The development of this program was based on formative work that looked at the contrasting use of digital games between young African American males and individuals who chose to become computer science majors. Through analysis of cultural values and digital game play practices, the program was designed to intertwine authentic game development practices and computer science learning. The resulting program employed 25 African American male high school students to test pre-release digital games full-time in the summer and part-time in the school year, with an hour of each day dedicated to learning introductory computer science. Outcomes for persisting in computer science education are remarkable; of the 16 participants who had graduated from high school as of 2012, 12 have gone on to school in computing-related majors. These outcomes, and the participants' enthusiasm for engaging in computing, are in sharp contrast to the crisis in African American male education and learning motivation. The research presented in this dissertation discusses the formative research that shaped the design of Glitch, the evaluation of the implementation of Glitch, and a theoretical investigation of the way in which participants navigated conflicting motivations in learning environments.

  6. Rocket Scientist for a Day: Investigating Alternatives for Chemical Propulsion

    ERIC Educational Resources Information Center

    Angelin, Marcus; Rahm, Martin; Gabrielsson, Erik; Gumaelius, Lena

    2012-01-01

    This laboratory experiment introduces rocket science from a chemistry perspective. The focus is set on chemical propulsion, including its environmental impact and future development. By combining lecture-based teaching with practical, theoretical, and computational exercises, the students get to evaluate different propellant alternatives. To…

  7. Theoretical and Experimental Investigation of Opinion Dynamics in Small Social Networks

    DTIC Science & Technology

    2016-07-01

    Sciences, Social Informatics and Telecommunications Engineering 2013 96 M. Gabbay described. Section 4 illustrates the application of the methodology...group of cyber terrorists has already gained access to multiple computers. The attack will attempt to disrupt and destroy a large oil refinery; at

  8. Neural Information Processing in Cognition: We Start to Understand the Orchestra, but Where is the Conductor?

    PubMed Central

    Palm, Günther

    2016-01-01

    Research in neural information processing has been successful in the past, providing useful approaches both to practical problems in computer science and to computational models in neuroscience. Recent developments in the area of cognitive neuroscience present new challenges for a computational or theoretical understanding, calling for neural information processing models that fulfill criteria or constraints from cognitive psychology, neuroscience, and computational efficiency. The most important of these criteria for the evaluation of present and future contributions to this new emerging field are listed at the end of this article. PMID:26858632

  9. Progress in Earth System Modeling since the ENIAC Calculation

    NASA Astrophysics Data System (ADS)

    Fung, I.

    2009-05-01

    The success of the first numerical weather prediction experiment on the ENIAC computer in 1950 hinged on the expansion of the meteorological observing network, which led to theoretical advances in atmospheric dynamics and subsequently to the implementation of the simplified equations on the computer. This paper briefly reviews the progress in Earth System Modeling and climate observations, and suggests a strategy to sustain and expand the observations needed to advance climate science and prediction.
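
    The "simplified equations" of the 1950 ENIAC forecast were, as a matter of historical record, the barotropic vorticity equation, stated here for reference:

    ```latex
    % Barotropic vorticity equation integrated in the 1950 ENIAC experiment:
    % zeta is relative vorticity, f the Coriolis parameter, and v the
    % non-divergent horizontal wind; absolute vorticity is conserved
    % following the flow.
    \frac{\partial \zeta}{\partial t} + \mathbf{v} \cdot \nabla (\zeta + f) = 0
    ```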

  10. The Complexity of Primary Care Psychology: Theoretical Foundations.

    PubMed

    Smit, E H; Derksen, J J L

    2015-07-01

    How does primary care psychology deal with organized complexity? Has it escaped Newtonian science? Has it, as Weaver (1991) suggests, found a way to 'manage problems with many interrelated factors that cannot be dealt with by statistical techniques'? Computer simulations and mathematical models in psychology are ongoing positive developments in the study of complex systems. However, the theoretical development of complex systems in psychology lags behind these advances. In this article we use complexity science to develop a theory of experienced complexity in the daily practice of primary care psychologists. We briefly answer the ontological question of what we see (from the perspective of primary care psychology) as reality, the epistemological question of what we can know, the methodological question of how to act, and the ethical question of what is good care. Following our empirical study, we conclude that complexity science can describe the experienced complexity of the psychologist and offer room for personalized client-centered care. Complexity science is slowly filling the gap between the dominant reductionist theory and complex daily practice.

  11. Sociocultural Influences On Undergraduate Women's Entry into a Computer Science Major

    NASA Astrophysics Data System (ADS)

    Lyon, Louise Ann

    Computer science not only displays the pattern of underrepresentation of many other science, technology, engineering, and math (STEM) fields, but has actually experienced a decline in the number of women choosing the field over the past two decades. Broken out by gender and race, the picture becomes more nuanced, with the ratio of females to males receiving bachelor's degrees in computer science higher for non-White ethnic groups than for Whites. This dissertation explores the experiences of university women differing along the axes of race, class, and culture who are considering majoring in computer science in order to highlight how well-prepared women are persuaded that they belong (or not) in the field and how the confluence of social categories plays out in their decision. This study focuses on a university seminar entitled "Women in Computer Science and Engineering" open to women concurrently enrolled in introductory programming and uses an ethnographic approach including classroom participant observation, interviews with seminar students and instructors, observations of students in other classes, and interviews with parents of students. Three stand-alone but related articles explore various aspects of the experiences of women who participated in the study using Rom Harré's positioning theory as a theoretical framework. The first article uses data from twenty-two interviews to uncover how interactions with others and patterns in society position women in relation to a computer science major, and how these women have arrived at the point of considering the major despite messages that they do not belong. The second article more deeply explores the cases of three women who vary greatly along the axes of race, class, and culture in order to uncover pattern and interaction differences for women based on their ethnic background. The final article focuses on the attitudes and expectations of the mothers of three students of contrasting ethnicities and how reported interactions between mothers and daughters either constrain or afford opportunities for the daughters to choose a computer science major.

  12. Students Teach Students: Alternative Teaching in Greek Secondary Education

    ERIC Educational Resources Information Center

    Theodoropoulos, Anastasios; Antoniou, Angeliki; Lepouras, George

    2016-01-01

    The students of a Greek junior high school collaborated to prepare the teaching material of a theoretical Computer Science (CS) course and then shared their understanding with other students. This study investigates two alternative teaching methods (collaborative learning and peer tutoring) and compares the learning results to the traditional…

  13. Theoretical Branches in Teaching Computer Science

    ERIC Educational Resources Information Center

    Habiballa, Hashim; Kmet, Tibor

    2004-01-01

    The present paper describes an educational experiment dealing with teaching the theory of formal languages and automata as well as their application concepts. It presents a practical application of an educational experiment and initial results based on comparative instruction of two samples of students (n = 56). The application concept should…

  14. Learning Physical Domains: Toward a Theoretical Framework.

    DTIC Science & Technology

    1986-12-01

    Dr Kenneth D Forbus, Department of Computer Science, University of Illinois; Dr Robert Glaser; 4833 Rugby Avenue, Bethesda, MD 20014.

  15. Generating finite cyclic and dihedral groups using sequential insertion systems with interactions

    NASA Astrophysics Data System (ADS)

    Fong, Wan Heng; Sarmin, Nor Haniza; Turaev, Sherzod; Yosman, Ahmad Firdaus

    2017-04-01

    The operation of insertion has been studied extensively over the years for its impact on many areas of theoretical computer science, such as DNA computing. Insertion was first introduced as a generalization of the concatenation operation, and many variants have since been proposed, each with its own computational properties. In this paper, we introduce a new variant, called sequential insertion systems with interactions, that enables the generation of some special types of groups. We show that these new systems are able to generate all finite cyclic and dihedral groups.
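
    The plain insertion operation that these systems generalize is easy to state: inserting v into u yields every word u1 v u2 with u = u1 u2. A sketch follows (the paper's "interactions" and sequential-application conditions are not modelled here):

    ```python
    # All results of inserting word v into word u at every position (plain
    # insertion; the record's systems add interaction conditions on the
    # neighbouring symbols and apply insertions sequentially).
    def insert_all(u, v):
        return {u[:i] + v + u[i:] for i in range(len(u) + 1)}

    print(sorted(insert_all("ab", "x")))   # ['abx', 'axb', 'xab']
    ```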

  16. 2016 Energetic Materials Gordon Research Conference and Gordon Research Seminar Research Area 7: Chemical Sciences 7.0 Chemical Sciences (Dr. James K. Parker)

    DTIC Science & Technology

    2016-08-10

    thermal decomposition and mechanical damage of energetics. The program for the meeting included nine oral presentation sessions. Discussion leaders...USA) 7:30 pm - 7:35 pm Introduction by Discussion Leader 7:35 pm - 7:50 pm Vincent Baijot (Laboratory for Analysis and Architecture of Systems , CNRS...were synthesis of new materials, performance, advanced diagnostics, experimental techniques, theoretical approaches, and computational models for

  17. What is biomedical informatics?

    PubMed Central

    Bernstam, Elmer V.; Smith, Jack W.; Johnson, Todd R.

    2009-01-01

    Biomedical informatics lacks a clear and theoretically grounded definition. Many proposed definitions focus on data, information, and knowledge, but do not provide an adequate definition of these terms. Leveraging insights from the philosophy of information, we define informatics as the science of information, where information is data plus meaning. Biomedical informatics is the science of information as applied to or studied in the context of biomedicine. Defining the object of study of informatics as data plus meaning clearly distinguishes the field from related fields, such as computer science, statistics and biomedicine, which have different objects of study. The emphasis on data plus meaning also suggests that biomedical informatics problems tend to be difficult when they deal with concepts that are hard to capture using formal, computational definitions. In other words, problems where meaning must be considered are more difficult than problems where manipulating data without regard for meaning is sufficient. Furthermore, the definition implies that informatics research, teaching, and service should focus on biomedical information as data plus meaning rather than only computer applications in biomedicine. PMID:19683067

  18. MaRIE theory, modeling and computation roadmap executive summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lookman, Turab

    The confluence of MaRIE (Matter-Radiation Interactions in Extremes) and extreme (exascale) computing timelines offers a unique opportunity in co-designing the elements of materials discovery, with theory and high performance computing, itself co-designed by constrained optimization of hardware and software, and experiments. MaRIE's theory, modeling, and computation (TMC) roadmap efforts have paralleled 'MaRIE First Experiments' science activities in the areas of materials dynamics, irradiated materials and complex functional materials in extreme conditions. The documents that follow this executive summary describe in detail, for each of these areas, the current state of the art, the gaps that exist, and the roadmap to MaRIE and beyond. Here we integrate the various elements to articulate an overarching theme related to the role and consequences of heterogeneities, which manifest as competing states in a complex energy landscape. MaRIE experiments will locate, measure and follow the dynamical evolution of these heterogeneities. Our TMC vision spans the various pillar science areas and highlights the key theoretical and experimental challenges. We also present a theory, modeling and computation roadmap of the path to and beyond MaRIE in each of the science areas.

  19. The advanced role of computational mechanics and visualization in science and technology: analysis of the Germanwings Flight 9525 crash

    NASA Astrophysics Data System (ADS)

    Chen, Goong; Wang, Yi-Ching; Perronnet, Alain; Gu, Cong; Yao, Pengfei; Bin-Mohsin, Bandar; Hajaiej, Hichem; Scully, Marlan O.

    2017-03-01

    Computational mathematics, physics and engineering form a major constituent of modern computational science, which now stands on an equal footing with the established branches of theoretical and experimental sciences. Computational mechanics solves problems in science and engineering based upon mathematical modeling and computing, bypassing the need for expensive and time-consuming laboratory setups and experimental measurements. Furthermore, it allows the numerical simulations of large scale systems, such as the formation of galaxies that could not be done in any earth bound laboratories. This article is written as part of the 21st Century Frontiers Series to illustrate some state-of-the-art computational science. We emphasize how to do numerical modeling and visualization in the study of a contemporary event, the pulverizing crash of the Germanwings Flight 9525 on March 24, 2015, as a showcase. Such numerical modeling and the ensuing simulation of aircraft crashes into land or mountain are complex tasks as they involve both theoretical study and supercomputing of a complex physical system. The most tragic type of crash involves ‘pulverization’ such as the one suffered by this Germanwings flight. Here, we show pulverizing airliner crashes by visualization through video animations from supercomputer applications of the numerical modeling tool LS-DYNA. A sound validation process is challenging but essential for any sophisticated calculations. We achieve this by validation against the experimental data from a crash test done in 1993 of an F4 Phantom II fighter jet into a wall. We have developed a method by hybridizing two primary methods: finite element analysis and smoothed particle hydrodynamics. This hybrid method also enhances visualization by showing a ‘debris cloud’. Based on our supercomputer simulations and the visualization, we point out that prior works on this topic based on ‘hollow interior’ modeling can be quite problematic and, thus, not likely to be correct. We discuss the effects of terrain on pulverization using the information from the recovered flight-data-recorder and show our forensics and assessments of what may have happened during the final moments of the crash. Finally, we point out that our study has potential for being made into real-time flight crash simulators to help the study of crashworthiness and survivability for future aviation safety. Some forward-looking statements are also made.

  20. Reviews Book: Marie Curie: A Biography Book: Fast Car Physics Book: Beautiful Invisible Equipment: Fun Fly Stick Science Kit Book: Quantum Theory Cannot Hurt You Book: Chaos: The Science of Predictable Random Motion Book: Seven Wonders of the Universe Book: Special Relativity Equipment: LabVIEWTM 2009 Education Edition Places to Visit: Edison and Ford Winter Estates Places to Visit: The Computer History Museum Web Watch

    NASA Astrophysics Data System (ADS)

    2011-07-01

    WE RECOMMEND
    Fun Fly Stick Science Kit: Fun fly stick introduces electrostatics to youngsters
    Special Relativity: Text makes a useful addition to the study of relativity as an undergraduate
    LabVIEW™ 2009 Education Edition: LabVIEW sets industry standard for gathering and analysing data, signal processing, instrumentation design and control, and automation and robotics
    Edison and Ford Winter Estates: Thomas Edison's home is open to the public
    The Computer History Museum: Take a walk through technology history at this computer museum
    WORTH A LOOK
    Fast Car Physics: Book races through physics
    Beautiful Invisible: The main subject of this book is theoretical physics
    Quantum Theory Cannot Hurt You: A guide to physics on the large and small scale
    Chaos: The Science of Predictable Random Motion: Book explores the mathematics behind chaotic behaviour
    Seven Wonders of the Universe: A textual trip through the wonderful universe
    HANDLE WITH CARE
    Marie Curie: A Biography: Book fails to capture Curie's science
    WEB WATCH
    Web clips to liven up science lessons

  1. Applications of Derandomization Theory in Coding

    NASA Astrophysics Data System (ADS)

    Cheraghchi, Mahdi

    2011-07-01

    Randomized techniques play a fundamental role in theoretical computer science and discrete mathematics, in particular for the design of efficient algorithms and construction of combinatorial objects. The basic goal in derandomization theory is to eliminate or reduce the need for randomness in such randomized constructions. In this thesis, we explore some applications of the fundamental notions in derandomization theory to problems outside the core of theoretical computer science, and in particular, certain problems related to coding theory. First, we consider the wiretap channel problem, which involves a communication system in which an intruder can eavesdrop on a limited portion of the transmissions, and construct efficient and information-theoretically optimal communication protocols for this model. Then we consider the combinatorial group testing problem. In this classical problem, one aims to determine a set of defective items within a large population by asking a number of queries, where each query reveals whether a defective item is present within a specified group of items. We use randomness condensers to explicitly construct optimal, or nearly optimal, group testing schemes for a setting where the query outcomes can be highly unreliable, as well as the threshold model where a query returns positive if the number of defectives passes a certain threshold. Finally, we design ensembles of error-correcting codes that achieve the information-theoretic capacity of a large class of communication channels, and then use the obtained ensembles for construction of explicit capacity-achieving codes. [This is a shortened version of the actual abstract in the thesis.]
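
    For background on the group-testing model (the standard noiseless setting, not the unreliable-query regime the thesis targets): any item that appears in a pool testing negative cannot be defective, which already yields a simple decoder, often called COMP:

    ```python
    # COMP decoding for noiseless non-adaptive group testing: clear every item
    # occurring in a negative pool; declare the rest defective. It never misses
    # a defective but may over-report, as item 4 shows below.
    def comp_decode(pools, outcomes, n_items):
        candidates = set(range(n_items))
        for pool, positive in zip(pools, outcomes):
            if not positive:
                candidates -= set(pool)
        return candidates

    pools = [[0, 1, 2], [2, 3], [1, 3, 4]]
    defectives = {3}
    outcomes = [bool(set(p) & defectives) for p in pools]   # [False, True, True]
    print(comp_decode(pools, outcomes, 5))                  # {3, 4}
    ```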

  2. Assessing collaborative computing: development of the Collaborative-Computing Observation Instrument (C-COI)

    NASA Astrophysics Data System (ADS)

    Israel, Maya; Wherfel, Quentin M.; Shehab, Saadeddine; Ramos, Evan A.; Metzger, Adam; Reese, George C.

    2016-07-01

    This paper describes the development, validation, and uses of the Collaborative Computing Observation Instrument (C-COI), a web-based analysis instrument that classifies individual and/or collaborative behaviors of students during computing problem-solving (e.g. coding, programming). The C-COI analyzes data gathered through video and audio screen recording software that captures students' computer screens as they program, and their conversations with their peers or adults. The instrument allows researchers to organize and quantify these data to track behavioral patterns that could be further analyzed for deeper understanding of persistence and/or collaborative interactions. The article provides a rationale for the C-COI including the development of a theoretical framework for measuring collaborative interactions in computer-mediated environments. This theoretical framework relied on the computer-supported collaborative learning literature related to adaptive help seeking, the joint problem-solving space in which collaborative computing occurs, and conversations related to outcomes and products of computational activities. Instrument development and validation also included ongoing advisory board feedback from experts in computer science, collaborative learning, and K-12 computing as well as classroom observations to test out the constructs in the C-COI. These processes resulted in an instrument with rigorous validation procedures and a high inter-rater reliability.

  3. Service-Oriented Architectures and Project Optimization for a Special Cost Management Problem Creating Synergies for Informed Change between Qualitative and Quantitative Strategic Management Processes

    DTIC Science & Technology

    2010-05-01

    University of the Federal Armed Forces of Germany, Institute for Theoretical Computer Science, Mathematics and Operations Research, Werner-Heisenberg-Weg 39, 85577 Neubiberg, Germany, Phone +49 89 6004 2400. Marco Schuler is an active Officer of the Federal

  4. Computational intelligence in earth sciences and environmental applications: issues and challenges.

    PubMed

    Cherkassky, V; Krasnopolsky, V; Solomatine, D P; Valdes, J

    2006-03-01

    This paper introduces a generic theoretical framework for predictive learning, and relates it to data-driven and learning applications in earth and environmental sciences. The issues of data quality, selection of the error function, incorporation of the predictive learning methods into the existing modeling frameworks, expert knowledge, model uncertainty, and other application-domain specific problems are discussed. A brief overview of the papers in the Special Issue is provided, followed by discussion of open issues and directions for future research.

  5. Uncertainty quantification based on pillars of experiment, theory, and computation. Part I: Data analysis

    NASA Astrophysics Data System (ADS)

    Elishakoff, I.; Sarlin, N.

    2016-06-01

    In this paper we provide a general methodology for the analysis and design of systems involving uncertainties. Available experimental data are enclosed by some geometric figure (triangle, rectangle, ellipse, parallelogram, super ellipse) of minimum area. These areas are then inflated, resorting to the Chebyshev inequality, in order to take into account the forecasted data. The next step consists of evaluating the response of the system when uncertainties are confined to one of the above five suitably inflated geometric figures. This step involves a combined theoretical and computational analysis. We evaluate the maximum response of the system subjected to variation of uncertain parameters in each hypothesized region. The results of triangular, interval, ellipsoidal, parallelogram, and super ellipsoidal calculi are compared with the view of identifying the region that leads to the minimum of the maximum response. That response is identified as a result of the suggested predictive inference. The methodology thus synthesizes the probabilistic notion with each of the five calculi. Using the term "pillar" in the title was inspired by the News Release (2013) announcing the award of the Honda Prize to J. Tinsley Oden, stating, among others, that "Dr. Oden refers to computational science as the "third pillar" of scientific inquiry, standing beside theoretical and experimental science. Computational science serves as a new paradigm for acquiring knowledge and informing decisions important to humankind". Analysis of systems with uncertainties necessitates employment of all three pillars. The analysis is based on the assumption that the five shapes are each different conservative estimates of the true bounding region. The smallest of the maximal displacements in x and y directions (for a 2D system) therefore provides the closest estimate of the true displacements based on the above assumption.
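
    The inflation step rests on the Chebyshev inequality, which bounds tail probabilities for any distribution with finite mean and variance, with no further distributional assumptions:

    ```latex
    % Chebyshev inequality: for any random variable X with mean mu and
    % standard deviation sigma, and any k > 0,
    P\left( |X - \mu| \ge k \sigma \right) \;\le\; \frac{1}{k^{2}}
    ```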

  6. Computational complexity of ecological and evolutionary spatial dynamics

    PubMed Central

    Ibsen-Jensen, Rasmus; Chatterjee, Krishnendu; Nowak, Martin A.

    2015-01-01

    There are deep, yet largely unexplored, connections between computer science and biology. Both disciplines examine how information proliferates in time and space. Central results in computer science describe the complexity of algorithms that solve certain classes of problems. An algorithm is deemed efficient if it can solve a problem in polynomial time, which means the running time of the algorithm is a polynomial function of the length of the input. There are classes of harder problems for which the fastest possible algorithm requires exponential time. Another criterion is the space requirement of the algorithm. There is a crucial distinction between algorithms that can find a solution, verify a solution, or list several distinct solutions in given time and space. The complexity hierarchy that is generated in this way is the foundation of theoretical computer science. Precise complexity results can be notoriously difficult. The famous question whether polynomial time equals nondeterministic polynomial time (i.e., P = NP) is one of the hardest open problems in computer science and all of mathematics. Here, we consider simple processes of ecological and evolutionary spatial dynamics. The basic question is: What is the probability that a new invader (or a new mutant) will take over a resident population? We derive precise complexity results for a variety of scenarios. We therefore show that some fundamental questions in this area cannot be answered by simple equations (assuming that P is not equal to NP). PMID:26644569
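
    For orientation, the paper's "basic question" has a classical closed form in the well-mixed (non-spatial) baseline: in a Moran process with population size N and a single mutant of relative fitness r, the fixation probability is given below. The paper's point is that on spatial structures such quantities can become provably hard to compute.

    ```latex
    % Fixation probability of a single mutant of relative fitness r in a
    % well-mixed Moran process with N individuals (classical result; the
    % neutral case r = 1 gives rho = 1/N):
    \rho \;=\; \frac{1 - 1/r}{1 - 1/r^{N}}
    ```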

  7. Recursion and the Competence/Performance Distinction in AGL Tasks

    ERIC Educational Resources Information Center

    Lobina, David J.

    2011-01-01

    The term "recursion" is used in at least four distinct theoretical senses within cognitive science. Some of these senses in turn relate to the different levels of analysis described by David Marr some 20 years ago; namely, the underlying competence capacity (the "computational" level), the performance operations used in real-time processing (the…

  8. Group Emotions: The Social and Cognitive Functions of Emotions in Argumentation

    ERIC Educational Resources Information Center

    Polo, Claire; Lund, Kristine; Plantin, Christian; Niccolai, Gerald P.

    2016-01-01

    The learning sciences of today recognize the tri-dimensional nature of learning as involving cognitive, social and emotional phenomena. However, many computer-supported argumentation systems still fail in addressing the socio-emotional aspects of group reasoning, perhaps due to a lack of an integrated theoretical vision of how these three…

  9. Drawing Analogies between Logic Programming and Natural Language Argumentation Texts to Scaffold Learners' Understanding

    ERIC Educational Resources Information Center

    Ragonis, Noa; Shilo, Gila

    2014-01-01

    The paper presents a theoretical investigational study of the potential advantages that secondary school learners may gain from learning two different subjects, namely, logic programming within computer science studies and argumentation texts within linguistics studies. The study suggests drawing an analogy between the two subjects since they both…

  10. Proof Theory for Authorization Logic and Its Application to a Practical File System

    DTIC Science & Technology

    2009-12-01

    Holland, 1969. [71] Jean-Yves Girard. Linear logic. Theoretical Computer Science, 50:1–102, 1987. [72] Jean-Yves Girard, Paul Taylor, and Yves Lafont...2009. Online at http://ecommons.library.cornell.edu/handle/1813/13679. [133] S. Shepler, B. Callaghan, D. Robinson, R. Thurlow, C. Beame, M. Eisler, and

  11. Numerical and Theoretical Considerations for the Design of the AVT-183 Diamond-Wing Experimental Investigations

    NASA Technical Reports Server (NTRS)

    Boelens, Okko J.; Luckring, James M.; Breitsamter, Christian; Hovelmann, Andreas; Knoth, Florian; Malloy, Donald J.; Deck, Sebatien

    2015-01-01

    A diamond-wing configuration has been developed to isolate and study blunt-leading edge vortex separation with both computations and experiments. The wing has been designed so that the results are relevant to a more complex Uninhabited Combat Air Vehicle concept known as SACCON. The numerical and theoretical development process for this diamond wing is presented, including a view toward planned wind tunnel experiments. This work was conducted under the NATO Science and Technology Organization, Applied Vehicle Technology panel. All information is in the public domain.

  12. The philosophy of scientific experimentation: a review

    PubMed Central

    2009-01-01

    Practicing and studying automated experimentation may benefit from philosophical reflection on experimental science in general. This paper reviews the relevant literature and discusses central issues in the philosophy of scientific experimentation. The first two sections present brief accounts of the rise of experimental science and of its philosophical study. The next sections discuss three central issues of scientific experimentation: the scientific and philosophical significance of intervention and production, the relationship between experimental science and technology, and the interactions between experimental and theoretical work. The concluding section identifies three issues for further research: the role of computing and, more specifically, automating, in experimental research, the nature of experimentation in the social and human sciences, and the significance of normative, including ethical, problems in experimental science. PMID:20098589

  13. A Haptic-Enhanced System for Molecular Sensing

    NASA Astrophysics Data System (ADS)

    Comai, Sara; Mazza, Davide

    The science of haptics has received enormous attention in the last decade. One of the major application trends of haptics technology is data visualization and training. In this paper, we present a haptically-enhanced system for manipulation and tactile exploration of molecules. The geometrical models of molecules are extracted from either theoretical or empirical data using file formats widely adopted in the chemical and biological fields. The addition of information computed with computational chemistry tools allows users to feel the interaction forces between an explored molecule and a charge associated with the haptic device, and to visualize a huge amount of numerical data in a more comprehensible way. The developed tool can be used for either teaching or research purposes due to its reliance on both theoretical and experimental data.
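
    As a rough illustration of the force-rendering idea (with hypothetical charges and positions, not the described system's actual chemistry pipeline), the Coulomb force a probe charge feels from a molecule's partial charges can be computed as follows:

    ```python
    # Illustrative Coulomb force on a probe charge near a molecule's partial
    # charges -- the kind of interaction force a haptic device could render.
    # Positions and charge values are hypothetical.
    import numpy as np

    K = 8.9875517923e9  # Coulomb constant, N*m^2/C^2

    def force_on_probe(probe_pos, probe_q, atom_pos, atom_q):
        d = probe_pos - atom_pos            # vectors from each atom to probe
        r = np.linalg.norm(d, axis=1, keepdims=True)
        return (K * probe_q * atom_q[:, None] * d / r**3).sum(axis=0)

    atoms = np.array([[0.0, 0.0, 0.0], [1.0e-10, 0.0, 0.0]])   # metres
    charges = np.array([-0.4, 0.4]) * 1.602e-19                # coulombs
    probe = np.array([0.5e-10, 1.0e-10, 0.0])
    print(force_on_probe(probe, 1.602e-19, atoms, charges))    # newtons
    ```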

  14. A neuromathematical model of human information processing and its application to science content acquisition

    NASA Astrophysics Data System (ADS)

    Anderson, O. Roger

    The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.

  15. Human-Computer Interaction: A Journal of Theoretical, Empirical and Methodological Issues of User Science and of System Design. Volume 7, Number 1

    DTIC Science & Technology

    1992-01-01

    Norman, University of California, San Diego, CA; Dan R. Olsen, Jr., Brigham...Peter G. Polson, University of Colorado, Boulder, CO; James R. Rhyne, IBM T. J. Watson...and artificial intelligence, among which are: reasoning about concurrent systems, including program verification (Barringer, 1985), operating

  16. Using quantum chemistry muscle to flex massive systems: How to respond to something perturbing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertoni, Colleen

    Computational chemistry uses the theoretical advances of quantum mechanics and the algorithmic and hardware advances of computer science to give insight into chemical problems. It is currently possible to do highly accurate quantum chemistry calculations, but the most accurate methods are very computationally expensive. Thus it is only feasible to do highly accurate calculations on small molecules, since typically more computationally efficient methods are also less accurate. The overall goal of my dissertation work has been to try to decrease the computational expense of calculations without decreasing the accuracy. In particular, my dissertation work focuses on fragmentation methods, intermolecular interaction methods, analytic gradients, and taking advantage of new hardware.

  17. Control of Chaos: New Perspectives in Experimental and Theoretical Science. International Journal of Bifurcation and Chaos in Applied Sciences and Engineering. Theme Issue. Part 2, Volume 8, Number 9, September 1998.

    DTIC Science & Technology

    1998-09-01

    discharges in the Onchidium pacemaker neuron," J. Theor. Biol. 156, 269-291. "Episodic multiregional cortical coherence at multiple frequencies during...with delay: A model of synchronization of cortical tissue," Neural Comput. 6, 1141-1154. Sepulchre, J. A. & Babloyantz, A. [1993] "Controlling...generating circuit of different networks," Nature 351, 60-63. 363, 411-417. Singer, W. [1993] "Synchronization of cortical activity...Mpitsos, G. J., Burton, R

  18. Lobachevsky Year at Kazan University: Center of Science, Education, Intellectual-Cognitive Tourism "Kazan - GeoNa - 2020+" and "Kazan-Moon-2020+" projects

    NASA Astrophysics Data System (ADS)

    Gusev, A.; Trudkova, N.

    2017-09-01

    Center "GeoNa" will enable scientists and teachers of the Russian universities to join to advanced achievements of a science, information technologies; to establish scientific communications with foreign colleagues in sphere of the high technology, educational projects and Intellectual-Cognitive Tourism. The Project "Kazan - Moon - 2020+" is directed on the decision of fundamental problems of celestial mechanics, selenodesy and geophysics of the Moon(s) connected to carrying out of complex theoretical researches and computer modelling.

  19. Building a Unified Computational Model for the Resonant X-Ray Scattering of Strongly Correlated Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bansil, Arun

    2016-12-01

    Basic-Energy Sciences of the Department of Energy (BES/DOE) has made large investments in X-ray sources in the U.S. (NSLS-II, LCLS, NGLS, ALS, APS) as powerful enabling tools for opening up unprecedented new opportunities for exploring properties of matter at various length and time scales. The coming online of the pulsed photon source literally allows us to see and follow the dynamics of processes in materials at their natural timescales. There is an urgent need therefore to develop theoretical methodologies and computational models for understanding how X-rays interact with matter and the related spectroscopies of materials. The present project addressed aspects of this grand challenge of X-ray science. In particular, our Collaborative Research Team (CRT) focused on understanding and modeling of elastic and inelastic resonant X-ray scattering processes. We worked to unify the three different computational approaches currently used for modeling X-ray scattering—density functional theory, dynamical mean-field theory, and small-cluster exact diagonalization—to achieve a more realistic material-specific picture of the interaction between X-rays and complex matter. To achieve a convergence in the interpretation and to maximize complementary aspects of different theoretical methods, we concentrated on the cuprates, where most experiments have been performed. Our team included both US and international researchers, and it fostered new collaborations between researchers currently working with different approaches. In addition, we developed close relationships with experimental groups working in the area at various synchrotron facilities in the US. Our CRT thus helped toward enabling the US to assume a leadership role in the theoretical development of the field, and to create a global network and community of scholars dedicated to X-ray scattering research.

  20. Introduction to focus issue: intrinsic and designed computation: information processing in dynamical systems--beyond the digital hegemony.

    PubMed

    Crutchfield, James P; Ditto, William L; Sinha, Sudeshna

    2010-09-01

    How dynamical systems store and process information is a fundamental question that touches a remarkably wide set of contemporary issues: from the breakdown of Moore's scaling laws--that predicted the inexorable improvement in digital circuitry--to basic philosophical problems of pattern in the natural world. It is a question that also returns one to the earliest days of the foundations of dynamical systems theory, probability theory, mathematical logic, communication theory, and theoretical computer science. We introduce the broad and rather eclectic set of articles in this Focus Issue that highlights a range of current challenges in computing and dynamical systems.

  1. Computing the Ediz eccentric connectivity index of discrete dynamic structures

    NASA Astrophysics Data System (ADS)

    Wu, Hualong; Kamran Siddiqui, Muhammad; Zhao, Bo; Gan, Jianhou; Gao, Wei

    2017-06-01

    Earlier studies in the physical and chemical sciences have found that the physico-chemical characteristics of chemical compounds are closely connected with their molecular structures. This provides a theoretical basis for understanding the physical and chemical properties of compounds by analyzing their molecular structure. In our article, we study the physico-chemical properties of certain molecular structures by computing the Ediz eccentric connectivity index from a mathematical standpoint. Our results rely mainly on techniques of distance and degree computation and on mathematical derivation, and the conclusions have guiding significance for physical engineering.
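
    For readers unfamiliar with the invariant: the Ediz eccentric connectivity index is commonly defined as the sum over all vertices v of S(v)/ecc(v), where S(v) is the sum of the degrees of v's neighbours and ecc(v) is the eccentricity of v. A minimal sketch under that definition (the graph is a toy example, not one of the structures studied in the paper):

    ```python
    # Ediz eccentric connectivity index, assuming the common definition
    # sum over v of S(v)/ecc(v), with S(v) the sum of neighbour degrees
    # and ecc(v) the eccentricity (greatest shortest-path distance from v).
    import networkx as nx

    def ediz_eccentric_connectivity_index(G):
        ecc = nx.eccentricity(G)
        return sum(sum(G.degree(u) for u in G.neighbors(v)) / ecc[v]
                   for v in G.nodes)

    # Toy "molecular" chain: a path on 5 vertices gives 5.0.
    print(ediz_eccentric_connectivity_index(nx.path_graph(5)))
    ```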

  2. Discrete Mathematics in the Schools. DIMACS Series in Discrete Mathematics and Theoretical Computer Science, Volume 36.

    ERIC Educational Resources Information Center

    Rosenstein, Joseph G., Ed.; Franzblau, Deborah S., Ed.; Roberts, Fred S., Ed.

    This book is a collection of articles by experienced educators and explains why and how discrete mathematics should be taught in K-12 classrooms. It includes evidence for "why" and practical guidance for "how" and also discusses how discrete mathematics can be used as a vehicle for achieving the broader goals of the major…

  3. An Action Research Approach to the Design, Development and Evaluation of an Interactive E-Learning Tutorial in a Cognitive Domain

    ERIC Educational Resources Information Center

    de Villiers, M. Ruth

    2007-01-01

    The teaching and learning of a complex section in "Theoretical Computer Science 1" in a distance-education context at the University of South Africa (UNISA) has been enhanced by a supplementary e-learning application called "Relations," which interactively teaches mathematical skills in a cognitive domain. It has tutorial and…

  4. Formal logic rewrite system bachelor in teaching mathematical informatics

    NASA Astrophysics Data System (ADS)

    Habiballa, Hashim; Jendryscik, Radek

    2017-07-01

    The article presents the capabilities of the formal rewrite logic system Bachelor for teaching theoretical computer science (mathematical informatics). The system Bachelor enables a constructivist approach to teaching and may therefore enhance the learning process in essential disciplines of hard informatics. It provides not only a detailed description of the formal rewrite process but also demonstrates algorithmic principles for manipulating logic formulae.

  5. Toward integration of in vivo molecular computing devices: successes and challenges

    PubMed Central

    Hayat, Sikander; Hinze, Thomas

    2008-01-01

    The computing power unleashed by biomolecule based massively parallel computational units has been the focus of many interdisciplinary studies that couple state of the art ideas from mathematical logic, theoretical computer science, bioengineering, and nanotechnology to fulfill some computational task. The output can influence, for instance, release of a drug at a specific target, gene expression, cell population, or be a purely mathematical entity. Analysis of the results of several studies has led to the emergence of a general set of rules concerning the implementation and optimization of in vivo computational units. Taking two recent studies on in vivo computing as examples, we discuss the impact of mathematical modeling and simulation in the field of synthetic biology and on in vivo computing. The impact of the emergence of gene regulatory networks and the potential of proteins acting as “circuit wires” on the problem of interconnecting molecular computing device subunits is also highlighted. PMID:19404433

  6. EDITORIAL: TaCoNa-Photonics 2008 TaCoNa-Photonics 2008

    NASA Astrophysics Data System (ADS)

    Chigrin, Dmitry N.; Busch, Kurt; Lavrinenko, Andrei V.

    2009-11-01

    This special section on theoretical and computational nano-photonics features papers presented at the first International Workshop on Theoretical and Computational Nano-Photonics (TaCoNa-Photonics 2008) held in Bad Honnef, Germany, 3-5 December 2008. The workshop covered a broad range of topics related to current developments and achievements in this interdisciplinary area of research. Since the late 1960s, the word `photonics' has been understood as the science of generating, controlling, and detecting light. Nowadays, the routine fabrication of complex structures with micro- and nano-scale dimensions opens up many new and exciting possibilities in photonics. The science of generating, routing and detecting light in micro- and nano-structured matter, `nano-photonics', is becoming more important both in research and technology and offers many promising applications. The inherently sub-wavelength character of the structures that nano-photonics deals with challenges modern theoretical and computational physics and engineering with many nontrivial questions: Up to what length-scale can one use a macroscopic phenomenological description of matter? Where is the interface between the classical and quantum description of light in nano-scale structures? How can one combine different physical systems, different time- and length-scales in a single computational model? How can one engineer nano-structured materials in order to achieve the desired optical properties for particular applications? Any attempt at answering these kinds of questions is impossible without the joint efforts of physicists, engineers, applied mathematicians and programmers. This is the reason why the major goal of the TaCoNa-Photonics workshops is to provide a forum where theoreticians and specialists in numerical methods from all branches of physics, engineering sciences and mathematics can compare their results, report on novel results and breakthroughs, and discuss new challenges ahead. In order to intensify theoretical discussions and to put them on `solid' ground it was decided to invite world-leading experts in experimental photonics for plenary talks. Over three days, the workshop brought together more than 70 specialists in theoretical and computational nano-photonics. The workshop took place in the historical `Physikzentrum Bad Honnef', whose unique atmosphere supported a multitude of highly interesting debates and discussions that often lasted until midnight and beyond. Different theoretical and numerical aspects of light generation, control and detection in general inhomogeneous media, photonic crystals, plasmonic structures, metamaterials and integrated optical systems were covered in 15 invited talks and 52 contributed oral and poster presentations. The plenary talks were given by Professor M Wegener (metamaterials) and Professor W Barnes (plasmonics). This special section is a cross-sectional selection of papers which were submitted by the authors of invited and contributed oral presentations. It also includes two papers by the winners of the Best Poster Awards. We hope that these papers will enhance the interest of the scientific community regarding nano-photonics in general and regarding the TaCoNa-Photonics workshop series in particular. It is our distinct pleasure to acknowledge the generous financial support of our sponsors: Karlsruhe School of Optics & Photonics (KSOP) (Germany), U.S. Army International Technology Center-Atlantic, Research Division (USA), and the Office of Naval Research Global (USA).
Without the organizational assistance from the International Department of the Universität Karlsruhe GmbH (Germany) this event would simply have been impossible.

  7. Editorial. Festschrift on the occasion of Kurt Kremer's 60th birthday

    NASA Astrophysics Data System (ADS)

    Site, Luigi Delle; Deserno, Markus; Dünweg, Burkhard; Holm, Christian; Peter, Christine; Pleiner, Harald

    2016-10-01

    This special topics issue offers a broad perspective on recent theoretical and computational soft matter science, providing state of the art advances in many of its sub-fields. As is befitting for a discipline as diverse as soft matter, the papers collected here span a considerable range of subjects and questions, but they also illustrate numerous connections into both fundamental science and technological/industrial applications, which have accompanied the field since its earliest days. This issue is dedicated to Kurt Kremer, on the occasion of his 60th birthday, honouring his role in establishing this exciting field and consolidating its standing in the frame of current science and technology.

  8. Influence of computational domain size on the pattern formation of the phase field crystals

    NASA Astrophysics Data System (ADS)

    Starodumov, Ilya; Galenko, Peter; Alexandrov, Dmitri; Kropotin, Nikolai

    2017-04-01

    Modeling of the crystallization process by the phase field crystal (PFC) method represents one of the important directions of modern computational materials science. This method makes it possible to study the formation of stable or metastable crystal structures. In this paper, we study the effect of computational domain size on the crystal pattern formation obtained in computer simulations with the PFC method. We show that if the size of the computational domain is changed, the result of the modeling may be a structure in a metastable phase instead of the pure stable state. The authors present a possible theoretical justification for the observed effect and explain a possible modification of the PFC method to account for this phenomenon.
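
    The abstract does not give the governing equation; a widely used form of the PFC model evolves a rescaled density field n via conserved Swift-Hohenberg-type dynamics, dn/dt = lap[(r + (1 + lap)^2)n + n^3]. A minimal pseudospectral time step under that standard form, with illustrative parameters, shows where the domain size enters:

    ```python
    # Minimal 2-D phase field crystal step, assuming the standard conserved
    # Swift-Hohenberg form dn/dt = lap[(r + (1 + lap)^2) n + n^3]. The
    # parameters r, n0, and the domain size L are illustrative only.
    import numpy as np

    N, L = 128, 32 * np.pi                   # grid points, physical size
    r, n0, dt, steps = -0.25, -0.2, 0.5, 2000

    k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    lin = -k2 * (r + (1 - k2) ** 2)          # linear operator in k-space

    rng = np.random.default_rng(0)
    nh = np.fft.fft2(n0 + 0.01 * rng.standard_normal((N, N)))
    for _ in range(steps):                   # semi-implicit spectral steps
        nonlin = -k2 * np.fft.fft2(np.fft.ifft2(nh).real ** 3)
        nh = (nh + dt * nonlin) / (1 - dt * lin)
    pattern = np.fft.ifft2(nh).real          # inspect versus domain size L
    ```

    Rerunning with a domain size that is incommensurate with the preferred lattice spacing is one way to probe the metastable-pattern effect the authors describe.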

  9. Enhancing student engagement to positively impact mathematics anxiety, confidence and achievement for interdisciplinary science subjects

    NASA Astrophysics Data System (ADS)

    Everingham, Yvette L.; Gyuris, Emma; Connolly, Sean R.

    2017-11-01

    Contemporary science educators must equip their students with the knowledge and practical know-how to connect multiple disciplines like mathematics, computing and the natural sciences to gain a richer and deeper understanding of a scientific problem. However, many biology and earth science students are prejudiced against mathematics due to negative emotions like high mathematical anxiety and low mathematical confidence. Here, we present a theoretical framework that investigates linkages between student engagement, mathematical anxiety, mathematical confidence, student achievement and subject mastery. We implement this framework in a large, first-year interdisciplinary science subject and monitor its impact over several years from 2010 to 2015. The implementation of the framework coincided with an easing of anxiety and enhanced confidence, as well as higher student satisfaction, retention and achievement. The framework offers interdisciplinary science educators greater flexibility and confidence in their approach to designing and delivering subjects that rely on mathematical concepts and practices.

  10. Application of Psychological Theories in Agent-Based Modeling: The Case of the Theory of Planned Behavior.

    PubMed

    Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo

    2018-01-01

    It is likely that computer simulations will assume a greater role in the near future in investigating and understanding reality (Rand & Rust, 2011). In particular, agent-based models (ABMs) represent a method of investigation of social phenomena that blends the knowledge of the social sciences with the advantages of virtual simulations. Within this context, the development of algorithms able to recreate the reasoning engine of autonomous virtual agents represents one of the most fragile aspects, and it is crucial to base such models on well-supported psychological theoretical frameworks. For this reason, the present work discusses the application of the theory of planned behavior (TPB; Ajzen, 1991) in the context of agent-based modeling: it is argued that this framework might be more helpful than others in developing a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to the application of the model proposed by the TPB inside computer simulations and suggests potential solutions, with the hope of helping to shorten the distance between the fields of psychology and computer science.
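
    The abstract does not specify a formal decision rule; in many TPB-based ABMs, behavioral intention is modeled as a weighted combination of attitude, subjective norm, and perceived behavioral control, with control also gating actual performance. A minimal agent sketch under that assumption (all weights and thresholds are illustrative):

    ```python
    # Minimal TPB agent sketch, assuming the common linear form
    # intention = w1*attitude + w2*norm + w3*control, with behavior enacted
    # when intention and perceived control are high enough. All numbers are
    # illustrative, not taken from the paper.
    from dataclasses import dataclass
    import random

    @dataclass
    class TPBAgent:
        attitude: float          # evaluation of the behavior, in [0, 1]
        norm: float              # perceived social pressure, in [0, 1]
        control: float           # perceived behavioral control, in [0, 1]

        def intention(self, w=(0.4, 0.3, 0.3)):
            return (w[0] * self.attitude + w[1] * self.norm
                    + w[2] * self.control)

        def acts(self, threshold=0.5):
            # Perceived control gates performance as well as intention.
            return self.intention() > threshold and self.control > 0.2

    rng = random.Random(0)
    agents = [TPBAgent(rng.random(), rng.random(), rng.random())
              for _ in range(1000)]
    print(sum(a.acts() for a in agents) / len(agents))  # fraction acting
    ```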

  11. Using an Adaptive Expertise Lens to Understand the Quality of Teachers' Classroom Implementation of Computer-Supported Complex Systems Curricula in High School Science

    ERIC Educational Resources Information Center

    Yoon, Susan A.; Koehler-Yom, Jessica; Anderson, Emma; Lin, Joyce; Klopfer, Eric

    2015-01-01

    Background: This exploratory study is part of a larger-scale research project aimed at building theoretical and practical knowledge of complex systems in students and teachers with the goal of improving high school biology learning through professional development and a classroom intervention. Purpose: We propose a model of adaptive expertise to…

  12. The Packing Property

    DTIC Science & Technology

    2000-11-01

    Discrete Math. 115, 141-152. [7] Edmonds J., Giles R. (1977) A Min-Max relation for submodular functions on graphs, Annals of Discrete Math. 1, 185...projective planes, handwritten manuscript, published: (1990) Polyhedral Combinatorics (W. Cook, P.D. Seymour eds.), DIMACS Series in Discrete Math. and Theoretical Computer Science 1, 101-105. [11] Lovasz L. (1972) Normal hypergraphs and the perfect graph conjecture, Discrete Math. 2, 253-267. [12

  13. An In-Depth Analysis of Teaching Themes and the Quality of Teaching in Higher Education: Evidence from the Programming Education Environments

    ERIC Educational Resources Information Center

    Xia, Belle Selene

    2017-01-01

    Education research in computer science has emphasized the research of web-based learning environments as a result of the latest technological advancement in higher education. Our research aim is to offer new insights on the different teaching strategies in programming education both from a theoretical and empirical point of view as a response to…

  14. Cortical Substrate of Haptic Representation

    DTIC Science & Technology

    1993-08-24

    experience and data from primates, we have developed computational models of short-term active memory. Such models may have technological interest...neurobiological work on primate memory. It is on that empirical work that our current theoretical efforts are founded. Our future physiological research...Academy of Sciences, New York, vol. 608, pp. 318-329, 1990. J.M. Fuster - Behavioral electrophysiology of the prefrontal cortex of the primate. Progress

  15. A Computational Model of Linguistic Humor in Puns.

    PubMed

    Kao, Justine T; Levy, Roger; Goodman, Noah D

    2016-07-01

    Humor plays an essential role in human interactions. Precisely what makes something funny, however, remains elusive. While research on natural language understanding has made significant advancements in recent years, there has been little direct integration of humor research with computational models of language understanding. In this paper, we propose two information-theoretic measures, ambiguity and distinctiveness, derived from a simple model of sentence processing. We test these measures on a set of puns and regular sentences and show that they correlate significantly with human judgments of funniness. Moreover, within a set of puns, the distinctiveness measure distinguishes exceptionally funny puns from mediocre ones. Our work is the first, to our knowledge, to integrate a computational model of general language understanding and humor theory to quantitatively predict humor at a fine-grained level. We present it as an example of a framework for applying models of language processing to understand higher level linguistic and cognitive phenomena. © 2015 The Authors. Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
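
    As a toy illustration only (the paper derives its measures from a generative model of sentence processing, not reproduced here), ambiguity can be rendered as the entropy of the posterior over a pun word's two meanings, and distinctiveness as a symmetrized KL divergence between the context-word distributions each meaning generates; all numbers below are hypothetical:

    ```python
    # Toy versions of the two measures: ambiguity as posterior entropy over
    # a pun's two meanings, distinctiveness as symmetrized KL divergence
    # between the meanings' context-word distributions. Hypothetical numbers.
    import math

    def entropy(p):
        return -sum(x * math.log2(x) for x in p if x > 0)

    def kl(p, q):
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    posterior = [0.55, 0.45]              # P(meaning | sentence)
    ambiguity = entropy(posterior)        # near 1 bit: highly ambiguous

    p_m1 = [0.60, 0.30, 0.05, 0.05]       # P(context word | meaning 1)
    p_m2 = [0.05, 0.05, 0.30, 0.60]       # P(context word | meaning 2)
    distinctiveness = 0.5 * (kl(p_m1, p_m2) + kl(p_m2, p_m1))
    print(ambiguity, distinctiveness)
    ```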

  16. P3: a practice focused learning environment

    NASA Astrophysics Data System (ADS)

    Irving, Paul W.; Obsniuk, Michael J.; Caballero, Marcos D.

    2017-09-01

    There has been an increased focus on the integration of practices into physics curricula, with a particular emphasis on integrating computation into the undergraduate curriculum of scientists and engineers. In this paper, we present a university-level, introductory physics course for science and engineering majors at Michigan State University called P3 (projects and practices in physics) that is centred around providing introductory physics students with the opportunity to appropriate various science and engineering practices. The P3 design integrates computation with analytical problem solving and is built upon a curriculum foundation of problem-based learning, the principles of constructive alignment and the theoretical framework of community of practice. The design includes an innovative approach to computational physics instruction, instructional scaffolds, and a unique approach to assessment that enables instructors to guide students in the development of the practices of a physicist. We present the very positive student-related outcomes of the design gathered via attitudinal and conceptual inventories and research interviews of students reflecting on their experiences in the P3 classroom.

  17. Computational methods in the exploration of the classical and statistical mechanics of celestial scale strings: Rotating Space Elevators

    NASA Astrophysics Data System (ADS)

    Knudsen, Steven; Golubovic, Leonardo

    2015-04-01

    With the advent of ultra-strong materials, the Space Elevator has changed from science fiction to real science. We discuss computational and theoretical methods we developed to explore the classical and statistical mechanics of rotating Space Elevators (RSE). An RSE is a loopy string reaching deep into outer space. The floppy RSE loop executes a motion which is nearly a superposition of two rotations: a geosynchronous rotation around the Earth, and a faster rotational motion of the string around a line perpendicular to the Earth at its equator. Strikingly, objects sliding along the RSE loop spontaneously oscillate between two turning points, one of which is close to the Earth (the starting point) whereas the other lies deep in outer space. The RSE concept thus solves a major problem in space elevator science, namely how to supply energy to the climbers moving along space elevator strings. The exploration of the dynamics of a floppy string interacting with objects sliding along it has required the development of novel finite-element algorithms, described in this presentation. We thank Prof. Duncan Lorimer of WVU for kindly providing us access to his computational facility.

  18. Teacher challenges, perceptions, and use of science models in middle school classrooms about climate, weather, and energy concepts

    NASA Astrophysics Data System (ADS)

    Yarker, Morgan Brown

    Research suggests that scientific models and modeling should be topics covered in K-12 classrooms as part of a comprehensive science curriculum. This is especially important for topics in weather and climate, where computer and forecast models are the center of attention. There are several approaches to model-based inquiry, but it can be argued, theoretically, that science models can be effectively implemented into any approach to inquiry if they are utilized appropriately. Yet it remains to be explored how science models are actually implemented in classrooms. This study qualitatively examines three middle school science teachers' use of science models with various approaches to inquiry during their weather and climate units. Results indicate that the teacher who used the most elements of inquiry used models in a way that aligned better with the theoretical framework than did the teachers who used fewer elements of inquiry. The theoretical framework compares an approach to argument-based inquiry with model-based inquiry and argues that the approaches are essentially identical, so teachers who use inquiry should be able to apply model-based inquiry using the same approach. However, none of the teachers in this study had a complete understanding of the role models play in authentic science inquiry; therefore, students were not explicitly exposed to the ideas that models can be used to make predictions about, and are representations of, a natural phenomenon. Rather, models were explicitly used to explain concepts to students or to have students explain concepts to the teacher or to each other. Additionally, models were used as a focal point for conversation between students, usually as they were creating, modifying, or using models. Teachers were not observed asking students to evaluate models. Since science models are an important aspect of understanding science, it is important that teachers not only know how to implement models in an inquiry environment, but also understand the characteristics of science models so that they can explicitly teach the concept of modeling to students. This study suggests that better pre-service and in-service teacher education is needed to prepare teachers to teach about science models effectively.

  19. FOREWORD: Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology and Mathematics

    NASA Astrophysics Data System (ADS)

    Kaski, K.; Salomaa, M.

    1990-01-01

    These are Proceedings of the Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology, and Mathematics, held August 25-26, 1989, at Lahti (Finland). The Symposium belongs to an annual series of Meetings, the first one of which was arranged in 1987 at Lund (Sweden) and the second one in 1988 at Kolle-Kolle near Copenhagen (Denmark). Although these Symposia have thus far been essentially Nordic events, their international character has increased significantly; the trend is vividly reflected through contributions in the present Topical Issue. The interdisciplinary nature of Computational Science is central to the activity; this fundamental aspect is also responsible, in an essential way, for its rapidly increasing impact. Crucially important to a wide spectrum of superficially disparate fields is the common need for extensive - and often quite demanding - computational modelling. For such theoretical models, no closed-form (analytical) solutions are available or they would be extremely difficult to find; hence one must rather resort to the Art of performing computational investigations. Among the unifying features in the computational research are the methods of simulation employed; methods which frequently are quite closely related with each other even for faculties of science that are quite unrelated. Computer simulation in Natural Sciences is presently apprehended as a discipline on its own right, occupying a broad region somewhere between the experimental and theoretical methods, but also partially overlapping with and complementing them. - Whichever its proper definition may be, the computational approach serves as a novel and an extremely versatile tool with which one can equally well perform "pure" experimental modelling and conduct "computational theory". Computational studies that have earlier been made possible only through supercomputers have opened unexpected, as well as exciting, novel frontiers equally in mathematics (e.g., fractals), physics (fluid-dynamical and quantum-mechanical calculations; extensive numerical simulations of various condensed-matter systems; the development of stellar constellations, even the early Universe), chemistry (quantum-chemical calculations on the structures of new chemical compounds; chemical reactions and reaction dynamics), and biology (various models, for example, in population dynamics). We succeeded in our effort to assemble several internationally recognized researchers of Computational Science to deliver invited talks on a couple of exceptionally beautiful late-summer days in the modern premises of the Adult Education Center at Lahti. Among the plenary speakers, Per Bak described his highly original work on self-organized criticality. David Ceperley discussed pioneering numerical simulations of superfluid helium in which, for the first time, Feynman's path-integral formulation of quantum mechanics has been implemented on a computer. Jim Gunton presented his comprehensive studies of the Cahn-Hilliard equation for the dynamics of ordering in a condensed-matter system far from equilibrium, while Alex Hansen explained those on nonlinear breakdown in disordered materials. Representing the important field of computational chemistry, Bo Jönsson dealt with attractive forces between polyelectrolytes. Kurt Kremer gave an interesting account on computer-simulation studies of complex polymer systems, while Ole Mouritsen reviewed studies of interfacial fluctuations in lipid membranes. 
    Pekka Pyykkö introduced his pioneering work which has led to predictions of completely novel chemical species. Annette Zippelius gave an expert introduction to the highly active field of neural networks. It is evident from each of these intriguing plenary contributions that, indeed, the computational approach is a frontier field of science, possibly providing the most versatile research method available today. We also arranged a competition for the best Posters presented at the Symposium; the Prizes were some of the newest books on the beauty of fractals. The First Prize was won by Hanna Viertio, the Second Prize by Miguel Zendejas and the Third Prize was shared by Leo Kärkkäinen and Kari Rummukainen. As for the future of Computational Science, we identify two principal avenues: (a) big science - large centers with ultrafast supercomputers, and (b) small science - active groups utilizing personal minisupercomputers or superworkstations. At present, it appears that the latter already compete extremely favourably in their performance with the massive supercomputers - at least in their throughput and, especially, in tasks where a broad range of diverse software support is not absolutely necessary. In view of this important emergence of "personal supercomputing", we envisage that the role and the development of large computer centers will have to be reviewed critically and modified accordingly. Furthermore, a promise for some radically new approaches to Computational Science could be provided by massively parallel computers; among them, maybe solutions based on ideas of neural computing could be utilized, especially for restricted applications. Therefore, in order not to overlook any important advances within such a forefront field, one should rather choose the strategy of actively following each and every one of these routes. In view of the large variety of simultaneous developments, we want to emphasize the importance of Nordic collaboration in sharing expertise and experience in the rapidly progressing research - it ought to be cultivated and could be expanded. Therefore, we think that it is vitally important to continue with and to further promote the kind of Nordic Symposia that have been held at Lund, Kolle-Kolle, and Lahti. We want to thank most cordially the plenary and invited speakers, contributors, students, and in particular the Conference Secretary, Ms Ulla Ahlfors and Dr Milja Mäkelä, who was responsible for the local arrangements. The work that they did served to make this Symposium a scientific success and a useful and pleasant experience for all the well over 100 participants. We also thank the City of Lahti for kindly arranging a refreshing reception at the Town Hall. We wish to express our gratitude to Nordiska Kulturfonden, NORDITA, the Research Institute for Theoretical Physics at the University of Helsinki, the Finnish Ministry of Education and the Academy of Finland for their financial support. March 1990

  20. Syllabus Computer in Astronomy

    NASA Astrophysics Data System (ADS)

    Hojaev, Alisher S.

    2015-08-01

    One of the most important and relevant subjects and training courses in the curricula for undergraduate students at the National University of Uzbekistan is ‘Computer Methods in Astronomy’. It covers two semesters and includes both lecture and practice classes. Based on long-term experience, we prepared a tutorial for students which contains a description of modern computer applications in astronomy. The main directions of computer application in the field of astronomy are briefly as follows: (1) automating the process of observation, data acquisition and processing; (2) creating and storing databases (the results of observations, experiments and theoretical calculations), their generalization, classification and cataloging, and working with large databases; (3) solving theoretical problems (physical modeling, mathematical modeling of astronomical objects and phenomena, derivation of model parameters to obtain a solution of the corresponding equations, numerical simulations) and creating the appropriate software; (4) use in the educational process (e-textbooks, presentations, virtual labs, remote education, testing), amateur astronomy and popularization of the science; (5) use as a means of communication and data transfer, for research result presentation and dissemination (web journals), and for the creation of virtual information systems (local and global computer networks). During the classes, special attention is given to practical training and individual work of students, including independent work.

  1. Hands-on approach to teaching Earth system sciences using a information-computational web-GIS portal "Climate"

    NASA Astrophysics Data System (ADS)

    Gordova, Yulia; Gorbatenko, Valentina; Martynova, Yulia; Shulgina, Tamara

    2014-05-01

    Making education relevant to workplace tasks is a key problem of higher education because old-school training programs are not keeping pace with the rapidly changing situation in the professional field of environmental sciences. A joint group of specialists from Tomsk State University and the Siberian Center for Environmental Research and Training/IMCES SB RAS developed several new courses for students in the "Climatology" and "Meteorology" specialties, which combine theoretical knowledge from up-to-date environmental sciences with practical tasks. To organize the educational process we use the open-source course management system Moodle (www.moodle.org), which gave us an opportunity to combine text and multimedia in the theoretical part of the courses. The hands-on approach is realized through innovative trainings performed within the information-computational platform "Climate" (http://climate.scert.ru/) using web-GIS tools. These trainings contain practical tasks on climate modeling and on the assessment and analysis of climate change, and are performed with the typical tools used by scientists doing this kind of research. Thus, students are engaged in the use of modern tools of geophysical data analysis, which stimulates their professional learning. The hands-on approach helps to fill the gap noted above because it is the only approach that offers experience, increases student involvement, and advances the use of modern information and communication tools. The courses are implemented at Tomsk State University and help to form a modern curriculum in the Earth system science area. This work is partially supported by SB RAS project VIII.80.2.1 and RFBR grants 13-05-12034 and 14-05-00502.

  2. Basic Energy Sciences Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Basic Energy Sciences, November 3-5, 2015, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Windus, Theresa; Banda, Michael; Devereaux, Thomas

    Computers have revolutionized every aspect of our lives. Yet in science, the most tantalizing applications of computing lie just beyond our reach. The current quest to build an exascale computer with one thousand times the capability of today’s fastest machines (and more than a million times that of a laptop) will take researchers over the next horizon. The field of materials, chemical reactions, and compounds is inherently complex. Imagine millions of new materials with new functionalities waiting to be discovered — while researchers also seek to extend those materials that are known to a dizzying number of new forms. We could translate massive amounts of data from high precision experiments into new understanding through data mining and analysis. We could have at our disposal the ability to predict the properties of these materials, to follow their transformations during reactions on an atom-by-atom basis, and to discover completely new chemical pathways or physical states of matter. Extending these predictions from the nanoscale to the mesoscale, from the ultrafast world of reactions to long-time simulations to predict the lifetime performance of materials, and to the discovery of new materials and processes will have a profound impact on energy technology. In addition, discovery of new materials is vital to move computing beyond Moore’s law. To realize this vision, more than hardware is needed. New algorithms to take advantage of the increase in computing power, new programming paradigms, and new ways of mining massive data sets are needed as well. This report summarizes the opportunities and the requisite computing ecosystem needed to realize the potential before us. In addition to pursuing new and more complete physical models and theoretical frameworks, this review found that the following broadly grouped areas relevant to the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR) would directly affect the Basic Energy Sciences (BES) mission need. Simulation, visualization, and data analysis are crucial for advances in energy science and technology. Revolutionary mathematical, software, and algorithm developments are required in all areas of BES science to take advantage of exascale computing architectures and to meet data analysis, management, and workflow needs. In partnership with ASCR, BES has an emerging and pressing need to develop new and disruptive capabilities in data science. More capable and larger high-performance computing (HPC) and data ecosystems are required to support priority research in BES. Continued success in BES research requires developing the next-generation workforce through education and training and by providing sustained career opportunities.

  3. Generalized Ultrametric Semilattices of Linear Signals

    DTIC Science & Technology

    2014-01-23

    53–73, 1998. [8] John C. Eidson, Edward A. Lee, Slobodan Matic, Sanjit A. Seshia, and Jia Zou. Distributed real-time software for cyber-physical...Theoretical Computer Science, 16(1):5–24, 1981. [37] Yang Zhao, Jie Liu, and Edward A. Lee. A programming model for time-synchronized distributed real...

  4. Atomistic Design and Simulations of Nanoscale Machines and Assembly

    NASA Technical Reports Server (NTRS)

    Goddard, William A., III; Cagin, Tahir; Walch, Stephen P.

    2000-01-01

    Over the three years of this project, we made significant progress on critical theoretical and computational issues in nanoscale science and technology, particularly in: (1) fullerenes and nanotubes; (2) characterization of surfaces of diamond and silicon for NEMS applications; (3) nanoscale machines and assemblies; (4) organic nanostructures and dendrimers; (5) nanoscale confinement and nanotribology; (6) dynamic response of nanoscale structures and nanowires (metals, tubes, fullerenes); (7) thermal transport in nanostructures.

  5. The Impact of Computer Science on the Development of Oulu ICT during 1985-1990

    NASA Astrophysics Data System (ADS)

    Oinas-Kukkonen, Henry; Similä, Jouni; Pulli, Petri; Oinas-Kukkonen, Harri; Kerola, Pentti

    The region of Oulu has emphasized the importance of the electronics industry for its business growth since the 1960s. After a pitch-dark recession, the region developed in the 1990s into a new, well-established hub of information and communication technology (ICT) in Finland. The city, with its 100,000 inhabitants, was home to nearly 10,000 ICT professionals in 1995. This article contributes to the body of research knowledge by analyzing the role of computer science, in particular information systems and software engineering, in the development of the ICT industry in Oulu in the latter half of the 1980s. The analysis is based on a variety of both primary and secondary sources. The article suggests that system-theoretical and software-oriented research expertise played a key role in the rapid and successful ICT business development of the Oulu region.

  6. Asian consortium on computational materials science theme meeting on "first principles analysis & experiment: Role in energy research", 22-24 September 2016, SRM University, Kattankulathur, Chennai, India (ACCMS-TM 2016)

    NASA Astrophysics Data System (ADS)

    Thapa, Ranjit; Kawazoe, Yoshiyuki

    2017-10-01

    The main objective of this meeting was to provide a platform for theoreticians and experimentalists working in the area of materials to come together and carry out cutting-edge research in the field of energy by showcasing their ideas and innovations. The theme meeting was successful in attracting young researchers from both fields sharing common research interests. The participation of more than 250 researchers in ACCMS-TM 2016 successfully paved the way towards the exchange of mutual research insights and the establishment of promising research collaborations. To encourage the young participants' research efforts, three best posters, each named a "KAWAZOE PRIZE", were selected in the theoretical category, and two best posters, named "ACCMS-TM 2016 POSTER AWARD", were selected for experimental contributions. A new award named the "ACCMS MID-CAREER AWARD" for outstanding scientific contribution in the area of computational materials science was constituted.

  7. Simulating Earthquakes for Science and Society: Earthquake Visualizations Ideal for use in Science Communication and Education

    NASA Astrophysics Data System (ADS)

    de Groot, R.

    2008-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  8. Students' explanations in complex learning of disciplinary programming

    NASA Astrophysics Data System (ADS)

    Vieira, Camilo

    Computational Science and Engineering (CSE) has been called the third pillar of science and a set of important skills for solving the problems of a global society. Along with the theoretical and the experimental approaches, computation offers a third alternative to solve complex problems that require processing large amounts of data or representing complex phenomena that are not easy to experiment with. Despite the relevance of CSE, current professionals and scientists are not well prepared to take advantage of this set of tools and methods. Computation is usually taught in isolation from the engineering disciplines, and therefore engineers do not know how to exploit CSE affordances. This dissertation introduces computational tools and methods contextualized within the Materials Science and Engineering curriculum. Considering that learning how to program is a complex task, the dissertation explores effective pedagogical practices that can support student disciplinary and computational learning. Two case studies are evaluated to identify the characteristics of effective worked examples in the context of CSE. Specifically, this dissertation explores students' explanations of these worked examples in two engineering courses with different levels of transparency: a programming course in materials science and engineering (glass box) and a thermodynamics course involving computational representations (black box). Results from this study suggest that students benefit in different ways from writing in-code comments. These benefits include, but are not limited to: connecting individual lines of code to the overall problem, getting familiar with the syntax, learning effective algorithm design strategies, and connecting computation with their discipline. Students in the glass box context generate higher-quality explanations than students in the black box context. These explanations are related to students' prior experiences. Specifically, students with low programming ability engage in a more thorough explanation process than students with high ability. The dissertation concludes by proposing an adaptation of the instructional principles of worked examples for the context of CSE education.

  9. Experience of validation and tuning of turbulence models as applied to the problem of boundary layer separation on a finite-width wedge

    NASA Astrophysics Data System (ADS)

    Babulin, A. A.; Bosnyakov, S. M.; Vlasenko, V. V.; Engulatova, M. F.; Matyash, S. V.; Mikhailov, S. V.

    2016-06-01

    Modern differential turbulence models are validated by computing a separation zone generated in the supersonic flow past a compression wedge lying on a plate of finite width. The results of three- and two-dimensional computations based on the (q-ω), SST, and Spalart-Allmaras turbulence models are compared with experimental data obtained for 8°, 25°, and 45° wedges by A.A. Zheltovodov at the Institute of Theoretical and Applied Mechanics of the Siberian Branch of the Russian Academy of Sciences. An original law-of-the-wall boundary condition and modifications of the SST model intended for improving the quality of the computed separation zone are described.

  10. 26TH AFOSR Chemical & Atmospheric Sciences Program Review FY81.

    DTIC Science & Technology

    1982-03-01

    AFOSR-80-0020, 2310/A2, N. Larsen, Department of Electrical Engineering, Cornell University, Ithaca, New York 14853. Light Scattering and Absorption, Kuo-Nan...0011; University of Florida 80-0015 (To MRO Contract DAAM 1816 NW G Street 29-78-G-0024), 2310/Al, Gainesville, FL 32601. Atmospheric Absorption of...parameters for use in the theoretical spectroscopy, for updating the transmission/emission codes, and for computing molecular absorption/emission line

  11. JPRS Report, Science & Technology, USSR: Electronics & Electrical Engineering.

    DTIC Science & Technology

    1988-02-23

    calculations or design examples are cited in this purely theoretical treatment; it is noted that experimental data from an on-board microprocessor-controlled ...The requirements placed on the design of the semiconductor devices used in such systems can be divided into two groups: 1) Assure the requisite...describes a computer-aided approach to the design of resonant arrays that results in equal losses in the on and off states of such control devices. An

  12. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

    Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.

  13. New Frontiers in Language Evolution and Development.

    PubMed

    Oller, D Kimbrough; Dale, Rick; Griebel, Ulrike

    2016-04-01

    This article introduces the Special Issue and its focus on research in language evolution with emphasis on theory as well as computational and robotic modeling. A key theme is based on the growth of evolutionary developmental biology or evo-devo. The Special Issue consists of 13 articles organized in two sections: A) Theoretical foundations and B) Modeling and simulation studies. All the papers are interdisciplinary in nature, encompassing work in biological and linguistic foundations for the study of language evolution as well as a variety of computational and robotic modeling efforts shedding light on how language may be developed and may have evolved. Copyright © 2016 Cognitive Science Society, Inc.

  14. Quantum rendering

    NASA Astrophysics Data System (ADS)

    Lanzagorta, Marco O.; Gomez, Richard B.; Uhlmann, Jeffrey K.

    2003-08-01

    In recent years, computer graphics has emerged as a critical component of the scientific and engineering process, and it is recognized as an important computer science research area. Computer graphics are extensively used for a variety of aerospace and defense training systems and by Hollywood's special effects companies. All these applications require the computer graphics systems to produce high quality renderings of extremely large data sets in short periods of time. Much research has been done in "classical computing" toward the development of efficient methods and techniques to reduce the rendering time required for large datasets. Quantum Computing's unique algorithmic features offer the possibility of speeding up some of the known rendering algorithms currently used in computer graphics. In this paper we discuss possible implementations of quantum rendering algorithms. In particular, we concentrate on the implementation of Grover's quantum search algorithm for Z-buffering, ray-tracing, radiosity, and scene management techniques. We also compare the theoretical performance between the classical and quantum versions of the algorithms.
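
    As a rough illustration of the speedup being invoked, the following toy statevector simulation runs Grover iterations to locate a marked Z-buffer entry; the depth data and the oracle construction are assumptions for illustration, and a real quantum renderer would be far more involved:

      import numpy as np

      # Toy simulation of Grover's search over a Z-buffer of N = 2^n depths.
      n = 4
      N = 2 ** n
      rng = np.random.default_rng(0)
      depths = rng.random(N)
      target = int(np.argmin(depths))     # the index the oracle marks

      state = np.full(N, 1 / np.sqrt(N))  # uniform superposition
      iterations = int(np.pi / 4 * np.sqrt(N))
      for _ in range(iterations):
          state[target] *= -1                 # oracle: flip the marked amplitude
          state = 2 * state.mean() - state    # diffusion: inversion about the mean
      print(f"P(target) = {state[target]**2:.3f} after {iterations} iterations")

    Classically, locating the entry requires inspecting all N depths; Grover-style search needs only on the order of √N oracle queries, which is the source of the speedup discussed above.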

  15. Spatio-Temporal Data Analysis at Scale Using Models Based on Gaussian Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stein, Michael

    Gaussian processes are the most commonly used statistical model for spatial and spatio-temporal processes that vary continuously. They are broadly applicable in the physical sciences and engineering and are also frequently used to approximate the output of complex computer models, deterministic or stochastic. We undertook research related to theory, computation, and applications of Gaussian processes as well as some work on estimating extremes of distributions for which a Gaussian process assumption might be inappropriate. Our theoretical contributions include the development of new classes of spatial-temporal covariance functions with desirable properties and new results showing that certain covariance models lead to predictions with undesirable properties. To understand how Gaussian process models behave when applied to deterministic computer models, we derived what we believe to be the first significant results on the large sample properties of estimators of parameters of Gaussian processes when the actual process is a simple deterministic function. Finally, we investigated some theoretical issues related to maxima of observations with varying upper bounds and found that, depending on the circumstances, standard large sample results for maxima may or may not hold. Our computational innovations include methods for analyzing large spatial datasets when observations fall on a partially observed grid and methods for estimating parameters of a Gaussian process model from observations taken by a polar-orbiting satellite. In our application of Gaussian process models to deterministic computer experiments, we carried out some matrix computations that would have been infeasible using even extended precision arithmetic by focusing on special cases in which all elements of the matrices under study are rational and using exact arithmetic. The applications we studied include total column ozone as measured from a polar-orbiting satellite, sea surface temperatures over the Pacific Ocean, and annual temperature extremes at a site in New York City. In each of these applications, our theoretical and computational innovations were directly motivated by the challenges posed by analyzing these and similar types of data.
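
    A minimal sketch of the core Gaussian-process prediction step that such analyses build on (the squared-exponential covariance, synthetic data, and noise level here are all assumptions; the covariance models developed in the project are more elaborate):

      import numpy as np

      # Gaussian-process regression (kriging) with a squared-exponential kernel.
      def sqexp(x1, x2, ell=0.5):
          return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / ell ** 2)

      rng = np.random.default_rng(1)
      X = rng.uniform(0, 5, 20)                       # observation locations
      y = np.sin(X) + 0.1 * rng.standard_normal(20)   # noisy observations
      Xs = np.linspace(0, 5, 100)                     # prediction locations

      K = sqexp(X, X) + 0.01 * np.eye(20)   # covariance plus noise ("nugget")
      Ks = sqexp(Xs, X)
      v = np.linalg.solve(K, Ks.T)          # K^{-1} K_*^T
      mean = Ks @ np.linalg.solve(K, y)     # posterior predictive mean
      var = 1.0 - np.sum(Ks.T * v, axis=0)  # posterior variance (unit prior variance)
      print(f"max posterior std over the grid: {np.sqrt(var.max()):.3f}")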

  16. On the Future of Thermochemical Databases, the Development of Solution Models and the Practical Use of Computational Thermodynamics in Volcanology, Geochemistry and Petrology: Can Innovations of Modern Data Science Democratize an Oligarchy?

    NASA Astrophysics Data System (ADS)

    Ghiorso, M. S.

    2014-12-01

    Computational thermodynamics (CT) has now become an essential tool of petrologic and geochemical research. CT is the basis for the construction of phase diagrams, the application of geothermometers and geobarometers, the equilibrium speciation of solutions, the construction of pseudosections, and calculations of mass transfer between minerals, melts, and fluids, and it provides a means of estimating materials properties for the evaluation of constitutive relations in fluid dynamical simulations. The practical application of CT to Earth science problems requires data: data on the thermochemical properties and the equation of state of relevant materials, and data on the relative stability and partitioning of chemical elements between phases as a function of temperature and pressure. These data must be evaluated and synthesized into a self-consistent collection of theoretical models and model parameters that is colloquially known as a thermodynamic database. Quantitative outcomes derived from CT rely on the existence, maintenance, and integrity of thermodynamic databases. Unfortunately, the community is reliant on too few such databases, developed by a small number of research groups, and mostly under circumstances where refinement and updates to the database lag behind or are unresponsive to need. Given the increasing level of reliance on CT calculations, what is required is a paradigm shift in the way thermodynamic databases are developed, maintained, and disseminated. They must become community resources, with flexible and accessible software interfaces that permit easy modification, while at the same time maintaining theoretical integrity and fidelity to the underlying experimental observations. Advances in computational and data science give us the tools and resources to address this problem, allowing CT results to be obtained at the speed of thought, and permitting geochemical and petrological intuition to play a key role in model development and calibration.

  17. Perspective: Structural fluctuation of protein and Anfinsen's thermodynamic hypothesis

    NASA Astrophysics Data System (ADS)

    Hirata, Fumio; Sugita, Masatake; Yoshida, Masasuke; Akasaka, Kazuyuki

    2018-01-01

    The thermodynamic hypothesis, casually referred to as "Anfinsen's dogma," is described theoretically in terms of the structural fluctuation of a protein, or the first moment (average structure) and the second moment (variance and covariance) of the structural distribution. The new theoretical concept views the unfolding and refolding processes of a protein as a shift of the structural distribution induced by a thermodynamic perturbation, with the variance-covariance matrix varying. Based on this theoretical concept, a method to characterize the mechanism of folding (or unfolding) is proposed. The transition state, if any, between two stable states is interpreted as a gap in the distribution, which is created due to an extensive reorganization of hydrogen bonds among backbone atoms of the protein and with water molecules in the course of conformational change. Further perspectives on applying the theory to computer-aided drug design and to materials science are briefly discussed.
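
    In standard notation (our paraphrase, not the authors' exact equations), the two moments of the structural distribution P(R) over conformations R are

      \langle \mathbf{R} \rangle = \int \mathbf{R}\, P(\mathbf{R})\, d\mathbf{R},
      \qquad
      C_{ij} = \langle (R_i - \langle R_i \rangle)(R_j - \langle R_j \rangle) \rangle,

    with the unfolding/refolding transition viewed as a perturbation-induced shift of the average structure accompanied by a change of the variance-covariance matrix C.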

  18. Caring science and the science of unitary human beings: a trans-theoretical discourse for nursing knowledge development.

    PubMed

    Watson, Jean; Smith, Marlaine C

    2002-03-01

    Two dominant discourses in contemporary nursing theory and knowledge development have evolved over the past few decades, shaped in part by unitary science views and caring theories. Rogers' science of unitary human beings (SUHB) represents the unitary directions in nursing. Caring theories and related caring science (CS) scholarship represent the other. These two contemporary initiatives have generated two parallel, often controversial, seemingly separate and unrelated, trees of knowledge for nursing science. This paper explores the evolution of CS and its intersection with SUHB as they have emerged in contemporary nursing literature. We present a case for integration, convergence, and creative synthesis of CS with SUHB. A trans-theoretical, trans-disciplinary context emerges, allowing nursing to sustain its caring ethic and ontology within a unitary science. The authors critique and review the seminal, critical issues that have separated contemporary knowledge developments in CS and SUHB. Foundational issues of CS, and Watson's theory of transpersonal caring science (TCS) as a specific exemplar, are analysed alongside parallel themes in SUHB. By examining hidden ethical-ontological and paradigmatic commonalities, trans-theoretical themes and connections are explored and revealed between TCS and SUHB. Through a creative synthesis of TCS and SUHB we explicate a distinct unitary view of the human with a relational caring ontology and ethic that informs nursing as well as other sciences. The result is a trans-theoretical, trans-disciplinary view for nursing knowledge development. Nursing's history has been to examine theoretical differences rather than commonalities. This trans-theoretical position moves nursing toward theoretical integration and creative synthesis, rather than separation, away from the 'Balkanization' of different theories. This initiative still maintains the integrity of different theories, while facilitating and inviting a new discourse for nursing science. The result: Unitary Caring Science that evokes both science and spirit.

  19. Benchmark Computation and Finite Element Performance Evaluation for a Rhombic Plate Bending Problem

    DTIC Science & Technology

    1987-09-01

    Physical Science and Technology University of Maryland, College Park, MD 20742, USA and Dip. Matematica - Universita di Pavia - 27100 Pavia - ITALY DTIC...University of Maryland, College Park,, MD 20742, USA , and Dip. Matematica - Universita di Pavia - 27100 Pavia - ITALY SFor Oe" -- 4- I , CA& 11 --l...drawn when based on the state of the art of both theoretical and experience field. The reliability has to be understood not only with respect to a

  20. Optimized Materials From First Principles Simulations: Are We There Yet?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galli, G; Gygi, F

    2005-07-26

    In the past thirty years, the use of scientific computing has become pervasive in all disciplines: collection and interpretation of most experimental data is carried out using computers, and physical models in computable form, with various degrees of complexity and sophistication, are utilized in all fields of science. However, full prediction of physical and chemical phenomena based on the basic laws of Nature, using computer simulations, is a revolution still in the making, and it involves some formidable theoretical and computational challenges. We illustrate the progress and successes obtained in recent years in predicting fundamental properties of materials in condensed phases and at the nanoscale, using ab-initio, quantum simulations. We also discuss open issues related to the validation of the approximate, first principles theories used in large scale simulations, and the resulting complex interplay between computation and experiment. Finally, we describe some applications, with focus on nanostructures and liquids, both at ambient and under extreme conditions.

  1. Using text analysis to quantify the similarity and evolution of scientific disciplines

    PubMed Central

    Dias, Laércio; Scharloth, Joachim

    2018-01-01

    We use an information-theoretic measure of linguistic similarity to investigate the organization and evolution of scientific fields. An analysis of almost 20 M papers from the past three decades reveals that linguistic similarity is related to, but different from, expert and citation-based classifications, leading to an improved view of the organization of science. A temporal analysis of the similarity of fields shows that some fields (e.g. computer science) are becoming increasingly central, but that on average the similarity between pairs of disciplines has not changed in the last decades. This suggests that tendencies of convergence (e.g. multi-disciplinarity) and divergence (e.g. specialization) of disciplines are in balance. PMID:29410857
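
    A minimal sketch in the spirit of such an information-theoretic similarity: the plain Jensen-Shannon divergence between the word-frequency distributions of two texts (the generalized measure and the corpus processing used in the paper differ):

      import numpy as np
      from collections import Counter

      def jsd(text_a, text_b):
          """Jensen-Shannon divergence (bits) between word distributions."""
          ca, cb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
          vocab = sorted(set(ca) | set(cb))
          p = np.array([ca[w] for w in vocab], float); p /= p.sum()
          q = np.array([cb[w] for w in vocab], float); q /= q.sum()
          m = 0.5 * (p + q)
          kl = lambda a, b: np.sum(a[a > 0] * np.log2(a[a > 0] / b[a > 0]))
          return 0.5 * kl(p, m) + 0.5 * kl(q, m)

      print(jsd("the quantum state vector", "the classical state machine"))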

  2. Conceptual Knowledge Acquisition in Biomedicine: A Methodological Review

    PubMed Central

    Payne, Philip R.O.; Mendonça, Eneida A.; Johnson, Stephen B.; Starren, Justin B.

    2007-01-01

    The use of conceptual knowledge collections or structures within the biomedical domain is pervasive, spanning a variety of applications including controlled terminologies, semantic networks, ontologies, and database schemas. A number of theoretical constructs and practical methods or techniques support the development and evaluation of conceptual knowledge collections. This review will provide an overview of the current state of knowledge concerning conceptual knowledge acquisition, drawing from multiple contributing academic disciplines such as biomedicine, computer science, cognitive science, education, linguistics, semiotics, and psychology. In addition, multiple taxonomic approaches to the description and selection of conceptual knowledge acquisition and evaluation techniques will be proposed in order to partially address the apparent fragmentation of the current literature concerning this domain. PMID:17482521

  3. Using text analysis to quantify the similarity and evolution of scientific disciplines.

    PubMed

    Dias, Laércio; Gerlach, Martin; Scharloth, Joachim; Altmann, Eduardo G

    2018-01-01

    We use an information-theoretic measure of linguistic similarity to investigate the organization and evolution of scientific fields. An analysis of almost 20 M papers from the past three decades reveals that linguistic similarity is related to, but different from, expert and citation-based classifications, leading to an improved view of the organization of science. A temporal analysis of the similarity of fields shows that some fields (e.g. computer science) are becoming increasingly central, but that on average the similarity between pairs of disciplines has not changed in the last decades. This suggests that tendencies of convergence (e.g. multi-disciplinarity) and divergence (e.g. specialization) of disciplines are in balance.

  4. The response function of modulated grid Faraday cup plasma instruments

    NASA Technical Reports Server (NTRS)

    Barnett, A.; Olbert, S.

    1986-01-01

    Modulated grid Faraday cup plasma analyzers are a very useful tool for making in situ measurements of space plasmas. One of their great attributes is that their simplicity permits their angular response function to be calculated theoretically. An expression is derived for this response function by computing the trajectories of the charged particles inside the cup. The Voyager Plasma Science (PLS) experiment is used as a specific example. Two approximations to the rigorous response function useful for data analysis are discussed. The theoretical formulas were tested by multi-sensor analysis of solar wind data. The tests indicate that the formulas represent the true cup response function for all angles of incidence with a maximum error of only a few percent.

  5. Engineering brain-computer interfaces: past, present and future.

    PubMed

    Hughes, M A

    2014-06-01

    Electricity governs the function of both nervous systems and computers. Whilst ions move in polar fluids to depolarize neuronal membranes, electrons move in the solid-state lattices of microelectronic semiconductors. Joining these two systems together, to create an iono-electric brain-computer interface, is an immense challenge. However, such interfaces offer (and in select clinical contexts have already delivered) a method of overcoming disability caused by neurological or musculoskeletal pathology. To fulfill their theoretical promise, several specific challenges demand consideration. Rate-limiting steps cover a diverse range of disciplines including microelectronics, neuro-informatics, engineering, and materials science. As those who work at the tangible interface between brain and outside world, neurosurgeons are well placed to contribute to, and inform, this cutting edge area of translational research. This article explores the historical background, status quo, and future of brain-computer interfaces; and outlines the challenges to progress and opportunities available to the clinical neurosciences community.

  6. Toward a Big Data Science: A challenge of "Science Cloud"

    NASA Astrophysics Data System (ADS)

    Murata, Ken T.; Watanabe, Hidenobu

    2013-04-01

    During the past 50 years, along with the appearance and development of high-performance computers (and supercomputers), numerical simulation has come to be considered a third methodology for science, following the theoretical (first) and experimental/observational (second) approaches. The variety of data yielded by the second approach keeps growing, owing to progress in experimental and observational technologies, and the amount of data generated by the third methodology keeps growing as well, owing to the tremendous development of supercomputers and their programming techniques. Most of the data files created by experiments/observations and numerical simulations are saved in digital formats and analyzed on computers. Researchers (domain experts) are interested not only in how to make experiments or observations or perform numerical simulations, but in what information (new findings) to extract from the data. However, data do not usually tell anything about the science by themselves; the science is implicitly hidden in the data, and researchers have to extract information from the data files to find new science. This is the basic concept of data-intensive (data-oriented) science for Big Data. As the scales of experiments, observations, and numerical simulations grow, new techniques and facilities are required to extract information from large numbers of data files; this technique is called informatics, a fourth methodology for new sciences. Any methodology must work on its facilities: for example, the space environment is observed via spacecraft, and numerical simulations are performed on supercomputers. The facility of informatics, which deals with large-scale data, is a computational cloud system for science. This paper proposes a cloud system for informatics that has been developed at NICT (National Institute of Information and Communications Technology), Japan. The NICT science cloud, named OneSpaceNet (OSN), is the first open cloud system for scientists who are going to carry out informatics for their own science. The science cloud is not for simple uses; many functions are expected of it, such as data standardization, data collection and crawling, large and distributed data storage, security and reliability, databases and meta-databases, data stewardship, long-term data preservation, data rescue, data mining, parallel processing, data publication and provision, the semantic web, 3D and 4D visualization, outreach and in-reach, and capacity building. A schematic figure of the NICT science cloud (not shown here) illustrates how both observational and simulation data are stored in the storage system of the science cloud. Note that observational data are of two types: data downloaded through the Internet from archive sites outside the cloud, and data from equipment directly connected to the science cloud, the latter often called sensor clouds. In the present talk, we first introduce the NICT science cloud, and we then demonstrate its efficiency by showing several scientific results achieved with this cloud system. Through these discussions and demonstrations, the potential performance of the science cloud is revealed for many research fields.

  7. Biomolecular computers with multiple restriction enzymes.

    PubMed

    Sakowski, Sebastian; Krasinski, Tadeusz; Waldmajer, Jacek; Sarnik, Joanna; Blasiak, Janusz; Poplawski, Tomasz

    2017-01-01

    The development of conventional, silicon-based computers has several limitations, including some related to the Heisenberg uncertainty principle and the von Neumann "bottleneck". Biomolecular computers based on DNA and proteins are largely free of these disadvantages and, along with quantum computers, are reasonable alternatives to their conventional counterparts in some applications. The idea of a DNA computer proposed by Ehud Shapiro's group at the Weizmann Institute of Science was developed using one restriction enzyme as hardware and DNA fragments (the transition molecules) as software and input/output signals. This computer represented a two-state two-symbol finite automaton that was subsequently extended by using two restriction enzymes. In this paper, we propose the idea of a multistate biomolecular computer with multiple commercially available restriction enzymes as hardware. Additionally, an algorithmic method for the construction of transition molecules in the DNA computer based on the use of multiple restriction enzymes is presented. We use this method to construct multistate, biomolecular, nondeterministic finite automata with four commercially available restriction enzymes as hardware. We also describe an experimental application of this theoretical model to a biomolecular finite automaton made of four endonucleases.
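
    The computational abstraction realized in the wet lab is a nondeterministic finite automaton; a minimal software sketch of one follows (the states, transitions, and input below are invented stand-ins for the enzyme and transition-molecule hardware):

      # (state, symbol) -> set of possible next states
      transitions = {
          ("S0", "a"): {"S0", "S1"},
          ("S0", "b"): {"S0"},
          ("S1", "b"): {"S2"},
      }
      accepting = {"S2"}

      def accepts(word, start="S0"):
          current = {start}
          for symbol in word:
              current = set().union(*(transitions.get((s, symbol), set()) for s in current))
          return bool(current & accepting)

      print(accepts("aab"))  # True: one possible run ends in the accepting state S2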

  8. Double photoionization of Be-like (Be-F5+) ions

    NASA Astrophysics Data System (ADS)

    Abdel Naby, Shahin; Pindzola, Michael; Colgan, James

    2015-04-01

    The time-dependent close-coupling method is used to study the single photon double ionization of Be-like (Be-F5+) ions. Energy and angle differential cross sections are calculated to fully investigate the correlated motion of the two photoelectrons. Symmetric and antisymmetric amplitudes are presented along the isoelectronic sequence for different energy sharing of the emitted electrons. Our total double photoionization cross sections are in good agreement with available theoretical results and experimental measurements along the Be-like ions. This work was supported in part by grants from NSF and US DoE. Computational work was carried out at NERSC in Oakland, California and the National Institute for Computational Sciences in Knoxville, Tennessee.

  9. Modelling the spread of innovation in wild birds.

    PubMed

    Shultz, Thomas R; Montrey, Marcel; Aplin, Lucy M

    2017-06-01

    We apply three plausible algorithms in agent-based computer simulations to recent experiments on social learning in wild birds. Although some of the phenomena are simulated by all three learning algorithms, several manifestations of social conformity bias are simulated by only the approximate majority (AM) algorithm, which has roots in chemistry, molecular biology and theoretical computer science. The simulations generate testable predictions and provide several explanatory insights into the diffusion of innovation through a population. The AM algorithm's success raises the possibility of its usefulness in studying group dynamics more generally, in several different scientific domains. Our differential-equation model matches simulation results and provides mathematical insights into the dynamics of these algorithms. © 2017 The Author(s).
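
    A minimal agent-based sketch of the approximate-majority protocol referred to (the population size, initial split, and this exact three-state rule set are illustrative assumptions):

      import random

      def approximate_majority(n_x, n_y, steps=200_000, seed=0):
          rng = random.Random(seed)
          pop = ["X"] * n_x + ["Y"] * n_y      # no undecided ("B") agents initially
          for _ in range(steps):
              i, j = rng.sample(range(len(pop)), 2)  # random pairwise interaction
              a, b = pop[i], pop[j]
              if {a, b} == {"X", "Y"}:
                  pop[j] = "B"                 # disagreement -> one becomes undecided
              elif a == "B" and b != "B":
                  pop[i] = b                   # undecided adopts the other's opinion
              elif b == "B" and a != "B":
                  pop[j] = a
              if len(set(pop)) == 1:
                  break                        # consensus reached
          return {s: pop.count(s) for s in ("X", "Y", "B")}

      print(approximate_majority(60, 40))  # the initial majority usually wins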

  10. Applications of artificial intelligence to scientific research

    NASA Technical Reports Server (NTRS)

    Prince, Mary Ellen

    1986-01-01

    Artificial intelligence (AI) is a growing field which is just beginning to make an impact on disciplines other than computer science. While a number of military and commercial applications were undertaken in recent years, few attempts were made to apply AI techniques to basic scientific research. There is no inherent reason for the discrepancy. The characteristics of the problem, rather than its domain, determine whether or not it is suitable for an AI approach. Expert systems, intelligent tutoring systems, and learning programs are examples of theoretical topics which can be applied to certain areas of scientific research. Further research and experimentation should eventually make it possible for computers to act as intelligent assistants to scientists.

  11. Meta-Theoretical Contributions to the Constitution of a Model-Based Didactics of Science

    NASA Astrophysics Data System (ADS)

    Ariza, Yefrin; Lorenzano, Pablo; Adúriz-Bravo, Agustín

    2016-10-01

    There is nowadays consensus in the community of didactics of science (i.e. science education understood as an academic discipline) regarding the need to include the philosophy of science in didactical research, science teacher education, curriculum design, and the practice of science education in all educational levels. Some authors have identified an ever-increasing use of the concept of `theoretical model', stemming from the so-called semantic view of scientific theories. However, it can be recognised that, in didactics of science, there are over-simplified transpositions of the idea of model (and of other meta-theoretical ideas). In this sense, contemporary philosophy of science is often blurred or distorted in the science education literature. In this paper, we address the discussion around some meta-theoretical concepts that are introduced into didactics of science due to their perceived educational value. We argue for the existence of a `semantic family', and we characterise four different versions of semantic views existing within the family. In particular, we seek to contribute to establishing a model-based didactics of science mainly supported in this semantic family.

  12. Dynamic Information Management and Exchange for Command and Control Applications, Modelling and Enforcing Category-Based Access Control via Term Rewriting

    DTIC Science & Technology

    2015-03-01

    a hotel and a hospital. 2. Event handler for emergency policies (item 2 above): this has been implemented in two UG projects, one project developed a...Workshop on Logical and Se- mantic Frameworks, with Applications, Brasilia, Brazil , September 2014. Electronic Notes in Theoretical Computer Science (to...Brasilia, Brazil , September 2014, 2015. [3] S. Barker. The next 700 access control models or a unifying meta-model? In SACMAT 2009, 14th ACM Symposium on

  13. Proceedings of the Annual Acquisition Research Symposium (7th), Acquisition Research: Creating Synergy for Informed Change 12-13 May 2010. Volume 2

    DTIC Science & Technology

    2010-04-30

    delivered enhances both the teaching and learning processes. • The number of students engaged in focused acquisition research for their MBA projects...Meyers, US Navy—Lieutenant Nicholas Meyers is an MBA student in the Graduate School of Business & Public Policy at the Naval Postgraduate School . LT...Theoretic Computer Science Mathematics and Operations Research Werner Heisenberg-Weg 39 85577 Neubiberg, Germany Phone +49 89 6004 2400 Abstract

  14. US-Latin American Workshop on Molecular and Materials Sciences: Theoretical and Computational Aspects Held at the University of Florida, Gainesville, on February 8-10, 1994

    DTIC Science & Technology

    1994-08-09

    Observables During a Collision Inst. de Fisica , Cuernavaca, Mexico Ruben D. Santiago Acosta An Algebraic Model for 3-dimensional Atom-Diatom Inst C...STRUCTURES. MOLECULAR DYNAMICS SIMULATION M. C .Donnamaria and J. R. Grigera Instituto de Fisica de Liquidos y Sistemas Biologicos (IFLYSIB),CONICET...Crybiology, 1981, 18, 631. ACKNOWLEDGMENTS This work has been partially funded by the Consejo Nacional de Investigaciones Cientificas y Tecnicas (CONICET) of

  15. Final Reports for Contract N00014-87-K-0181 (University of Hawaii, School of Ocean and Earth Science and Technology)

    DTIC Science & Technology

    1994-09-01

    CONTENT A. Administration B. Dynamics of Small-scale Ocean Motions (P. Muller) C. Seismic Anisotropy ( G . Fryer) D. Low Frequency Modulus Measurements...Manghnani G . Marching the Elastodynamic Wave Equation (N. Frazer) H. Theoretical & Computational Studies in Marine Seismology (N. Frazer) I. Correction and...Publication. and in the summary article: Muller, P.,E. D’Asaro and G . Holloway, 1991: Internal Gravity Waves and Mixing. EOS, T:ansactions, American

  16. Solving constant-coefficient differential equations with dielectric metamaterials

    NASA Astrophysics Data System (ADS)

    Zhang, Weixuan; Qu, Che; Zhang, Xiangdong

    2016-07-01

    Recently, the concept of metamaterial analog computing has been proposed (Silva et al 2014 Science 343 160-3). Some mathematical operations, such as spatial differentiation, integration, and convolution, have been performed using designed metamaterial blocks. Motivated by this work, we propose a practical approach based on dielectric metamaterials to solve differential equations. Ordinary differential equations can be solved accurately by a correctly designed metamaterial system. Numerical simulations using well-established numerical routines have been performed to successfully verify all theoretical analyses.
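
    The class of equations involved can be checked numerically for comparison; a finite-difference sketch for the constant-coefficient ODE y'' + a y' + b y = f(x), with coefficients, forcing, and initial conditions assumed for illustration:

      import numpy as np

      a, b = 1.0, 4.0
      x = np.linspace(0.0, 10.0, 2001)
      h = x[1] - x[0]
      f = np.sin(2 * x)                  # forcing term

      y = np.zeros_like(x)               # y(0) = 0 and y'(0) = 0 (y[0] = y[1] = 0)
      for i in range(1, len(x) - 1):
          # central differences for y'' and y', solved for y[i+1]
          y[i + 1] = (f[i] * h**2 + (2 - b * h**2) * y[i]
                      - (1 - a * h / 2) * y[i - 1]) / (1 + a * h / 2)
      print(f"y(10) = {y[-1]:.4f}")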

  17. PREFACE: ELC International Meeting on Inference, Computation, and Spin Glasses (ICSG2013)

    NASA Astrophysics Data System (ADS)

    Kabashima, Yoshiyuki; Hukushima, Koji; Inoue, Jun-ichi; Tanaka, Toshiyuki; Watanabe, Osamu

    2013-12-01

    The close relationship between probability-based inference and statistical mechanics of disordered systems has been noted for some time. This relationship has provided researchers with a theoretical foundation in various fields of information processing for analytical performance evaluation and construction of efficient algorithms based on message-passing or Monte Carlo sampling schemes. The ELC International Meeting on 'Inference, Computation, and Spin Glasses (ICSG2013)', was held in Sapporo 28-30 July 2013. The meeting was organized as a satellite meeting of STATPHYS25 in order to offer a forum where concerned researchers can assemble and exchange information on the latest results and newly established methodologies, and discuss future directions of the interdisciplinary studies between statistical mechanics and information sciences. Financial support from Grant-in-Aid for Scientific Research on Innovative Areas, MEXT, Japan 'Exploring the Limits of Computation (ELC)' is gratefully acknowledged. We are pleased to publish 23 papers contributed by invited speakers of ICSG2013 in this volume of Journal of Physics: Conference Series. We hope that this volume will promote further development of this highly vigorous interdisciplinary field between statistical mechanics and information/computer science. Editors and ICSG2013 Organizing Committee: Koji Hukushima Jun-ichi Inoue (Local Chair of ICSG2013) Yoshiyuki Kabashima (Editor-in-Chief) Toshiyuki Tanaka Osamu Watanabe (General Chair of ICSG2013)

  18. A phenomenological study on middle-school science teachers' perspectives on utilization of technology in the science classroom and its effect on their pedagogy

    NASA Astrophysics Data System (ADS)

    Rajbanshi, Roshani

    With widespread access to technology and the mainstream expectation that it be used, technology has become an essential part of the classroom. The problem in science education, however, is that even in classrooms filled with technological equipment, the teaching style remains didactic, and teachers employ traditional teacher-centered methods. In addition, results of international assessments indicate that students' science learning needs to be improved. The purpose of this study is to analyze and document the lived experience of middle-school science teachers and their use of technology in their personal and professional lives as well as in their classrooms, and to describe the phenomenon of middle-school science teachers' technological beliefs regarding the integration of digital devices as an instructional delivery tool, a knowledge construction tool, and a learning tool. For this study, technology is defined as digital devices such as computers, laptops, digital cameras, and iPads that are used in the science classroom in those three roles. Constructivism is the lens, the theoretical framework, that guides this qualitative phenomenological research. Observation, interviews, personal journals, photo elicitation, and journal reflections were used as methods of data collection. Data were analyzed through a constructivist theoretical framework to construct knowledge and draw conclusions; MAXQDA, a qualitative analysis software package, was also used to analyze the data. The findings indicate that middle-school science teachers use technology in various ways to engage and motivate students in science learning; however, multiple factors influence teachers' technology use in the class. In conclusion, teachers, students, and technology form the three sides of a triangle, where technology acts as the third side, the bridge connecting teachers' content knowledge to students through a tool with which students are familiar. Keywords: teachers' beliefs, science and technology, knowledge construction.

  19. Women's decision to major in STEM fields

    NASA Astrophysics Data System (ADS)

    Conklin, Stephanie

    This paper explores the lived experiences of high school female students who choose to enter STEM fields and describes the influencing factors that steered these women toward majors in computer science, engineering, and biology. Utilizing phenomenological methodology, this study seeks to understand the essence of women's decisions to enter STEM fields and to describe how the decision-making process varies for women in high-female-enrollment fields, like biology, as compared with low-enrollment fields, like computer science and engineering. Using Bloom's 3-Stage Theory, this study analyzes how relationships, experiences, and barriers influenced women toward, and possibly away from, STEM fields. An analysis of women's experiences highlights that family support, sustained experience in a STEM program during high school, and the presence of an influential teacher were all salient factors in steering women toward STEM fields. Participants explained that influential teachers worked individually with them, modified and extended assignments, and steered participants toward coursework and experiences. This study also identifies factors, like guidance counselors as well as personal challenges, that inhibited participants' paths to STEM fields. Further, analysis of all six participants' experiences makes clear that a linear model like Bloom's 3-Stage Model, with limited ability to include potential barriers, cannot capture the essence of each participant's decision-making process. Therefore, a revised model with no linear progression, which allows for emerging factors like personal challenges, is proposed; this model focuses on how interest in STEM fields begins to develop, is honed, and is then mastered. This study also sought to identify key differences in the paths of female students pursuing different majors. The findings suggest that the path to computer science and engineering is narrow: computer science majors faced few, if any, challenges, hoped to use computers as a tool to innovate, and had all participated in the same computer science program. For female engineering students, the essence of their experience centered on interaction at a young age with an expert in an engineering-related field and a strong desire to help solve world problems using engineering; these participants were able to articulate clear future career goals. In contrast, biology majors faced more challenges and were undecided about their future career goals. These results suggest that a longitudinal study focused on women pursuing engineering and computer science is warranted; this will hopefully allow these findings to be substantiated and the revised theoretical model to be refined.

  20. Meteor Observations as Big Data Citizen Science

    NASA Astrophysics Data System (ADS)

    Gritsevich, M.; Vinkovic, D.; Schwarz, G.; Nina, A.; Koschny, D.; Lyytinen, E.

    2016-12-01

    Meteor science represents an excellent example of the citizen science project, where progress in the field has been largely determined by amateur observations. Over the last couple of decades technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era that requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of micro-physics of meteor plasma and its interaction with the atmosphere. The recent increased interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established BigSkyEarth http://bigskyearth.eu/ network.

  1. QMC Goes BOINC: Using Public Resource Computing to Perform Quantum Monte Carlo Calculations

    NASA Astrophysics Data System (ADS)

    Rainey, Cameron; Engelhardt, Larry; Schröder, Christian; Hilbig, Thomas

    2008-10-01

    Theoretical modeling of magnetic molecules traditionally involves the diagonalization of quantum Hamiltonian matrices. However, as the complexity of these molecules increases, the matrices become so large that this process becomes unusable. An additional challenge to this modeling is that many repetitive calculations must be performed, further increasing the need for computing power. Both of these obstacles can be overcome by using a quantum Monte Carlo (QMC) method and a distributed computing project. We have recently implemented a QMC method within the Spinhenge@home project, which is a Public Resource Computing (PRC) project where private citizens allow part-time usage of their PCs for scientific computing. The use of PRC for scientific computing will be described in detail, as well as how you can contribute to the project. See, e.g., L. Engelhardt et al., Angew. Chem. Int. Ed. 47, 924 (2008); C. Schröder, in Distributed & Grid Computing - Science Made Transparent for Everyone. Principles, Applications and Supporting Communities (Weber, M.H.W., ed., 2008). Project URL: http://spin.fh-bielefeld.de
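
    As a classical, much-simplified stand-in for the QMC method described, a Metropolis Monte Carlo loop over a 1D Ising ring illustrates the kind of repetitive stochastic sampling that PRC distributes across volunteer PCs (system size, coupling, and temperature are assumed):

      import numpy as np

      rng = np.random.default_rng(42)
      N, J, T = 64, 1.0, 2.0
      spins = rng.choice([-1, 1], size=N)

      def site_energy(i):
          return -J * spins[i] * (spins[(i - 1) % N] + spins[(i + 1) % N])

      for sweep in range(2000):
          for _ in range(N):
              i = int(rng.integers(N))
              dE = -2 * site_energy(i)     # energy change if spin i is flipped
              if dE <= 0 or rng.random() < np.exp(-dE / T):
                  spins[i] *= -1           # Metropolis acceptance step
      print("magnetization per spin:", spins.mean())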

  2. Quantum lattice model solver HΦ

    NASA Astrophysics Data System (ADS)

    Kawamura, Mitsuaki; Yoshimi, Kazuyoshi; Misawa, Takahiro; Yamaji, Youhei; Todo, Synge; Kawashima, Naoki

    2017-08-01

    HΦ [aitch-phi] is a program package based on the Lanczos-type eigenvalue solution applicable to a broad range of quantum lattice models, i.e., arbitrary quantum lattice models with two-body interactions, including the Heisenberg model, the Kitaev model, the Hubbard model and the Kondo-lattice model. While it works well on PCs and PC-clusters, HΦ also runs efficiently on massively parallel computers, which considerably extends the tractable range of the system size. In addition, unlike most existing packages, HΦ supports finite-temperature calculations through the method of thermal pure quantum (TPQ) states. In this paper, we explain the theoretical background and user interface of HΦ. We also show benchmark results of HΦ on supercomputers such as the K computer at RIKEN Advanced Institute for Computational Science (AICS) and SGI ICE XA (Sekirei) at the Institute for Solid State Physics (ISSP).
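
    A bare-bones sketch of the Lanczos-type iteration HΦ builds on, applied to a small S = 1/2 Heisenberg ring (the system size and iteration count are assumptions, and no re-orthogonalization is performed in this toy version; HΦ itself handles far larger Hilbert spaces):

      import numpy as np

      L = 8
      dim = 2 ** L

      def apply_H(v):
          """Apply the Heisenberg Hamiltonian H = sum_i S_i . S_{i+1} to v."""
          w = np.zeros_like(v)
          for s in range(dim):
              for i in range(L):
                  j = (i + 1) % L
                  if ((s >> i) & 1) == ((s >> j) & 1):
                      w[s] += 0.25 * v[s]                       # aligned: Sz Sz
                  else:
                      w[s] -= 0.25 * v[s]                       # anti-aligned: Sz Sz
                      w[s ^ (1 << i) ^ (1 << j)] += 0.5 * v[s]  # spin-flip term
          return w

      rng = np.random.default_rng(0)
      v = rng.standard_normal(dim); v /= np.linalg.norm(v)
      v_prev, beta = np.zeros(dim), 0.0
      alphas, betas = [], []
      for _ in range(30):
          w = apply_H(v) - beta * v_prev    # three-term Lanczos recurrence
          alpha = v @ w
          w -= alpha * v
          alphas.append(alpha)
          beta = np.linalg.norm(w)
          betas.append(beta)
          v_prev, v = v, w / beta
      T = np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
      print("ground-state energy per site:", np.linalg.eigvalsh(T)[0] / L)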

  3. A Diverse Community To Study Communities: Integration of Experiments and Mathematical Models To Study Microbial Consortia.

    PubMed

    Succurro, Antonella; Moejes, Fiona Wanjiku; Ebenhöh, Oliver

    2017-08-01

    The last few years have seen the advancement of high-throughput experimental techniques that have produced an extraordinary amount of data. Bioinformatics and statistical analyses have become instrumental to interpreting the information coming from, e.g., sequencing data and often motivate further targeted experiments. The broad discipline of "computational biology" extends far beyond the well-established field of bioinformatics, but it is our impression that more theoretical methods such as the use of mathematical models are not yet as well integrated into the research studying microbial interactions. The empirical complexity of microbial communities presents challenges that are difficult to address with in vivo / in vitro approaches alone, and with microbiology developing from a qualitative to a quantitative science, we see stronger opportunities arising for interdisciplinary projects integrating theoretical approaches with experiments. Indeed, the addition of in silico experiments, i.e., computational simulations, has a discovery potential that is, unfortunately, still largely underutilized and unrecognized by the scientific community. This minireview provides an overview of mathematical models of natural ecosystems and emphasizes that one critical point in the development of a theoretical description of a microbial community is the choice of problem scale. Since this choice is mostly dictated by the biological question to be addressed, in order to employ theoretical models fully and successfully it is vital to implement an interdisciplinary view at the conceptual stages of the experimental design. Copyright © 2017 Succurro et al.
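
    One common model class in this literature is generalized Lotka-Volterra dynamics; a minimal sketch with an invented three-species community (growth rates and interaction matrix are assumptions for illustration):

      import numpy as np
      from scipy.integrate import solve_ivp

      r = np.array([0.8, 0.5, 0.6])        # intrinsic growth rates
      A = np.array([[-1.0, -0.3, -0.1],    # negative diagonal: self-limitation
                    [-0.2, -1.0,  0.4],    # A[i, j] > 0: species j benefits i
                    [ 0.1, -0.5, -1.0]])

      def glv(t, x):
          # dx_i/dt = x_i * (r_i + sum_j A_ij * x_j)
          return x * (r + A @ x)

      sol = solve_ivp(glv, (0.0, 50.0), [0.1, 0.1, 0.1])
      print("abundances at t = 50:", sol.y[:, -1])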

  4. Gordian Knots of Prevision: The lessons of history

    NASA Astrophysics Data System (ADS)

    Fleming, J. R.

    2017-12-01

    Atmospheric researchers have long attempted to untie the Gordian Knot of meteorology—that intractable and intertwined tangle of observational imprecision, theoretical uncertainties, and non-linear influences—that, if unraveled, would provide perfect prevision of the weather for ten days, of seasonal conditions for the year, and of climatic conditions for a decade, a century, a millennium, or longer. This presentation, based on Inventing Atmospheric Science (M.I.T. Press, 2016), examines the work of four interconnected generations of scientists (Vilhelm Bjerknes, C.-G. Rossby, Harry Wexler, Ed Lorenz) and the influence of four transformative technologies (radio, nuclear, computation, aerospace) from the dawn of applied fluid dynamics to the emergence of the interdisciplinary atmospheric sciences and the new Gordian Knot of chaos.

  5. The National Cancer Institute's Physical Sciences - Oncology Network

    NASA Astrophysics Data System (ADS)

    Espey, Michael Graham

    In 2009, the NCI launched the Physical Sciences - Oncology Centers (PS-OC) initiative with 12 Centers (U54) funded through 2014. The current phase of the Program includes U54 funded Centers with the added feature of soliciting new Physical Science - Oncology Projects (PS-OP) U01 grant applications through 2017; see NCI PAR-15-021. The PS-OPs, individually and along with other PS-OPs and the Physical Sciences-Oncology Centers (PS-OCs), comprise the Physical Sciences-Oncology Network (PS-ON). The foundation of the Physical Sciences-Oncology initiative is a high-risk, high-reward program that promotes a `physical sciences perspective' of cancer and fosters the convergence of physical science and cancer research by forming transdisciplinary teams of physical scientists (e.g., physicists, mathematicians, chemists, engineers, computer scientists) and cancer researchers (e.g., cancer biologists, oncologists, pathologists) who work closely together to advance our understanding of cancer. The collaborative PS-ON structure catalyzes transformative science through increased exchange of people, ideas, and approaches. PS-ON resources are leveraged to fund Trans-Network pilot projects to enable synergy and cross-testing of experimental and/or theoretical concepts. This session will include a brief PS-ON overview followed by a strategic discussion with the APS community to exchange perspectives on the progression of trans-disciplinary physical sciences in cancer research.

  6. PREFACE: New trends in Computer Simulations in Physics and not only in physics

    NASA Astrophysics Data System (ADS)

    Shchur, Lev N.; Krashakov, Serge A.

    2016-02-01

    In this volume we have collected papers based on the presentations given at the International Conference on Computer Simulations in Physics and beyond (CSP2015), held in Moscow, September 6-10, 2015. We hope that this volume will be helpful and scientifically interesting for readers. The Conference was organized for the first time through the joint efforts of the Moscow Institute for Electronics and Mathematics (MIEM) of the National Research University Higher School of Economics, the Landau Institute for Theoretical Physics, and the Science Center in Chernogolovka. The name of the Conference emphasizes the multidisciplinary nature of computational physics, whose methods are applied to a broad range of current research in science and society. The choice of venue was motivated by the multidisciplinary character of the MIEM, a formerly independent university which has recently become part of the National Research University Higher School of Economics. The Conference Computer Simulations in Physics and beyond (CSP) is planned to be organized biennially. This year's Conference featured 99 presentations, including 21 plenary and invited talks ranging from the analysis of Irish myths with recent methods of statistical physics to computing with the novel quantum computers D-Wave and D-Wave2. This volume covers various areas of computational physics and emerging subjects within the computational physics community. Each section was preceded by invited talks presenting the latest algorithms and methods in computational physics, as well as new scientific results. Both parallel and poster sessions paid special attention to numerical methods, applications, and results. For all the abstracts presented at the conference, please follow the link http://csp2015.ac.ru/files/book5x.pdf

  7. The Theoretical Astrophysical Observatory: Cloud-based Mock Galaxy Catalogs

    NASA Astrophysics Data System (ADS)

    Bernyk, Maksym; Croton, Darren J.; Tonini, Chiara; Hodkinson, Luke; Hassan, Amr H.; Garel, Thibault; Duffy, Alan R.; Mutch, Simon J.; Poole, Gregory B.; Hegarty, Sarah

    2016-03-01

    We introduce the Theoretical Astrophysical Observatory (TAO), an online virtual laboratory that houses mock observations of galaxy survey data. Such mocks have become an integral part of the modern analysis pipeline. However, building them requires expert knowledge of galaxy modeling and simulation techniques, significant investment in software development, and access to high performance computing. These requirements make it difficult for a small research team or individual to quickly build a mock catalog suited to their needs. To address this, TAO offers access to multiple cosmological simulations and semi-analytic galaxy formation models from an intuitive and clean web interface. Results can be funnelled through science modules and sent to a dedicated supercomputer for further processing and manipulation. These modules include the ability to (1) construct custom observer light cones from the simulation data cubes; (2) generate the stellar emission from star formation histories, apply dust extinction, and compute absolute and/or apparent magnitudes; and (3) produce mock images of the sky. All of TAO's features can be accessed without any programming requirements. The modular nature of TAO opens it up for further expansion in the future.

  8. ISMB Conference Funding to Support Attendance of Early Researchers and Students

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaasterland, Terry

    ISMB Conference Funding for Students and Young Scientists: Historical Description. The Intelligent Systems for Molecular Biology (ISMB) conference has provided a general forum for disseminating the latest developments in bioinformatics on an annual basis for the past 22 years. ISMB is a multidisciplinary conference that brings together scientists from computer science, molecular biology, mathematics and statistics. The goal of the ISMB meeting is to bring together biologists and computational scientists in a focus on actual biological problems, i.e., not simply theoretical calculations. The combined focus on "intelligent systems" and actual biological data makes ISMB a unique and highly important meeting. 21 years of experience in holding the conference has resulted in a consistently well-organized, well attended, and highly respected annual conference. "Intelligent systems" include any software which goes beyond straightforward, closed-form algorithms or standard database technologies, and encompasses those that view data in a symbolic fashion, learn from examples, consolidate multiple levels of abstraction, or synthesize results to be cognitively tractable to a human, including the development and application of advanced computational methods for biological problems. Relevant computational techniques include, but are not limited to: machine learning, pattern recognition, knowledge representation, databases, combinatorics, stochastic modeling, string and graph algorithms, linguistic methods, robotics, constraint satisfaction, and parallel computation. Biological areas of interest include molecular structure, genomics, molecular sequence analysis, evolution and phylogenetics, molecular interactions, metabolic pathways, regulatory networks, developmental control, and molecular biology generally. Emphasis is placed on the validation of methods using real data sets, on practical applications in the biological sciences, and on development of novel computational techniques. The ISMB conferences are distinguished from many other conferences in computational biology or artificial intelligence by an insistence that the researchers work with real molecular biology data, not theoretical or toy examples; and from many other biological conferences by providing a forum for technical advances as they occur, which otherwise may be shunned until a firm experimental result is published. The resulting intellectual richness and cross-disciplinary diversity provides an important opportunity for both students and senior researchers. ISMB has become the premier conference series in this field with refereed, published proceedings, establishing an infrastructure to promote the growing body of research.

  9. Transformation in the pharmaceutical industry--a systematic review of the literature.

    PubMed

    Shafiei, Nader; Ford, James L; Morecroft, Charles W; Lisboa, Paulo J; Taylor, Mark J; Mouzughi, Yusra

    2013-01-01

    The evolutionary development of pharmaceutical transformation was studied through a systematic review of the literature. Fourteen triggers were identified that will affect the pharmaceutical business, regulatory science, and enabling technologies in future years. The relative importance ranking of the transformation triggers was computed based on their prevalence within the articles studied. The four main triggers with the strongest literature evidence were Fully Integrated Pharma Network, Personalized Medicine, Translational Research, and Pervasive Computing. The theoretical quality risks for each of the four main transformation triggers are examined, and the remaining ten triggers are described. The pharmaceutical industry is currently going through changes that affect the way it performs its research, manufacturing, and regulatory activities (this is termed pharmaceutical transformation). The impact of these changes on approaches to quality risk management requires more understanding. In this paper, a comprehensive review of the academic, regulatory, and industry literature was used to identify 14 triggers that influence pharmaceutical transformation. The four main triggers, namely Fully Integrated Pharma Network, Personalized Medicine, Translational Research, and Pervasive Computing, were selected as the most important based on the strength of the evidence found during the literature review activity described in this paper. Theoretical quality risks for each of the four main transformation triggers are examined, and the remaining ten triggers are described.

  10. Know Your Discipline: Teaching the Philosophy of Computer Science

    ERIC Educational Resources Information Center

    Tedre, Matti

    2007-01-01

    The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…

  11. A decision-theoretic approach to the display of information for time-critical decisions: The Vista project

    NASA Technical Reports Server (NTRS)

    Horvitz, Eric; Ruokangas, Corinne; Srinivas, Sampath; Barry, Matthew

    1993-01-01

    We describe a collaborative research and development effort between the Palo Alto Laboratory of the Rockwell Science Center, Rockwell Space Operations Company, and the Propulsion Systems Section of NASA JSC to design computational tools that can manage the complexity of information displayed to human operators in high-stakes, time-critical decision contexts. We shall review an application from NASA Mission Control and describe how we integrated a probabilistic diagnostic model and a time-dependent utility model, with techniques for managing the complexity of computer displays. Then, we shall describe the behavior of VPROP, a system constructed to demonstrate promising display-management techniques. Finally, we shall describe our current research directions on the Vista 2 follow-on project.

  12. High Performance Computer Cluster for Theoretical Studies of Roaming in Chemical Reactions

    DTIC Science & Technology

    2016-08-30

    A dedicated high-performance computer cluster was acquired for theoretical studies of roaming in chemical reactions. (The remainder of the record is report-form extraction residue; sponsoring agency: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211.)

  13. The Materials Genome Project

    NASA Astrophysics Data System (ADS)

    Aourag, H.

    2008-09-01

    In the past, the search for new and improved materials was characterized mostly by the use of empirical, trial-and-error methods. This picture of materials science has been changing as the knowledge and understanding of the fundamental processes governing a material's properties and performance (namely, composition, structure, history, and environment) have increased. In a number of cases, it is now possible to predict a material's properties before it has even been manufactured, thus greatly reducing the time spent on testing and development. The objective of modern materials science is to tailor a material (starting with its chemical composition, constituent phases, and microstructure) in order to obtain a desired set of properties suitable for a given application. In the short term, the traditional "empirical" methods for developing new materials will be complemented to a greater degree by theoretical predictions. In some areas, computer simulation is already used by industry to weed out costly or improbable synthesis routes. Can novel materials with optimized properties be designed by computers? Advances in modelling methods at the atomic level, coupled with rapid increases in computer capabilities over the last decade, have led scientists to answer this question with a resounding "yes". The ability to design new materials from quantum mechanical principles with computers is currently one of the fastest growing and most exciting areas of theoretical research in the world. The methods allow scientists to evaluate and prescreen new materials "in silico", rather than through time-consuming experimentation. The Materials Genome Project pursues the theory of large-scale modeling as well as powerful methods to construct new materials with optimized properties. Indeed, it is the intimate synergy between our ability to predict accurately from quantum theory how atoms can be assembled to form new materials and our capacity to synthesize novel materials atom-by-atom that gives the Materials Genome Project its extraordinary intellectual vitality. Consequently, in designing new materials through computer simulation, our primary objective is to rapidly screen possible designs to find those few that will enhance the competitiveness of industries or have positive benefits to society. Examples include screening of cancer drugs, advances in catalysis for energy production, design of new alloys and multilayers, and processing of semiconductors.
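    A toy version of the screening loop described above can be written in a few lines: predict a property for each candidate and keep those inside a target window. In the sketch below the predictor is a stand-in for a real quantum-mechanical calculation, and the compositions and band-gap values are invented for illustration.

        def predicted_band_gap(candidate):
            # Placeholder for a DFT-level property calculation (values invented).
            return {"A2B": 1.1, "AB": 0.0, "AB3": 2.4}[candidate]

        candidates = ["A2B", "AB", "AB3"]
        hits = [c for c in candidates if 1.0 <= predicted_band_gap(c) <= 2.0]
        print(hits)  # -> ['A2B']: the only candidate inside the target window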

  14. Highly Active and Stable MgAl2O4 Supported Rh and Ir Catalysts for Methane Steam Reforming: A Combined Experimental and Theoretical Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei, Donghai; Glezakou, Vassiliki Alexandra; Lebarbier, Vanessa MC

    2014-07-01

    In this work we present a combined experimental and theoretical investigation of stable MgAl2O4 spinel-supported Rh and Ir catalysts for the steam methane reforming (SMR) reaction. First, catalytic performance for a series of noble metal catalysts supported on MgAl2O4 spinel was evaluated for SMR at 600-850°C. Turnover rate at 850°C follows the order: Pd > Pt > Ir > Rh > Ru > Ni. However, Rh and Ir were found to have the best combination of activity and stability for methane steam reforming in the presence of simulated biomass-derived syngas. It was found that highly dispersed ~2 nm Rh and ~1 nm Ir clusters were formed on the MgAl2O4 spinel support. Scanning Transmission Electron Microscopy (STEM) images show that excellent dispersion was maintained even under challenging high-temperature conditions (e.g., at 850°C in the presence of steam), while Ir and Rh catalysts supported on Al2O3 were observed to sinter at increased rates under the same conditions. These observations were further confirmed by ab initio molecular dynamics (AIMD) simulations, which find that ~1 nm Rh and Ir particles (50-atom clusters) bind strongly to the MgAl2O4 surfaces via a redox process leading to a strong metal-support interaction, thus helping anchor the metal clusters and reduce the tendency to sinter. Density functional theory (DFT) calculations suggest that these supported smaller Rh and Ir particles have a lower work function than larger, more bulk-like ones, which enables them to activate both water and methane more effectively than larger particles, yet has a minimal influence on the relative stability of coke precursors. In addition, theoretical mechanistic studies were used to probe the relationship between structure and reactivity. Consistent with the experimental observations, our theoretical modeling results also suggest that the small spinel-supported Ir particle catalyst is more active than its Rh counterpart for SMR. This work was financially supported by the United States Department of Energy (DOE)'s Bioenergy Technologies Office (BETO) and performed at the Pacific Northwest National Laboratory (PNNL). PNNL is a multi-program national laboratory operated for DOE by Battelle Memorial Institute. Computing time was granted by a user proposal at the Molecular Science Computing Facility in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) located at PNNL. Part of the computational time was provided by the National Energy Research Scientific Computing Center (NERSC).

  15. Perceptions of teaching and learning automata theory in a college-level computer science course

    NASA Astrophysics Data System (ADS)

    Weidmann, Phoebe Kay

    This dissertation identifies and describes student and instructor perceptions that contribute to effective teaching and learning of Automata Theory in a competitive college-level Computer Science program. Effective teaching is the ability to create an appropriate learning environment in order to provide effective learning. We define effective learning as the ability of a student to meet instructor-set learning objectives, demonstrating this by passing the course, while reporting a good learning experience. We conducted our investigation through a detailed qualitative case study of two sections (118 students) of Automata Theory (CS 341) at The University of Texas at Austin taught by Dr. Lily Quilt. Because Automata Theory has a fixed curriculum, in the sense that many curricula and textbooks agree on what Automata Theory contains and differ mainly in the depth and amount of material covered in a single course, a case study allows for generalizable findings. Automata Theory is especially problematic in a Computer Science curriculum since students are not experienced in abstract thinking before taking this course, fail to understand the relevance of the theory, and prefer classes with more concrete activities such as programming. This creates a special challenge for any instructor of Automata Theory, as motivation becomes critical for student learning. Through the use of student surveys, instructor interviews, classroom observation, and material and course grade analysis, we sought to understand what students perceived, what instructors expected of students, and how those perceptions played out in the classroom in terms of structure and instruction. Our goal was to create suggestions that would lead to a better-designed course and thus a higher student success rate in Automata Theory. We created a unique theoretical basis, pedagogical positivism, on which to study college-level courses. Pedagogical positivism states that through examining instructor and student perceptions of teaching and learning, improvements to a course are possible. These improvements can eventually develop a "best practice" instructional environment. This view is not possible under a strictly constructivist learning theory, as there is no way to teach a group of individuals in a "best" way. Using this theoretical basis, we examined the gathered data from CS 341. (Abstract shortened by UMI.)

  16. Complex network problems in physics, computer science and biology

    NASA Astrophysics Data System (ADS)

    Cojocaru, Radu Ionut

    There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases and the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that there are some problems which are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well-known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set, and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground-state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of a spin glass on the Bethe lattice at zero temperature and then apply this formalism to the K-SAT problem defined in Chapter 1. The phase transition which physicists study often corresponds to a change in the computational complexity of the corresponding computer science problem. Chapter 3 presents phase transitions which are specific to the problems discussed in Chapter 1 and also known results for the K-SAT problem. We discuss the replica method and experimental evidence of replica symmetry breaking. The physics approach to hard problems is based on replica methods, which are difficult to understand. In Chapter 4 we develop novel methods for studying hard problems using methods similar to the message-passing techniques that were discussed in Chapter 2. Although we concentrated on the symmetric case, cavity methods show promise for generalizing our methods to the asymmetric case. As has been highlighted by John Hopfield, several key features of biological systems are not shared by physical systems. Although living entities follow the laws of physics and chemistry, the fact that organisms adapt and reproduce introduces an essential ingredient that is missing in the physical sciences. Many algorithms have been developed to extract information from networks. In Chapter 5 we apply polynomial algorithms such as minimum spanning tree to study and construct gene regulatory networks from experimental data. As future work we propose the use of algorithms such as min-cut/max-flow and Dijkstra's for understanding key properties of these networks.
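    The minimum-spanning-tree step mentioned for Chapter 5 is a standard polynomial algorithm. The sketch below is a minimal Kruskal implementation in Python over an invented gene-pair dissimilarity list; it illustrates the technique only, not the thesis's actual data or code.

        def kruskal(nodes, edges):
            """Kruskal's MST: greedily add cheapest edges that join components."""
            parent = {n: n for n in nodes}
            def find(x):  # union-find root lookup with path halving
                while parent[x] != x:
                    parent[x] = parent[parent[x]]
                    x = parent[x]
                return x
            tree = []
            for w, u, v in sorted(edges):
                ru, rv = find(u), find(v)
                if ru != rv:          # edge connects two components: keep it
                    parent[ru] = rv
                    tree.append((u, v, w))
            return tree

        genes = ["gA", "gB", "gC", "gD"]
        # (dissimilarity, gene, gene): weights would come from expression profiles.
        dissim = [(0.2, "gA", "gB"), (0.9, "gA", "gC"), (0.4, "gB", "gC"), (0.7, "gC", "gD")]
        print(kruskal(genes, dissim))  # three edges spanning the four genes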

  17. Biomolecular computers with multiple restriction enzymes

    PubMed Central

    Sakowski, Sebastian; Krasinski, Tadeusz; Waldmajer, Jacek; Sarnik, Joanna; Blasiak, Janusz; Poplawski, Tomasz

    2017-01-01

    The development of conventional, silicon-based computers has several limitations, including some related to the Heisenberg uncertainty principle and the von Neumann “bottleneck”. Biomolecular computers based on DNA and proteins are largely free of these disadvantages and, along with quantum computers, are reasonable alternatives to their conventional counterparts in some applications. The idea of a DNA computer proposed by Ehud Shapiro's group at the Weizmann Institute of Science was developed using one restriction enzyme as hardware and DNA fragments (the transition molecules) as software and input/output signals. This computer represented a two-state two-symbol finite automaton that was subsequently extended by using two restriction enzymes. In this paper, we propose the idea of a multistate biomolecular computer with multiple commercially available restriction enzymes as hardware. Additionally, an algorithmic method for the construction of transition molecules in the DNA computer based on the use of multiple restriction enzymes is presented. We use this method to construct multistate, biomolecular, nondeterministic finite automata with four commercially available restriction enzymes as hardware. We also describe an experimental application of this theoretical model to a biomolecular finite automaton made of four endonucleases. PMID:29064510
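    Viewed purely as software, the nondeterministic finite automaton such a device implements can be simulated by tracking the set of reachable states. The sketch below is a minimal Python illustration with invented states and transitions; it mirrors the computational model only, not the enzyme-based encoding.

        # (state, symbol) -> set of possible next states (hypothetical NFA).
        delta = {
            ("s0", "a"): {"s0", "s1"},
            ("s0", "b"): {"s0"},
            ("s1", "b"): {"s2"},
        }
        accepting = {"s2"}

        def nfa_accepts(word, start="s0"):
            current = {start}
            for sym in word:
                # Union of all transitions available from any currently reachable state.
                current = set().union(*(delta.get((q, sym), set()) for q in current))
            return bool(current & accepting)

        print(nfa_accepts("aab"))  # True via the run s0 -a-> s0 -a-> s1 -b-> s2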

  18. Motivating students' participation in a computer networks course by means of magic, drama and games.

    PubMed

    Hilas, Constantinos S; Politis, Anastasios

    2014-01-01

    The recent economic crisis has forced many universities to cut down expenses by packing students into large lecture groups. The problem with large auditoria is that they discourage dialogue between students and faculty and hinder participation. Adding to this, students in computer science courses usually find the field to be full of theoretical and technical concepts. Lack of understanding leads them to lose interest and/or motivation. Classroom experience shows that the lecturer can employ alternative teaching methods, especially for early-year undergraduate students, in order to capture their interest and introduce basic concepts. This paper describes some of the approaches that may be used to keep students interested and make them feel comfortable as they learn basic concepts in computer networks. The lecturing procedure was enriched with games, magic tricks and dramatic representations. This approach was used experimentally for two semesters and the results were more than encouraging.

  19. How the World Gains Understanding of a Planet: Analysis of Scientific Understanding in Earth Sciences and of the Communication of Earth-Scientific Explanation

    NASA Astrophysics Data System (ADS)

    Voute, S.; Kleinhans, M. G.; de Regt, H.

    2010-12-01

    A scientific explanation for a phenomenon is based on relevant theory and on initial and background conditions. Scientific understanding, on the other hand, requires intelligibility, which means that a scientist can recognise qualitative characteristic consequences of a theory without doing the actual calculations, and apply it to develop further explanations and predictions. If explanation and understanding are indeed fundamentally different, then it may be possible to convey understanding of earth-scientific phenomena to laymen without the full theoretical background. The aim of this thesis is to analyze how scientists and laymen gain scientific understanding in the Earth Sciences, based on the newest insights in the philosophy of science, pedagogy, and science communication. All three disciplines have something to say about how humans learn and understand, even if at the very different levels of scientists, students, children or the general public. If different disciplines with different approaches identify and quantify the same theory in the same manner, then there is likely to be something “real” behind the theory. Comparing the methodologies and learning styles of the different disciplines within the Earth Sciences, and critically analyzing earth-scientific exhibitions in different museums, may provide insight into the different approaches to earth-scientific explanation and communication. In order to gain earth-scientific understanding, a broad suite of tools is used, such as maps and images, symbols and diagrams, cross-sections and sketches, categorization and classification, modelling, laboratory experiments, (computer) simulations and analogies, remote sensing, and fieldwork. All these tools have a dual nature, containing both theoretical and embodied components. Embodied knowledge is created by doing the actual modelling, intervening in experiments and doing fieldwork. Scientific practice includes discovery and exploration, data collection and analysis, verification or falsification, and conclusions that must be well grounded and argued. The intelligibility of theories is improved by the combination of these two types of understanding. This is also attested by the fact that both theoretical and embodied skills are considered essential for the training of university students at all levels. However, from surprised and confounded reactions of the public to natural disasters it appears that just showing scientific results is not enough to convey scientific understanding to the public. By using the tools that earth scientists use to develop explanations and achieve understanding, laymen could achieve understanding as well without rigorous theoretical training. We are presently investigating in science museums whether engaging the public in scientific activities based on embodied skills leads to understanding of earth-scientific phenomena by laymen.

  20. International Conference on Intelligent Systems for Molecular Biology (ISMB)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Debra; Hibbs, Matthew; Kall, Lukas

    The Intelligent Systems for Molecular Biology (ISMB) conference has provided a general forum for disseminating the latest developments in bioinformatics on an annual basis for the past 13 years. ISMB is a multidisciplinary conference that brings together scientists from computer science, molecular biology, mathematics and statistics. The goal of the ISMB meeting is to bring together biologists and computational scientists in a focus on actual biological problems, i.e., not simply theoretical calculations. The combined focus on "intelligent systems" and actual biological data makes ISMB a unique and highly important meeting, and 13 years of experience in holding the conference has resulted in a consistently well-organized, well-attended, and highly respected annual conference. The ISMB 2005 meeting was held June 25-29, 2005 at the Renaissance Center in Detroit, Michigan. The meeting attracted over 1,730 attendees. The science presented was exceptional, and in the course of the five-day meeting, 56 scientific papers, 710 posters, 47 oral abstracts, 76 software demonstrations, and 14 tutorials were presented. The attendees represented a broad spectrum of backgrounds, with 7% from commercial companies, over 28% qualifying for student registration, and 41 countries represented at the conference, emphasizing its important international aspect. The ISMB conference is especially important because the cultures of computer science and biology are so disparate. ISMB, as a full-scale technical conference with refereed proceedings that have been indexed by both MEDLINE and Current Contents since 1996, bridges this cultural gap.

  1. Modeling of scattering from ice surfaces

    NASA Astrophysics Data System (ADS)

    Dahlberg, Michael Ross

    Theoretical research is proposed to study electromagnetic wave scattering from ice surfaces. A mathematical formulation is developed that is more representative of the electromagnetic scattering from ice, with volume mechanisms included, and capable of handling multiple scattering effects. This research is essential to advancing the field of environmental science and engineering by enabling more accurate inversion of remote sensing data. The results of this research contributed towards a more accurate representation of the scattering from ice surfaces, one that is computationally more efficient and can be applied to many remote-sensing applications.

  2. On quantum models of the human mind.

    PubMed

    Wang, Hongbin; Sun, Yanlong

    2014-01-01

    Recent years have witnessed rapidly increasing interest in developing quantum theoretical models of human cognition. Quantum mechanisms have been taken seriously to describe how the mind reasons and decides. Papers in this special issue report the newest results in the field. Here we discuss why the two levels of commitment, treating the human brain as a quantum computer and merely adopting abstract quantum probability principles to model human cognition, should be integrated. We speculate that quantum cognition models gain greater modeling power due to a richer representation scheme. Copyright © 2013 Cognitive Science Society, Inc.

  3. Proceedings of US-Latin American Workshop on Molecular and Materials Sciences: Theoretical and Computational Aspects Held in Gainesville, Florida on 10-12 March 1993

    DTIC Science & Technology

    1994-08-09

    [OCR-garbled proceedings front matter; recoverable fragments include a contribution on iterative Bogoliubov transformations and applications (Jáuregui, Instituto de Física, UNAM, Cuernavaca, México; with J. Récamier and Peter J. Reynolds), affiliations at Brigham Young University, Provo, UT 84602, and at the Departamento de Física da UFPE, Recife, and Instituto de Física da USP, São Paulo, Brazil, and an abstract fragment noting that Gamow states are solutions to the Schrödinger equation.]

  4. The theoretical cognitive process of visualization for science education.

    PubMed

    Mnguni, Lindelani E

    2014-01-01

    The use of visual models such as pictures, diagrams and animations in science education is increasing. This is because of the complex nature of the concepts in the field. Students, especially entrant students, often report misconceptions and learning difficulties associated with various concepts, especially those that exist at a microscopic level, such as DNA, the gene and meiosis, as well as those that operate over relatively large time scales, such as evolution. However, the role of visual literacy in the construction of knowledge in science education has not been investigated much. This article explores the theoretical process of visualization, answering the question "how can visual literacy be understood based on the theoretical cognitive process of visualization in order to inform the understanding, teaching and studying of visual literacy in science education?" Based on various theories of cognitive processes during learning in science and general education, the author argues that the theoretical process of visualization consists of three stages, namely, Internalization of Visual Models, Conceptualization of Visual Models and Externalization of Visual Models. The application of this theoretical cognitive process of visualization and the stages of visualization in science education is discussed.

  5. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.
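    As a small taste of the inter-distribution relations such a resource catalogs, the sketch below compares a Binomial(n, p) with large n and small p against its classical Poisson(n*p) approximation using scipy.stats. The particular n and p are arbitrary choices, and the Distributome webapps themselves are interactive rather than script-based.

        from scipy import stats

        n, p = 1000, 0.002
        binom = stats.binom(n, p)          # exact distribution
        pois = stats.poisson(n * p)        # limiting approximation, rate n*p = 2

        for k in range(6):                 # the two pmfs nearly coincide
            print(k, round(binom.pmf(k), 5), round(pois.pmf(k), 5))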

  6. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191

  7. Research on application of intelligent computation based LUCC model in urbanization process

    NASA Astrophysics Data System (ADS)

    Chen, Zemin

    2007-06-01

    Global change research is an interdisciplinary, comprehensive research activity conducted through international cooperation; it arose in the 1980s and has the broadest of scopes. The interaction between land use and cover change (LUCC), a research field at the crossing of natural and social science, has become one of the core subjects of global change research as well as its front edge and focus. It is necessary to develop research on land use and cover change in the urbanization process and to build a simulation model of urbanization in order to describe, simulate and analyze the dynamic behavior of urban development change and to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategies. The effect of urbanization on land use and cover change is mainly embodied in changes to the quantity structure and spatial structure of urban space, and the LUCC model of the urbanization process has been an important research subject of urban geography and urban planning. In this paper, building upon previous research achievements, the author systematically analyzes research on land use/cover change in the urbanization process with the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automaton (CA) model of complexity science and multi-agent theory; and extends the Markov model, the traditional CA model and the agent model, introducing complexity science and intelligent computation theory into LUCC research to build an intelligent-computation-based LUCC model for simulation research on land use and cover change in urbanization, with case studies. The concrete contents are as follows: 1. The complexity of LUCC research in the urbanization process. The urbanization process is analyzed in combination with the contents of complexity science and the concept of complexity features in order to reveal the complexity features of LUCC research. An urban spatial system is a complex economic and cultural phenomenon as well as a social process; it is the comprehensive characterization of urban society, economy and culture, and a complex spatial system formed by society, economy and nature, with dissipative-structure characteristics such as openness, dynamics, self-organization and non-equilibrium. Traditional models cannot simulate these social, economic and natural driving forces of LUCC, including the main feedback relations from LUCC back to the driving forces. 2. Establishment of an extended Markov model for LUCC simulation in the urbanization process. First, traditional LUCC research models are used to compute the rate of regional land-use change by calculating the dynamic degree, exploitation degree and consumption degree of land use; fuzzy-set theory is then used to rewrite the traditional Markov model, establish the structure transfer matrix of land use, and forecast and analyze the dynamic change and development trend of land use, and noticeable problems and corresponding measures in the urbanization process are presented according to the research results. 3. Application of intelligent computation and complexity science methods to the LUCC simulation model of the urbanization process. On the basis of a detailed elaboration of the theory and models of LUCC research in the urbanization process, the problems of existing models (namely, their difficulty in resolving the many complexity phenomena of a complex urban spatial system) are analyzed, and possible structures for LUCC simulation research are discussed in combination with the theories of intelligent computation and complexity science. BP artificial neural networks and genetic algorithms from intelligent computation, and the CA model and multi-agent system (MAS) technology from complexity science, are analyzed in application; their theoretical origins and characteristics are discussed in detail; their feasibility for LUCC simulation research is elaborated; and improvement methods and measures for the existing problems of these models are brought forward. 4. Establishment of an LUCC simulation model of the urbanization process based on the theories of intelligent computation and complexity science. Based on the above research on BP artificial neural networks, genetic algorithms, the CA model and multi-agent technology, improvement methods and prospective geographic extensions are put forward; an LUCC simulation model of the urbanization process based on the CA model and the agent model is built; the learning mechanism of the BP artificial neural network is combined with fuzzy logic reasoning, so that rules are expressed as explicit formulas and the initial rules are amended through self-learning; and the network structure of the LUCC simulation model and the methods and procedures for its parameters are optimized with genetic algorithms. In this paper, I introduce the research theory and methods of complexity science into LUCC simulation research and present an LUCC simulation model based on the CA model and MAS theory. Meanwhile, I extend the traditional Markov model correspondingly and introduce fuzzy-set theory into the data screening and parameter amendment of the improved model to improve the accuracy and feasibility of the Markov model for research on land use/cover change.
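    The Markov component above admits a compact worked example: posit (or estimate from two survey dates) a land-use transition matrix and project class shares forward. The sketch below is a minimal Python illustration with invented classes and transition fractions, not the paper's calibrated model.

        import numpy as np

        classes = ["urban", "farmland", "forest"]
        P = np.array([          # P[i, j]: fraction of class i converting to class j per period
            [0.98, 0.01, 0.01],
            [0.10, 0.85, 0.05],
            [0.02, 0.03, 0.95],
        ])
        state = np.array([0.20, 0.50, 0.30])   # current area shares (sum to 1)

        for _ in range(3):                      # project three periods ahead
            state = state @ P
        print(dict(zip(classes, state.round(3))))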

  8. Learning from Massive Distributed Data Sets (Invited)

    NASA Astrophysics Data System (ADS)

    Kang, E. L.; Braverman, A. J.

    2013-12-01

    Technologies for remote sensing and ever-expanding computer experiments in climate science are generating massive data sets. Meanwhile, it has become common in all areas of large-scale science to have these 'big data' distributed over multiple physical locations, and moving large amounts of data can be impractical. In this talk, we will discuss efficient ways to summarize and learn from distributed data. We formulate a graphical model to mimic the main characteristics of a distributed-data network, including the size of the data sets and the speed of moving data. With this nominal model, we investigate the trade-off between prediction accuracy and the cost of data movement, theoretically and through simulation experiments. We will also discuss new implementations of spatial and spatio-temporal statistical methods optimized for distributed data.
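    One simple instance of learning without moving the data: each site computes sufficient statistics locally and ships only those, and the center reconstructs global quantities exactly. The Python sketch below pools counts, sums, and sums of squares into a global mean and variance; the site sizes and moments are invented, and this illustrates the general idea rather than the talk's graphical model.

        # (n, sum_x, sum_x2) computed locally at each archive (invented values).
        sites = [
            (1_000_000, 5.0e6, 2.6e7),
            (2_500_000, 1.3e7, 6.9e7),
        ]

        N = sum(n for n, _, _ in sites)
        S = sum(s for _, s, _ in sites)
        Q = sum(q for _, _, q in sites)

        mean = S / N
        var = Q / N - mean ** 2        # population variance from pooled moments
        print(N, round(mean, 4), round(var, 4))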

  9. Towards a truer multicultural science education: how whiteness impacts science education

    NASA Astrophysics Data System (ADS)

    Le, Paul T.; Matias, Cheryl E.

    2018-03-01

    The hope for multicultural, culturally competent, and diverse perspectives in science education falls short if theoretical considerations of whiteness are not entertained. Since whiteness is characterized as a hegemonic racial dominance that has become so natural it is almost invisible, this paper identifies how whiteness operates in science education such that it falls short of its goal for cultural diversity. Because literature in science education has yet to fully entertain whiteness ideology, this paper offers one of the first theoretical postulations. Drawing from the fields of education, legal studies, and sociology, this paper employs critical whiteness studies as both a theoretical lens and an analytic tool to re-interpret how whiteness might impact science education. Doing so allows the field to reconsider benign, routine, or normative practices and protocol that may influence how future scientists of Color experience the field. In sum, we seek to have the field consider the theoretical frames of whiteness and how it might influence how we engage in science education such that our hope for diversity never fully materializes.

  10. The Biological Relevance of Artificial Life: Lessons from Artificial Intelligence

    NASA Technical Reports Server (NTRS)

    Colombano, Silvano

    2000-01-01

    There is no fundamental reason why A-life couldn't simply be a branch of computer science that deals with algorithms that are inspired by, or emulate, biological phenomena. However, if these are the limits we place on this field, we miss the opportunity to help advance Theoretical Biology and to contribute to a deeper understanding of the nature of life. The history of Artificial Intelligence provides a good example, in that early interest in the nature of cognition was quickly lost to the process of building tools, such as "expert systems", that were certainly useful but provided little insight into the nature of cognition. Based on this lesson, I will discuss criteria for increasing the biological relevance of A-life and the probability that this field may provide a theoretical foundation for Biology.

  11. What's statistical about learning? Insights from modelling statistical learning as a set of memory processes

    PubMed Central

    2017-01-01

    Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274, 1926–1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105, 2745–2750; Thiessen & Yee 2010 Child Development 81, 1287–1303; Saffran 2002 Journal of Memory and Language 47, 172–196; Misyak & Christiansen 2012 Language Learning 62, 302–331). The difference among these tasks raises questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39, 246–263; Thiessen et al. 2013 Psychological Bulletin 139, 792–814). From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik 2013 Cognitive Science 37, 310–343). This article is part of the themed issue ‘New frontiers for statistical learning in the cognitive sciences'. PMID:27872374
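    The sequential statistic discussed in both versions of this article, transitional probability, is easy to compute directly: TP(y | x) = count(xy) / count(x) over the stream. The sketch below works through it on an invented syllable stream in the spirit of the classic segmentation studies.

        from collections import Counter

        stream = "ba bi ku ba bi ku go la tu ba bi ku".split()
        pairs = Counter(zip(stream, stream[1:]))   # counts of adjacent syllable pairs
        firsts = Counter(stream[:-1])              # counts of first elements

        def tp(x, y):
            return pairs[(x, y)] / firsts[x]

        print(tp("ba", "bi"))  # within-word transition: 1.0
        print(tp("ku", "go"))  # across a word boundary: lower (here 0.5)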

  12. What's statistical about learning? Insights from modelling statistical learning as a set of memory processes.

    PubMed

    Thiessen, Erik D

    2017-01-05

    Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274, 1926-1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105, 2745-2750; Thiessen & Yee 2010 Child Development 81, 1287-1303; Saffran 2002 Journal of Memory and Language 47, 172-196; Misyak & Christiansen 2012 Language Learning 62, 302-331). The difference among these tasks raises questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39, 246-263; Thiessen et al. 2013 Psychological Bulletin 139, 792-814). From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik 2013 Cognitive Science 37, 310-343). This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).

  13. Theoretical basis, experimental design, and computerized simulation of synergism and antagonism in drug combination studies.

    PubMed

    Chou, Ting-Chao

    2006-09-01

    The median-effect equation derived from the mass-action law principle at equilibrium-steady state via mathematical induction and deduction for different reaction sequences and mechanisms and different types of inhibition has been shown to be the unified theory for the Michaelis-Menten equation, Hill equation, Henderson-Hasselbalch equation, and Scatchard equation. It is shown that dose and effect are interchangeable via defined parameters. This general equation for the single drug effect has been extended to the multiple drug effect equation for n drugs. These equations provide the theoretical basis for the combination index (CI)-isobologram equation that allows quantitative determination of drug interactions, where CI < 1, = 1, and > 1 indicate synergism, additive effect, and antagonism, respectively. Based on these algorithms, computer software has been developed to allow automated simulation of synergism and antagonism at all dose or effect levels. It displays the dose-effect curve, median-effect plot, combination index plot, isobologram, dose-reduction index plot, and polygonogram for in vitro or in vivo studies. This theoretical development, experimental design, and computerized data analysis have facilitated dose-effect analysis for single drug evaluation or carcinogen and radiation risk assessment, as well as for drug or other entity combinations in a vast field of disciplines of biomedical sciences. In this review, selected examples of applications are given, and step-by-step examples of experimental designs and real data analysis are also illustrated. The merging of the mass-action law principle with mathematical induction-deduction has been proven to be a unique and effective scientific method for general theory development. The median-effect principle and its mass-action law based computer software are gaining increased applications in biomedical sciences, from how to effectively evaluate a single compound or entity to how to beneficially use multiple drugs or modalities in combination therapies.
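    The algebra described above can be turned into a few lines of code: the median-effect equation fa/fu = (D/Dm)^m inverts to Dx = Dm * (fa/(1-fa))^(1/m), and for two drugs the combination index is CI = d1/Dx1 + d2/Dx2, with CI < 1 indicating synergism. The Python sketch below uses invented Dm, m, and dose values purely for illustration; it is not the authors' released software.

        def dose_for_effect(fa, Dm, m):
            """Dose producing fractional effect fa, from the median-effect equation."""
            return Dm * (fa / (1.0 - fa)) ** (1.0 / m)

        # Hypothetical single-drug parameters: median-effect dose Dm and slope m.
        Dx1 = dose_for_effect(0.5, Dm=2.0, m=1.5)   # drug 1 alone, for 50% effect
        Dx2 = dose_for_effect(0.5, Dm=8.0, m=2.0)   # drug 2 alone, for 50% effect

        d1, d2 = 0.7, 2.8                            # doses used together for the same effect
        CI = d1 / Dx1 + d2 / Dx2
        print(round(CI, 2), "synergism" if CI < 1 else "additive/antagonism")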

  14. Overview of ARO Program on Network Science for Human Decision Making

    NASA Astrophysics Data System (ADS)

    West, Bruce J.

    This program brings together researchers from disparate disciplines to work on a complex research problem that defies confinement within any single discipline. Consequently, not only are new and rewarding solutions sought and obtained for a problem of importance to society and the Army, that is, the human dimension of complex networks, but, in addition, collaborations are established that would not otherwise have formed given the traditional disciplinary compartmentalization of research. This program develops the basic research foundation of a science of networks supporting the linkage between the physical and human (cognitive and social) domains as they relate to human decision making. The strategy is to extend the recent methods of non-equilibrium statistical physics to non-stationary, renewal stochastic processes that appear to be characteristic of the interactions among nodes in complex networks. We also pursue understanding of the phenomenon of synchronization, whose mathematical formulation has recently provided insight into how complex networks reach accommodation and cooperation. The theoretical analyses of complex networks, although mathematically rigorous, often elude analytic solutions and require computer simulation and computation to analyze the underlying dynamic process.
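    The synchronization phenomenon mentioned above is conventionally illustrated with the Kuramoto model, in which coupled phase oscillators move toward accommodation as the coupling strength grows. The sketch below is a minimal all-to-all Kuramoto simulation in Python; the population size, frequencies, coupling, and step size are arbitrary choices, not the program's models.

        import numpy as np

        rng = np.random.default_rng(0)
        N, K, dt = 50, 2.5, 0.05
        omega = rng.normal(0.0, 1.0, N)        # natural frequencies
        theta = rng.uniform(0, 2 * np.pi, N)   # initial phases

        for _ in range(2000):
            # coupling[i] = (1/N) * sum_j sin(theta[j] - theta[i])
            coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
            theta += dt * (omega + K * coupling)

        r = abs(np.exp(1j * theta).mean())     # order parameter: 1 = full synchrony
        print(round(r, 3))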

  15. Integration science and distributed networks

    NASA Astrophysics Data System (ADS)

    Landauer, Christopher; Bellman, Kirstie L.

    2002-07-01

    Our work on integration of data and knowledge sources is based on a common theoretical treatment of 'Integration Science', which leads to systematic processes for combining formal logical and mathematical systems, computational and physical systems, and human systems and organizations. The theory is based on the processing of explicit meta-knowledge about the roles played by the different knowledge sources and the methods of analysis and semantic implications of the different data values, together with information about the context in which and the purpose for which they are being combined. The research treatment is primarily mathematical, and though this kind of integration mathematics is still under development, some applicable common threads have already emerged. Instead of describing the current state of the mathematical investigations, since they are not yet crystallized enough for formalisms, we describe our applications of the approach in several different areas, including our focus area of 'Constructed Complex Systems', which are complex heterogeneous systems managed or mediated by computing systems. In this context, it is important to remember that all systems are embedded, all systems are autonomous, and all systems are distributed networks.

  16. Theoretical Bases of Science Education Research.

    ERIC Educational Resources Information Center

    Good, Ronald; And Others

    This symposium examines the science education research enterprise from multiple theoretical perspectives. The first paper, "Contextual Constructivism: The Impact of Culture on the Learning and Teaching of Science" (William Cobern), focuses on broad issues of culture and how constructivism is affected by the context of culture. Culturally based…

  17. Theoretical Analysis of the Mechanism of Fracture Network Propagation with Stimulated Reservoir Volume (SRV) Fracturing in Tight Oil Reservoirs.

    PubMed

    Su, Yuliang; Ren, Long; Meng, Fankun; Xu, Chen; Wang, Wendong

    2015-01-01

    Stimulated reservoir volume (SRV) fracturing in tight oil reservoirs often induces complex fracture-network growth, which has a fundamentally different formation mechanism from traditional planar bi-winged fracturing. To reveal the mechanism of fracture network propagation, this paper employs a modified displacement discontinuity method (DDM), mechanical mechanism analysis and initiation and propagation criteria for the theoretical model of fracture network propagation and its derivation. A reasonable solution of the theoretical model for a tight oil reservoir is obtained and verified by a numerical discrete method. Through theoretical calculation and computer programming, the variation rules of formation stress fields, hydraulic fracture propagation patterns (FPP) and branch fracture propagation angles and pressures are analyzed. The results show that during the process of fracture propagation, the initial orientation of the principal stress deflects, and the stress fields at the fracture tips change dramatically in the region surrounding the fracture. Whether the ideal fracture network can be produced depends on the geological conditions and on the engineering treatments. This study has both theoretical significance and practical application value by contributing to a better understanding of fracture network propagation mechanisms in unconventional oil/gas reservoirs and to the improvement of the science and design efficiency of reservoir fracturing.
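    As a concrete (and deliberately simplified) example of a propagation criterion of the kind such theoretical models employ, the sketch below computes the kink angle of a growing fracture from the mode-I/II stress intensity factors using the classical maximum-tangential-stress criterion. This is the textbook Erdogan-Sih formula with invented KI, KII values, not the paper's modified-DDM scheme.

        import math

        def kink_angle(KI, KII):
            """Angle (radians) maximizing tangential stress at the fracture tip."""
            if KII == 0:
                return 0.0  # pure mode I: straight growth
            r = KI / KII
            # Root of KI*sin(t) + KII*(3*cos(t) - 1) = 0 (max tangential stress).
            return 2.0 * math.atan((r - math.copysign(math.sqrt(r * r + 8.0), KII)) / 4.0)

        theta = kink_angle(KI=1.0, KII=0.4)
        print(round(math.degrees(theta), 1))  # negative angle: branch deflects against shear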

  18. Theoretical Analysis of the Mechanism of Fracture Network Propagation with Stimulated Reservoir Volume (SRV) Fracturing in Tight Oil Reservoirs

    PubMed Central

    Su, Yuliang; Ren, Long; Meng, Fankun; Xu, Chen; Wang, Wendong

    2015-01-01

    Stimulated reservoir volume (SRV) fracturing in tight oil reservoirs often induces complex fracture-network growth, which has a fundamentally different formation mechanism from traditional planar bi-winged fracturing. To reveal the mechanism of fracture network propagation, this paper employs a modified displacement discontinuity method (DDM), mechanical mechanism analysis and initiation and propagation criteria for the theoretical model of fracture network propagation and its derivation. A reasonable solution of the theoretical model for a tight oil reservoir is obtained and verified by a numerical discrete method. Through theoretical calculation and computer programming, the variation rules of formation stress fields, hydraulic fracture propagation patterns (FPP) and branch fracture propagation angles and pressures are analyzed. The results show that during the process of fracture propagation, the initial orientation of the principal stress deflects, and the stress fields at the fracture tips change dramatically in the region surrounding the fracture. Whether the ideal fracture network can be produced depends on the geological conditions and on the engineering treatments. This study has both theoretical significance and practical application value by contributing to a better understanding of fracture network propagation mechanisms in unconventional oil/gas reservoirs and to the improvement of the science and design efficiency of reservoir fracturing. PMID:25966285

  19. Theoretical, Experimental, and Computational Evaluation of Several Vane-Type Slow-Wave Structures

    NASA Technical Reports Server (NTRS)

    Wallett, Thomas M.; Qureshi, A. Haq

    1994-01-01

    Several types of periodic vane slow-wave structures were fabricated. The dispersion characteristics were found by theoretical analysis, experimental testing, and computer simulation using the MAFIA code. Computer-generated characteristics agreed to within approximately 2 percent of the experimental characteristics for all structures. The theoretical characteristics, however, deviated increasingly as the width-to-height ratio became smaller. Interaction impedances were also computed based on the experimental and computer-generated resonance frequency shifts due to the introduction of a perturbing dielectric rod.

  20. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in the science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use is dependent on perceived abilities at using computers. The teachers' use of computer-related applications/tools during class and their personal self-efficacy, age, and gender were highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and gender were related to their use of computer-related applications/tools during class and to their students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.

  1. Optoelectronics for electrical and computer engineering students

    NASA Astrophysics Data System (ADS)

    Chua, Soo-Jin; Jalil, Mansoor

    2002-05-01

    We describe the contents of an advanced undergraduate course on Optoelectronics at the Department of Electrical and Computer Engineering, National University of Singapore. The emphasis has changed over the years to keep abreast of developments in the field, but the broad features remain the same. A multidisciplinary approach is taken, incorporating physics, materials science and engineering concepts to explain the operation of optoelectronic components and their application in display, communications and consumer electronics. The course comprises 36 hours of lectures and two experiments, and covers basic radiometry and photometry, photoemitters (LEDs and lasers), photodetectors, and liquid crystal displays. The main aim of the course is to equip students with the requisite theoretical and practical knowledge for participation in the photonics industry and, for students who are so inclined, for postgraduate research.

  2. Physics through the 1990s: Scientific interfaces and technological applications

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The volume examines the scientific interfaces and technological applications of physics. Twelve areas are dealt with: biological physics-biophysics, the brain, and theoretical biology; the physics-chemistry interface-instrumentation, surfaces, neutron and synchrotron radiation, polymers, organic electronic materials; materials science; geophysics-tectonics, the atmosphere and oceans, planets, drilling and seismic exploration, and remote sensing; computational physics-complex systems and applications in basic research; mathematics-field theory and chaos; microelectronics-integrated circuits, miniaturization, future trends; optical information technologies-fiber optics and photonics; instrumentation; physics applications to energy needs and the environment; national security-devices, weapons, and arms control; medical physics-radiology, ultrasonics, NMR, and photonics. An executive summary and many chapters contain recommendations regarding funding, education, industry participation, small-group university research and large facility programs, government agency programs, and computer database needs.

  3. Hidden in the Middle: Culture, Value and Reward in Bioinformatics.

    PubMed

    Lewis, Jamie; Bartlett, Andrew; Atkinson, Paul

    2016-01-01

    Bioinformatics - the so-called shotgun marriage between biology and computer science - is an interdiscipline. Despite interdisciplinarity being seen as a virtue, for having the capacity to solve complex problems and foster innovation, it has the potential to place projects and people in anomalous categories. For example, valorised 'outputs' in academia are often defined and rewarded by discipline. Bioinformatics, as an interdisciplinary bricolage, incorporates experts from various disciplinary cultures with their own distinct ways of working. Perceived problems of interdisciplinarity include difficulties of making explicit knowledge that is practical, theoretical, or cognitive. But successful interdisciplinary research also depends on an understanding of disciplinary cultures and value systems, often only tacitly understood by members of the communities in question. In bioinformatics, the 'parent' disciplines have different value systems; for example, what is considered worthwhile research by computer scientists can be thought of as trivial by biologists, and vice versa. This paper concentrates on the problems of reward and recognition described by scientists working in academic bioinformatics in the United Kingdom. We highlight problems that are a consequence of its cross-cultural make-up, recognising that the mismatches in knowledge in this borderland take place not just at the level of the practical, theoretical, or epistemological, but also at the cultural level. The trend in big, interdisciplinary science is towards multiple authors on a single paper; in bioinformatics this has created hybrid or fractional scientists who find they are being positioned not just in-between established disciplines but also in-between as middle authors or, worse still, left off papers altogether.

  4. Simulating Earthquakes for Science and Society: New Earthquake Visualizations Ideal for Use in Science Communication

    NASA Astrophysics Data System (ADS)

    de Groot, R. M.; Benthien, M. L.

    2006-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. The resulting visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. These types of visualizations are becoming pervasive in the teaching and learning of concepts related to earth science. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). Earthquakes are ideal candidates for visualization products: they cannot be predicted, are completed in a matter of seconds, occur deep in the earth, and the time between events can be on a geologic time scale. For example, the southern part of the San Andreas fault has not seen a major earthquake since about 1690, setting the stage for an earthquake as large as magnitude 7.7 -- the "big one." Since no one has experienced such an earthquake, visualizations can help people understand the scale of such an event. Accordingly, SCEC has developed a revolutionary simulation of this earthquake, with breathtaking visualizations that are now being distributed. According to Gordin and Pea (1995), visualization should theoretically make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  5. Fundamental Stellar Properties of M-Dwarfs from the CHARA Array

    NASA Astrophysics Data System (ADS)

    Berger, D. H.; Gies, D. R.; McAlister, H. A.; ten Brummelaar, T. A.; Henry, T. J.; Sturmann, J.; Sturmann, L.; Turner, N. H.; Ridgway, S. T.; Aufdenberg, J. P.; Mérand, A. M.

    2005-12-01

    We report the angular diameters of six M dwarfs ranging in spectral type from M1.0 V to M3.0 V measured with Georgia State University's CHARA Array, a long-baseline optical interferometer located at Mount Wilson Observatory. Observations were made with the longest baselines in the near infrared K'-band and yielded angular diameters less than one milliarcsecond. Using an iterative process combining parallaxes from the NStars program and photometrically-derived bolometric luminosities and masses, we calculated effective temperatures, surface gravities, and stellar radii. Our results are consistent with other empirical measurements of M-dwarf radii, but we find that current models underestimate the true stellar radii by up to 15-20%. We suggest that theoretical models for low mass stars may be lacking an opacity source that alters the computed stellar radii. Science operations at the Array are supported by the National Science Foundation through NSF Grant AST-0307562 and by Georgia State University through the College of Arts and Sciences and the Office of the Vice President for Research. Financial support for DHB was provided by the National Science Foundation through grant AST-0205297.
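
    The radius determinations above combine interferometric angular diameters with parallaxes; a back-of-the-envelope version of that conversion is sketched below. The input values are hypothetical, not the paper's measurements.

```python
# Back-of-the-envelope conversion from an interferometric angular diameter and
# a parallax to a linear radius; the input values are hypothetical, not the
# paper's measurements.
import math

theta_mas = 0.8       # limb-darkened angular diameter in milliarcseconds
parallax_mas = 200.0  # parallax in milliarcseconds -> distance of 5 pc

PC_M = 3.0857e16      # metres per parsec
R_SUN_M = 6.957e8     # solar radius in metres

theta_rad = theta_mas * 1e-3 / 3600.0 * math.pi / 180.0  # mas -> radians
distance_m = (1000.0 / parallax_mas) * PC_M              # parallax -> distance
radius_m = theta_rad * distance_m / 2.0                  # small-angle approximation

print(f"radius = {radius_m / R_SUN_M:.2f} R_sun")  # ~0.43 R_sun, a mid-M dwarf
```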

  6. Quantifying the chiral magnetic effect from anomalous-viscous fluid dynamics

    NASA Astrophysics Data System (ADS)

    Jiang, Yin; Shi, Shuzhe; Yin, Yi; Liao, Jinfeng

    2018-01-01

    The Chiral Magnetic Effect (CME) is a macroscopic manifestation of the fundamental chiral anomaly in a many-body system of chiral fermions, and emerges as an anomalous transport current in the fluid dynamics framework. Experimental observation of the CME is of great interest and has been reported in Dirac and Weyl semimetals. Significant efforts have also been made to look for the CME in heavy ion collisions. Critically needed for such a search is a theoretical prediction for the CME signal. In this paper we report a first quantitative modeling framework, Anomalous Viscous Fluid Dynamics (AVFD), which computes the evolution of fermion currents on top of realistic bulk evolution in heavy ion collisions and simultaneously accounts for both anomalous and normal viscous transport effects. AVFD allows a quantitative understanding of the generation and evolution of CME-induced charge separation during the hydrodynamic stage, as well as its dependence on theoretical ingredients. With reasonable estimates of key parameters, the AVFD simulations provide the first phenomenologically successful explanation of the measured signal in 200 AGeV AuAu collisions. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics, within the framework of the Beam Energy Scan Theory (BEST) Topical Collaboration. The work is also supported in part by the National Science Foundation under Grant No. PHY-1352368 (SS and JL), by the National Science Foundation of China under Grant No. 11735007 (JL) and by the U.S. Department of Energy under Contract No. DE-SC0012704 (BNL)/DE-SC0011090 (MIT) (YY). JL is grateful to the Institute for Nuclear Theory for hospitality during the INT-16-3 Program. The computation of this research was performed on IU's Big Red II cluster, supported in part by Lilly Endowment, Inc. (through its support for the Indiana University Pervasive Technology Institute) and in part by the Indiana METACyt Initiative.

  7. Defense of Cyber Infrastructures Against Cyber-Physical Attacks Using Game-Theoretic Models

    DOE PAGES

    Rao, Nageswara S. V.; Poole, Stephen W.; Ma, Chris Y. T.; ...

    2015-04-06

    The operation of cyber infrastructures relies on both cyber and physical components, which are subject to incidental and intentional degradations of different kinds. Within the context of network and computing infrastructures, we study the strategic interactions between an attacker and a defender using game-theoretic models that take into account both cyber and physical components. The attacker and defender optimize their individual utilities expressed as sums of cost and system terms. First, we consider a Boolean attack-defense model, wherein the cyber and physical sub-infrastructures may be attacked and reinforced as individual units. Second, we consider a component attack-defense model wherein their components may be attacked and defended, and the infrastructure requires minimum numbers of both to function. We show that the Nash equilibrium under uniform costs in both cases is computable in polynomial time, and it provides high-level deterministic conditions for the infrastructure survival. When probabilities of successful attack and defense, and of incidental failures, are incorporated into the models, the results favor the attacker but otherwise remain qualitatively similar. This approach has been motivated and validated by our experiences with UltraScience Net infrastructure, which was built to support high-performance network experiments. The analytical results, however, are more general, and we apply them to simplified models of cloud and high-performance computing infrastructures.

  8. Defense of Cyber Infrastructures Against Cyber-Physical Attacks Using Game-Theoretic Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S. V.; Poole, Stephen W.; Ma, Chris Y. T.

    The operation of cyber infrastructures relies on both cyber and physical components, which are subject to incidental and intentional degradations of different kinds. Within the context of network and computing infrastructures, we study the strategic interactions between an attacker and a defender using game-theoretic models that take into account both cyber and physical components. The attacker and defender optimize their individual utilities expressed as sums of cost and system terms. First, we consider a Boolean attack-defense model, wherein the cyber and physical sub-infrastructures may be attacked and reinforced as individual units. Second, we consider a component attack-defense model wherein their components may be attacked and defended, and the infrastructure requires minimum numbers of both to function. We show that the Nash equilibrium under uniform costs in both cases is computable in polynomial time, and it provides high-level deterministic conditions for the infrastructure survival. When probabilities of successful attack and defense, and of incidental failures, are incorporated into the models, the results favor the attacker but otherwise remain qualitatively similar. This approach has been motivated and validated by our experiences with UltraScience Net infrastructure, which was built to support high-performance network experiments. The analytical results, however, are more general, and we apply them to simplified models of cloud and high-performance computing infrastructures.

  9. Defense of Cyber Infrastructures Against Cyber-Physical Attacks Using Game-Theoretic Models.

    PubMed

    Rao, Nageswara S V; Poole, Stephen W; Ma, Chris Y T; He, Fei; Zhuang, Jun; Yau, David K Y

    2016-04-01

    The operation of cyber infrastructures relies on both cyber and physical components, which are subject to incidental and intentional degradations of different kinds. Within the context of network and computing infrastructures, we study the strategic interactions between an attacker and a defender using game-theoretic models that take into account both cyber and physical components. The attacker and defender optimize their individual utilities, expressed as sums of cost and system terms. First, we consider a Boolean attack-defense model, wherein the cyber and physical subinfrastructures may be attacked and reinforced as individual units. Second, we consider a component attack-defense model wherein their components may be attacked and defended, and the infrastructure requires minimum numbers of both to function. We show that the Nash equilibrium under uniform costs in both cases is computable in polynomial time, and it provides high-level deterministic conditions for the infrastructure survival. When probabilities of successful attack and defense, and of incidental failures, are incorporated into the models, the results favor the attacker but otherwise remain qualitatively similar. This approach has been motivated and validated by our experiences with UltraScience Net infrastructure, which was built to support high-performance network experiments. The analytical results, however, are more general, and we apply them to simplified models of cloud and high-performance computing infrastructures. © 2015 Society for Risk Analysis.
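
    The polynomial-time equilibrium result above concerns structured attack-defense games; for intuition, here is a minimal sketch of finding pure-strategy Nash equilibria in a tiny bimatrix game. The payoffs are made-up illustrative numbers, not the models analysed in the paper.

```python
# A minimal sketch of pure-strategy Nash equilibrium search in a tiny bimatrix
# attack-defense game. Payoffs are made-up illustrative numbers, not the
# models analysed in the paper.
import numpy as np

# Rows: attacker strategies (attack cyber, attack physical)
# Columns: defender strategies (reinforce cyber, reinforce physical)
attacker = np.array([[3, 1],
                     [2, 4]])
defender = np.array([[2, 1],
                     [0, 3]])

equilibria = []
for i in range(attacker.shape[0]):
    for j in range(attacker.shape[1]):
        attacker_best = attacker[i, j] >= attacker[:, j].max()  # no better row
        defender_best = defender[i, j] >= defender[i, :].max()  # no better column
        if attacker_best and defender_best:
            equilibria.append((i, j))

print("pure-strategy Nash equilibria (row, col):", equilibria)  # [(0, 0), (1, 1)]
```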

  10. A striking profile: Soil ecological knowledge in restoration management and science

    Treesearch

    Mac A. Callaham; Charles C. Rhoades; Liam Heneghan

    2008-01-01

    Available evidence suggests that research in terrestrial restoration ecology has been dominated by the engineering and botanical sciences. Because restoration science is a relatively young discipline in ecology, the theoretical framework for this discipline is under development and new theoretical offerings appear regularly in the literature. In reviewing this...

  11. Technology in Support of Argument Construction in School Science

    ERIC Educational Resources Information Center

    Evagorou, Maria; Avraamidou, Lucy

    2008-01-01

    In this theoretical article the authors discuss the role of technology tools in supporting students' argument construction within the context of middle and high school science. In the first part of the article they focus on the theoretical underpinnings for studying argumentation in school science and report on the difficulties associated with…

  12. Science in the Eyes of Preschool Children: Findings from an Innovative Research Tool

    NASA Astrophysics Data System (ADS)

    Dubosarsky, Mia D.

    How do young children view science? Do these views reflect cultural stereotypes? When do these views develop? These fundamental questions in the field of science education have rarely been studied with the population of preschool children. One main reason is the lack of an appropriate research instrument that addresses preschool children's developmental competencies. An extensive body of research has pointed to the significance of early childhood experiences in developing positive attitudes and interests toward learning in general and the learning of science in particular. Theoretical and empirical research suggests that stereotypical views of science may be replaced by authentic views following inquiry science experience. However, no preschool science intervention program could be designed without a reliable instrument that provides baseline information about preschool children's current views of science. The current study presents preschool children's views of science as gathered from a pioneering research tool. This tool, in the form of a computer "game," does not require reading, writing, or expressive language skills and is operated by the children. The program engages children in several simple tasks involving picture recognition and yes/no answers in order to reveal their views about science. The study was conducted with 120 preschool children in two phases and found that by the age of 4 years, participants possess an emergent concept of science. Gender and school differences were detected. Findings from this interdisciplinary study will contribute to the fields of early childhood, science education, learning technologies, program evaluation, and early childhood curriculum development.

  13. Role of Laboratory Plasma Experiments in exploring the Physics of Solar Eruptions

    NASA Astrophysics Data System (ADS)

    Tripathi, S.

    2017-12-01

    Solar eruptive events are triggered over a broad range of spatio-temporal scales by a variety of fundamental processes (e.g., force imbalance, magnetic reconnection, electrical-current-driven instabilities) associated with arched magnetoplasma structures in the solar atmosphere. Contemporary research on solar eruptive events is at the forefront of solar and heliospheric physics due to its relevance to space weather. Details on the formation of magnetized plasma structures on the Sun, the storage of magnetic energy in such structures over long periods (several Alfven transit times), and their impulsive eruptions have been recorded in numerous observations and simulated in computer models. Inherent limitations of space observations and the uncontrolled nature of solar eruptions pose significant challenges in testing theoretical models and developing predictive capability for space weather. The pace of scientific progress in this area can be significantly boosted by tapping the potential of appropriately scaled laboratory plasma experiments to complement solar observations, theoretical models, and computer simulations. As an example, recent results from a laboratory plasma experiment on arched magnetic flux ropes will be presented and future challenges will be discussed. (Work supported by the National Science Foundation, USA under award number 1619551)

  14. Theoretical Heterogeneous Catalysis: Scaling Relationships and Computational Catalyst Design.

    PubMed

    Greeley, Jeffrey

    2016-06-07

    Scaling relationships are theoretical constructs that relate the binding energies of a wide variety of catalytic intermediates across a range of catalyst surfaces. Such relationships are ultimately derived from bond order conservation principles that were first introduced several decades ago. Through the growing power of computational surface science and catalysis, these concepts and their applications have recently begun to have a major impact in studies of catalytic reactivity and heterogeneous catalyst design. In this review, the detailed theory behind scaling relationships is discussed, and the existence of these relationships for catalytic materials ranging from pure metal to oxide surfaces, for numerous classes of molecules, and for a variety of catalytic surface structures is described. The use of the relationships to understand and elucidate reactivity trends across wide classes of catalytic surfaces and, in some cases, to predict optimal catalysts for certain chemical reactions, is explored. Finally, the observation that, in spite of the tremendous power of scaling relationships, their very existence places limits on the maximum rates that may be obtained for the catalyst classes in question is discussed, and promising strategies are explored to overcome these limitations to usher in a new era of theory-driven catalyst design.
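
    As a toy illustration of such a scaling relationship, the sketch below fits E(OH*) against E(O*) across a handful of synthetic binding energies. The slope of roughly 1/2 is the standard bond-counting expectation for OH versus O; the numbers themselves are stand-ins, not DFT data from the review.

```python
# Toy illustration of a linear scaling relation between adsorption energies,
# E(OH*) ~ gamma * E(O*) + xi, across a handful of surfaces. The energies are
# synthetic stand-ins, not DFT results; gamma ~ 1/2 is the bond-counting
# expectation for OH versus O.
import numpy as np

E_O = np.array([-5.2, -4.6, -3.9, -3.1, -2.4])  # hypothetical O* energies (eV)
noise = np.random.default_rng(1).normal(0.0, 0.05, E_O.size)
E_OH = 0.5 * E_O + 0.1 + noise                  # synthetic OH* energies (eV)

gamma, xi = np.polyfit(E_O, E_OH, 1)
print(f"fitted scaling: E_OH = {gamma:.2f} * E_O + {xi:.2f} eV")
```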

  15. Inverse design of an isotropic suspended Kirchhoff rod: theoretical and numerical results on the uniqueness of the natural shape.

    PubMed

    Bertails-Descoubes, Florence; Derouet-Jourdan, Alexandre; Romero, Victor; Lazarus, Arnaud

    2018-04-01

    Solving the equations for Kirchhoff elastic rods has been widely explored for decades in mathematics, physics and computer science, with significant applications in the modelling of thin flexible structures such as DNA, hair or climbing plants. As demonstrated in previous experimental and theoretical studies, the natural curvature plays an important role in the equilibrium shape of a Kirchhoff rod, even in the simple case where the rod is isotropic and suspended under gravity. In this paper, we investigate the reverse problem: can we characterize the natural curvature of a suspended isotropic rod, given an equilibrium curve? We prove that although there exists an infinite number of natural curvatures that are compatible with the prescribed equilibrium, they are all equivalent in the sense that they correspond to a unique natural shape for the rod. This natural shape can be computed efficiently by solving in sequence three linear initial value problems, starting from any framing of the input curve. We provide several numerical experiments to illustrate this uniqueness result, and finally discuss its potential impact on non-invasive parameter estimation and inverse design of thin elastic rods.

  16. Inverse design of an isotropic suspended Kirchhoff rod: theoretical and numerical results on the uniqueness of the natural shape

    NASA Astrophysics Data System (ADS)

    Bertails-Descoubes, Florence; Derouet-Jourdan, Alexandre; Romero, Victor; Lazarus, Arnaud

    2018-04-01

    Solving the equations for Kirchhoff elastic rods has been widely explored for decades in mathematics, physics and computer science, with significant applications in the modelling of thin flexible structures such as DNA, hair or climbing plants. As demonstrated in previous experimental and theoretical studies, the natural curvature plays an important role in the equilibrium shape of a Kirchhoff rod, even in the simple case where the rod is isotropic and suspended under gravity. In this paper, we investigate the reverse problem: can we characterize the natural curvature of a suspended isotropic rod, given an equilibrium curve? We prove that although there exists an infinite number of natural curvatures that are compatible with the prescribed equilibrium, they are all equivalent in the sense that they correspond to a unique natural shape for the rod. This natural shape can be computed efficiently by solving in sequence three linear initial value problems, starting from any framing of the input curve. We provide several numerical experiments to illustrate this uniqueness result, and finally discuss its potential impact on non-invasive parameter estimation and inverse design of thin elastic rods.
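
    The reconstruction procedure described above reduces to integrating linear initial value problems along the rod's arc length; a generic sketch of that building block with scipy follows. The coefficient matrix is a stand-in, not the paper's actual moving-frame equations.

```python
# Generic sketch of the numerical building block described above: integrating
# a linear initial value problem y'(s) = A(s) y(s) along arc length with
# scipy. The coefficient matrix is a stand-in, not the paper's equations.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(s, y):
    # Hypothetical s-dependent linear system (e.g., a moving-frame equation)
    A = np.array([[0.0, 1.0],
                  [-1.0 - 0.5 * s, 0.0]])
    return A @ y

sol = solve_ivp(rhs, (0.0, 1.0), [1.0, 0.0], dense_output=True)
print("state at s = 1:", sol.y[:, -1])
```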

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    van der Eide, Edwin F.; Yang, Ping; Walter, Eric D.

    Unlike the very labile, unobservable radical cations [{CpM(CO)3}2]•+ (M = W, Mo), derivatives [{CpM(CO)2(PMe3)}2]•+ are stable enough to be isolated and characterized. Experimental and theoretical studies show that the shortened M-M bonds are of order 1 1/2, and that they are not supported by bridging ligands. The unpaired electron is fully delocalized, with a spin density of ca. 45% on each metal atom. We thank the U.S. Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Biosciences and Geosciences for support of this work. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. The EPR and computational studies were performed using EMSL, a national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at PNNL. We thank Dr. Charles Windisch for access to his UV-Vis-NIR spectrometer.

  18. Carbon-doping-induced negative differential resistance in armchair phosphorene nanoribbons

    NASA Astrophysics Data System (ADS)

    Guo, Caixia; Xia, Congxin; Wang, Tianxing; Liu, Yufang

    2017-03-01

    Using a combined method of density functional theory and the non-equilibrium Green's function formalism, we investigate the electronic transport properties of carbon-doped armchair phosphorene nanoribbons (APNRs). The results show that C-atom doping can strongly affect the electronic transport properties of the APNR and change it from a semiconductor to a metal. Meanwhile, pronounced negative differential resistance (NDR) behaviors are obtained by tuning the doping position and concentration. In particular, with decreasing doping concentration, the NDR peak position can shift into the mV bias range. These results provide theoretical support for designing related nanodevices by tuning the doping position and concentration in APNRs. Project supported by the National Natural Science Foundation of China (No. 11274096), the University Science and Technology Innovation Team Support Project of Henan Province (No. 13IRTSTHN016), and the University Key Science Research Project of Henan Province (No. 16A140043). The calculations in this work were supported by the High Performance Computing Center of Henan Normal University.

  19. Gender differences in the use of computers, programming, and peer interactions in computer science classrooms

    NASA Astrophysics Data System (ADS)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-12-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In programming, female students were much less involved in computing activities, whereas male students were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of playing with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.

  20. Theory of Remote Image Formation

    NASA Astrophysics Data System (ADS)

    Blahut, Richard E.

    2004-11-01

    In many applications, images, such as ultrasonic or X-ray signals, are recorded and then analyzed with digital or optical processors in order to extract information. Such processing requires the development of algorithms of great precision and sophistication. This book presents a unified treatment of the mathematical methods that underpin the various algorithms used in remote image formation. The author begins with a review of transform and filter theory. He then discusses two- and three-dimensional Fourier transform theory, the ambiguity function, image construction and reconstruction, tomography, baseband surveillance systems, and passive systems (where the signal source might be an earthquake or a galaxy). Information-theoretic methods in image formation are also covered, as are phase errors and phase noise. Throughout the book, practical applications illustrate theoretical concepts, and there are many homework problems. The book is aimed at graduate students of electrical engineering and computer science, and practitioners in industry. It presents a unified treatment of the mathematical methods that underpin the algorithms used in remote image formation, illustrates theoretical concepts with reference to practical applications, and provides insights into the design parameters of real systems.

  1. Multi-scale Modeling of Chromosomal DNA in Living Cells

    NASA Astrophysics Data System (ADS)

    Spakowitz, Andrew

    The organization and dynamics of chromosomal DNA play a pivotal role in a range of biological processes, including gene regulation, homologous recombination, replication, and segregation. Establishing a quantitative theoretical model of DNA organization and dynamics would be valuable in bridging the gap between the molecular-level packaging of DNA and genome-scale chromosomal processes. Our research group utilizes analytical theory and computational modeling to establish a predictive theoretical model of chromosomal organization and dynamics. In this talk, I will discuss our efforts to develop multi-scale polymer models of chromosomal DNA that are both sufficiently detailed to address specific protein-DNA interactions while capturing experimentally relevant time and length scales. I will demonstrate how these modeling efforts are capable of quantitatively capturing aspects of behavior of chromosomal DNA in both prokaryotic and eukaryotic cells. This talk will illustrate that capturing dynamical behavior of chromosomal DNA at various length scales necessitates a range of theoretical treatments that accommodate the critical physical contributions that are relevant to in vivo behavior at these disparate length and time scales. National Science Foundation, Physics of Living Systems Program (PHY-1305516).
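
    A common analytic ingredient in coarse-grained DNA models of this kind is the worm-like chain; as a minimal illustration, the snippet below evaluates its standard mean-square end-to-end distance. Parameter values are typical textbook numbers, not results from the talk.

```python
# Standard worm-like chain result often used in coarse-grained DNA modelling:
# mean-square end-to-end distance <R^2> = 2*lp*L - 2*lp**2 * (1 - exp(-L/lp)).
# Parameter values are typical textbook numbers, not results from the talk.
import numpy as np

def wlc_r2(L, lp):
    """<R^2> for contour length L and persistence length lp (same units)."""
    return 2.0 * lp * L - 2.0 * lp**2 * (1.0 - np.exp(-L / lp))

lp = 50.0                       # ~50 nm persistence length of double-stranded DNA
for L in (10.0, 50.0, 500.0):   # contour lengths in nm
    print(f"L = {L:6.1f} nm -> sqrt(<R^2>) = {np.sqrt(wlc_r2(L, lp)):7.2f} nm")
```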

  2. Emergence, Agency, and Interaction-Notes from the Field.

    PubMed

    Penny, Simon

    2015-01-01

    This article describes the development of several interactive installations and robotic artworks developed through the 1990s and the technological, theoretical, and discursive context in which those works arose. The main works discussed are Petit Mal (1989-1995), Sympathetic Sentience (1996-1997), Fugitive I (1996-1997), Traces (1998-1999), and Fugitive II (2001-2004); full documentation is available at www.simonpenny.net/works. These works were motivated by a critical analysis of cognitivist computer science, which contrasted with notions of embodied experience arising from the arts. The works address questions of agency and interaction, informed by cybernetics and artificial life.

  3. Exploring the Relationships between Self-Efficacy and Preference for Teacher Authority among Computer Science Majors

    ERIC Educational Resources Information Center

    Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung

    2013-01-01

    Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…

  4. PREFACE: Quantum Information, Communication, Computation and Cryptography

    NASA Astrophysics Data System (ADS)

    Benatti, F.; Fannes, M.; Floreanini, R.; Petritis, D.

    2007-07-01

    The application of quantum mechanics to information-related fields such as communication, computation and cryptography is a fast-growing line of research that has been witnessing an outburst of theoretical and experimental results, with possible practical applications. On the one hand, quantum cryptography with its impact on secrecy of transmission is seeing its first important practical implementations; on the other hand, the recent advances in quantum optics, ion trapping, BEC manipulation, spin and quantum dot technologies allow us to put to direct test a great deal of theoretical ideas and results. These achievements have stimulated renewed interest in various aspects of quantum mechanics, creating a unique interplay between physics, both theoretical and experimental, mathematics, information theory and computer science. In view of all these developments, it appeared timely to organize a meeting where graduate students and young researchers could be exposed to the fundamentals of the theory, while senior experts could exchange their latest results. The activity was structured as a school followed by a workshop, and took place at The Abdus Salam International Center for Theoretical Physics (ICTP) and The International School for Advanced Studies (SISSA) in Trieste, Italy, from 12-23 June 2006. The meeting was part of the activity of the Joint European Master Curriculum Development Programme in Quantum Information, Communication, Cryptography and Computation, involving the Universities of Cergy-Pontoise (France), Chania (Greece), Leuven (Belgium), Rennes1 (France) and Trieste (Italy). This special issue of Journal of Physics A: Mathematical and Theoretical collects 22 contributions from well-known experts who took part in the workshop. They summarize the present-day status of the research in the manifold aspects of quantum information. The issue is opened by two review articles, the first by G Adesso and F Illuminati discussing entanglement in continuous variable systems, the second by T Prosen, discussing chaos and complexity in quantum systems. Both topics have theoretical as well as experimental relevance and are likely to witness a fast-growing development in the near future. The remaining contributions present more specific and very recent results. They involve the study of the structure of quantum states and their estimation (B Baumgartner et al, C King et al, S Olivares et al, D Petz et al and W van Dam et al), of entanglement generation and its quantification (G Brida et al, F Ciccarello et al, G Costantini et al, O Romero-Isart et al, D Rossini et al, A Serafini et al and D Vitali et al), of randomness-related effects on entanglement behaviour (I Akhalwaya et al, O Dahlsten et al and L Viola et al), and of abstract and applied aspects of quantum computation and communication (K Audenart, G M D'Ariano et al, N Datta et al, L C Kwek et al and M Nathanson et al). We would like to express our gratitude to the European Commission, the Abdus Salam ICTP, SISSA and Eurotech SpA (Amaro, Udine, Italy) for financial and/or logistic support. Special thanks also go to the workshop secretary Marina De Comelli, and the secretaries of the Department of Theoretical Physics, University of Trieste, Sabrina Gaspardis and Rosita Glavina for their precious help and assistance.

  5. Academic computer science and gender: A naturalistic study investigating the causes of attrition

    NASA Astrophysics Data System (ADS)

    Declue, Timothy Hall

    Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before they enter college, but it is at the college level that the "brain drain" is most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture which is hostile to females. Two unanticipated themes emerged relating to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects the females' ability to succeed in CS-I.

  6. Evaluating the Efficacy of the Cloud for Cluster Computation

    NASA Technical Reports Server (NTRS)

    Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom

    2012-01-01

    Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Cloud Compute (EC2) that promise improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29 instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.
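
    Working backward from the reported figures gives a feel for the cluster: 2 TFLOPS sustained at 70% efficiency implies roughly a 2.9 TFLOPS theoretical peak, or about 12 GFLOPS per core across 240 cores.

```python
# Working backward from the reported HPL figures: 2 TFLOPS sustained at 70%
# of theoretical peak on a 240-core cluster.
sustained_tflops = 2.0
efficiency = 0.70
cores = 240

peak_tflops = sustained_tflops / efficiency      # ~2.86 TFLOPS theoretical peak
per_core_gflops = peak_tflops * 1000.0 / cores   # ~11.9 GFLOPS per core

print(f"theoretical peak: {peak_tflops:.2f} TFLOPS")
print(f"implied per-core peak: {per_core_gflops:.1f} GFLOPS")
```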

  7. ICESat Science Investigator led Processing System (I-SIPS)

    NASA Astrophysics Data System (ADS)

    Bhardwaj, S.; Bay, J.; Brenner, A.; Dimarzio, J.; Hancock, D.; Sherman, M.

    2003-12-01

    The ICESat Science Investigator-led Processing System (I-SIPS) generates the GLAS standard data products. It consists of two main parts: the Scheduling and Data Management System (SDMS) and the Geoscience Laser Altimeter System (GLAS) Science Algorithm Software. The system has been operational since the successful launch of ICESat. It ingests data from the GLAS instrument, generates GLAS data products, and distributes them to the GLAS Science Computing Facility (SCF), the Instrument Support Facility (ISF) and the National Snow and Ice Data Center (NSIDC) ECS DAAC. The SDMS is the planning, scheduling and data management system that runs the GLAS Science Algorithm Software (GSAS). GSAS is based on the Algorithm Theoretical Basis Documents provided by the Science Team and is developed independently of SDMS. The SDMS provides the processing environment to plan jobs based on existing data, control job flow, data distribution, and archiving. The SDMS design is based on a mission-independent architecture that imposes few constraints on the science code, thereby facilitating I-SIPS integration. I-SIPS currently works in an autonomous manner to ingest GLAS instrument data, distribute these data to the ISF, run the science processing algorithms to produce the GLAS standard products, reprocess data when new versions of science algorithms are released, and distribute the products to the SCF, ISF, and NSIDC. I-SIPS has a proven performance record, delivering data to the SCF within hours after the initial instrument activation. The I-SIPS design philosophy gives this system a high potential for reuse in other science missions.

  8. Mundane science use in a practice theoretical perspective: Different understandings of the relations between citizen-consumers and public communication initiatives build on scientific claims.

    PubMed

    Halkier, Bente

    2015-08-13

    Public communication initiatives play a part in placing complicated scientific claims in citizen-consumers' everyday contexts. Lay reactions to scientific claims framed in public communication, and attempts to engage citizens, have been important subjects of discussion in the literatures of public understanding and public engagement with science. Many of the public communication initiatives, however, address lay people as consumers rather than citizens. This creates specific challenges for understanding public engagement with science and scientific citizenship. The article compares five different understandings of the relations between citizen-consumers and public issue communication involving science, where the first four types are widely represented in the Public Understanding of Science discussions. The fifth understanding is a practice theoretical perspective. The article suggests how the public understanding of and engagement in science literature can benefit from including a practice theoretical approach to research about mundane science use and public engagement. © The Author(s) 2015.

  9. Theoretical Study of White Dwarf Double Stars

    NASA Astrophysics Data System (ADS)

    Hira, Ajit; Koetter, Ted; Rivera, Ruben; Diaz, Juan

    2015-04-01

    We continue our interest in the computational simulation of astrophysical phenomena with a study of gravitationally bound binary stars composed of at least one white dwarf star. Of particular interest to astrophysicists are the conditions inside a white dwarf star in the time frame leading up to its explosive end as a Type Ia supernova, for an understanding of these massive stellar explosions. In addition, studies of the evolution of white dwarfs could serve as promising probes of theories of gravitation. We developed FORTRAN computer programs to implement our models for white dwarfs and other stars. These codes allow for different sizes and masses of stars. Simulations were done in the mass interval from 0.1 to 2.0 solar masses. Our goal was to obtain both atmospheric and orbital parameters. The computational results thus obtained are compared with relevant observational data. The data are further analyzed to identify trends in terms of the sizes and masses of stars. We hope to extend our computational studies to blue giant stars in the future. Research supported by the National Science Foundation.
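
    For context, non-relativistic degenerate white dwarfs follow the classic inverse mass-radius scaling R ∝ M^(-1/3); the sketch below evaluates that relation around a rough textbook-style normalisation. This is an illustrative scaling only, not output of the authors' FORTRAN codes.

```python
# Illustrative scaling only: a non-relativistic degenerate white dwarf behaves
# like an n = 1.5 polytrope, so R scales as M**(-1/3). The normalisation is a
# rough reference point (about 0.0125 R_sun at 0.6 M_sun), not a result from
# the authors' FORTRAN codes.
def wd_radius(m_solar, r_ref=0.0125, m_ref=0.6):
    """Approximate radius in solar radii for a mass in solar masses."""
    return r_ref * (m_solar / m_ref) ** (-1.0 / 3.0)

for m in (0.4, 0.6, 1.0, 1.2):
    print(f"M = {m:.1f} M_sun  ->  R ~ {wd_radius(m):.4f} R_sun")
```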

  10. Ab initio calculations for industrial materials engineering: successes and challenges.

    PubMed

    Wimmer, Erich; Najafabadi, Reza; Young, George A; Ballard, Jake D; Angeliu, Thomas M; Vollmer, James; Chambers, James J; Niimi, Hiroaki; Shaw, Judy B; Freeman, Clive; Christensen, Mikael; Wolf, Walter; Saxe, Paul

    2010-09-29

    Computational materials science based on ab initio calculations has become an important partner to experiment. This is demonstrated here for the effect of impurities and alloying elements on the strength of a Zr twist grain boundary, the dissociative adsorption and diffusion of iodine on a zirconium surface, the diffusion of oxygen atoms in a Ni twist grain boundary and in bulk Ni, and the dependence of the work function of a TiN-HfO(2) junction on the replacement of N by O atoms. In all of these cases, computations provide atomic-scale understanding as well as quantitative materials property data of value to industrial research and development. There are two key challenges in applying ab initio calculations, namely a higher accuracy in the electronic energy and the efficient exploration of large parts of the configurational space. While progress in these areas is fueled by advances in computer hardware, innovative theoretical concepts combined with systematic large-scale computations will be needed to realize the full potential of ab initio calculations for industrial applications.

  11. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
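
    As one concrete example of a frugal method, a local one-at-a-time finite-difference sensitivity analysis costs n+1 model runs for n parameters, all parallelizable; a minimal sketch with a stand-in model follows.

```python
# A minimal sketch of one computationally frugal method: local, one-at-a-time
# finite-difference sensitivities, costing n+1 parallelizable model runs for
# n parameters. The "model" here is a cheap stand-in, not a groundwater model.
import numpy as np

def model(p):
    # Hypothetical surrogate for an environmental model
    return p[0] ** 2 + 3.0 * p[1] + np.sin(p[2])

p0 = np.array([1.0, 2.0, 0.5])
base = model(p0)

sensitivities = []
for i in range(len(p0)):
    dp = np.zeros_like(p0)
    dp[i] = 1e-6 * max(1.0, abs(p0[i]))  # relative perturbation
    sensitivities.append((model(p0 + dp) - base) / dp[i])

print("local sensitivities dY/dp:", np.round(sensitivities, 4))
```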

  12. On the Correct Analysis of the Foundations of Theoretical Physics

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2007-04-01

    The problem of truth in science -- the most urgent problem of our time -- is discussed. A correct theoretical analysis of the foundations of theoretical physics is proposed. The principle of the unity of formal logic and rational dialectics is the methodological basis of the analysis. The main result is as follows: the generally accepted foundations of theoretical physics (i.e. Newtonian mechanics, Maxwell electrodynamics, thermodynamics, statistical physics and physical kinetics, the theory of relativity, quantum mechanics) contain a set of logical errors. These errors are explained by the existence of a global cause: they are a collateral and inevitable result of the inductive way of cognizing Nature, i.e. of movement from the formation of separate concepts to the formation of a system of concepts. Consequently, theoretical physics is entering its greatest crisis. This means that physics as a science of phenomena is giving way to a science of essence (information). Acknowledgment: The books ``Surprises in Theoretical Physics'' (1979) and ``More Surprises in Theoretical Physics'' (1991) by Sir Rudolf Peierls stimulated my 25-year work.

  13. Tactics for mechanized reasoning: a commentary on Milner (1984) ‘The use of machines to assist in rigorous proof’

    PubMed Central

    Gordon, M. J. C.

    2015-01-01

    Robin Milner's paper, ‘The use of machines to assist in rigorous proof’, introduces methods for automating mathematical reasoning that are a milestone in the development of computer-assisted theorem proving. His ideas, particularly his theory of tactics, revolutionized the architecture of proof assistants. His methodology for automating rigorous proof soundly, particularly his theory of type polymorphism in programing, led to major contributions to the theory and design of programing languages. His citation for the 1991 ACM A.M. Turing award, the most prestigious award in computer science, credits him with, among other achievements, ‘probably the first theoretically based yet practical tool for machine assisted proof construction’. This commentary was written to celebrate the 350th anniversary of the journal Philosophical Transactions of the Royal Society. PMID:25750147

  14. Investigation of noise properties in grating-based x-ray phase tomography with reverse projection method

    NASA Astrophysics Data System (ADS)

    Bao, Yuan; Wang, Yan; Gao, Kun; Wang, Zhi-Li; Zhu, Pei-Ping; Wu, Zi-Yu

    2015-10-01

    The relationship between noise variance and spatial resolution in grating-based x-ray phase computed tomography (PCT) imaging is investigated with the reverse-projection extraction method, and the noise variances of the reconstructed absorption coefficient and refractive index decrement are compared. For the differential phase contrast method, the noise variance in the differential projection images follows the same inverse-square law with spatial resolution as in conventional absorption-based x-ray imaging projections. However, both theoretical analysis and simulations demonstrate that in PCT the noise variance of the reconstructed refractive index decrement follows an inverse linear relationship with spatial resolution at fixed slice thickness, while the noise variance of the reconstructed absorption coefficient follows an inverse cubic law. The results indicate that, for the same noise variance level, PCT imaging may enable higher spatial resolution than conventional absorption computed tomography (ACT), while ACT benefits more from degraded spatial resolution. This could provide useful guidance for imaging the inner structure of samples at higher spatial resolution. Project supported by the National Basic Research Program of China (Grant No. 2012CB825800), the Science Fund for Creative Research Groups, the Knowledge Innovation Program of the Chinese Academy of Sciences (Grant Nos. KJCX2-YW-N42 and Y4545320Y2), the National Natural Science Foundation of China (Grant Nos. 11475170, 11205157, 11305173, 11205189, 11375225, 11321503, 11179004, and U1332109).
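
    The three scaling laws stated above can be compared numerically; the snippet below tabulates them with arbitrary proportionality constants, taking r as the relative size of the resolution element.

```python
# Numerical illustration of the scaling laws stated above, with r the relative
# size of the resolution element and arbitrary proportionality constants.
import numpy as np

r = np.array([1.0, 2.0, 4.0])   # coarser resolution to the right
var_projection = r ** -2.0      # inverse-square law (differential projections)
var_refraction = r ** -1.0      # inverse linear law (PCT refractive index decrement)
var_absorption = r ** -3.0      # inverse cubic law (ACT absorption coefficient)

for row in zip(r, var_projection, var_refraction, var_absorption):
    print("r = %.0f: projection %.3f, PCT %.3f, ACT %.3f" % row)
```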

  15. Spiral and Project-Based Learning with Peer Assessment in a Computer Science Project Management Course

    NASA Astrophysics Data System (ADS)

    Jaime, Arturo; Blanco, José Miguel; Domínguez, César; Sánchez, Ana; Heras, Jónathan; Usandizaga, Imanol

    2016-06-01

    Different learning methods such as project-based learning, spiral learning and peer assessment have been implemented in science disciplines with different outcomes. This paper presents a proposal for a project management course in the context of a computer science degree. Our proposal combines three well-known methods: project-based learning, spiral learning and peer assessment. Namely, the course is articulated during a semester through the structured (progressive and incremental) development of a sequence of four projects, whose duration, scope and difficulty of management increase as the student gains theoretical and instrumental knowledge related to planning, monitoring and controlling projects. Moreover, the proposal is complemented with peer assessment. The proposal has already been implemented and validated over the last 3 years in two different universities. In the first year, the project-based learning and spiral learning methods were combined. Such a combination was also employed in the other 2 years, but additionally, students had the opportunity to assess projects developed by university partners and by students of the other university. A total of 154 students have participated in the study. We observe a gain in the quality of successive projects derived from the spiral project-based learning approach. Moreover, this gain is significantly larger when peer assessment is introduced. In addition, high-performance students take advantage of peer assessment from the first moment, whereas the improvement in poor-performance students is delayed.

  16. Broadening the Study of Participation in the Life Sciences: How Critical Theoretical and Mixed-Methodological Approaches Can Enhance Efforts to Broaden Participation

    ERIC Educational Resources Information Center

    Metcalf, Heather

    2016-01-01

    This research methods Essay details the usefulness of critical theoretical frameworks and critical mixed-methodological approaches for life sciences education research on broadening participation in the life sciences. First, I draw on multidisciplinary research to discuss critical theory and methodologies. Then, I demonstrate the benefits of these…

  17. Computer-Game Construction: A Gender-Neutral Attractor to Computing Science

    ERIC Educational Resources Information Center

    Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan

    2010-01-01

    Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…

  18. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crabtree, George; Glotzer, Sharon; McCurdy, Bill

    This report is based on an SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness.
    The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following:
    - Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration.
    - Achieving and strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies.
    - Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is the coupling of analytical data from the multiple instruments and techniques required to link these length and time scales.
    - Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex.
    - Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential.
    - Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.

  19. Propagation and Interaction of Edge Dislocation (Kink) in the Square Lattice

    NASA Astrophysics Data System (ADS)

    Jia, Li-Ping; Jasmina, T.; Duan, Wen-Shan

    2015-04-01

    Supported by the National Magnetic Confinement Fusion Science Program of China under Grant No 2014GB104002, the Strategic Priority Research Program of Chinese Academy of Sciences under Grant No XDA03030100, the National Natural Science Foundation of China under Grant Nos 11275156 and 11304324, the Open Project Program of State Key Laboratory of Theoretical Physics of Institute of Theoretical Physics of Chinese Academy of Sciences under Grant No Y4KF201CJ1, and the Serbian Ministry of Education and Science under Grant No III-45010.

  20. Science and Science Fiction

    ScienceCinema

    Scherrer, Robert [Vanderbilt University, Nashville, Tennessee, United States]

    2017-12-09

    I will explore the similarities and differences between the process of writing science fiction and the process of 'producing' science, specifically theoretical physics. What are the ground rules for introducing unproven new ideas in science fiction, and how do they differ from the corresponding rules in physics? How predictive is science fiction? (For that matter, how predictive is theoretical physics?) I will also contrast the way in which information is presented in science fiction, as opposed to its presentation in scientific papers, and I will examine the relative importance of ideas (as opposed to the importance of the way in which these ideas are presented). Finally, I will discuss whether a background as a research scientist provides any advantage in writing science fiction.

  1. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  2. Get immersed in the Soil Sciences: the first community of avatars in the EGU Assembly 2015!

    NASA Astrophysics Data System (ADS)

    Castillo, Sebastian; Alarcón, Purificación; Beato, Mamen; Emilio Guerrero, José; José Martínez, Juan; Pérez, Cristina; Ortiz, Leovigilda; Taguas, Encarnación V.

    2015-04-01

    Virtual reality and immersive worlds refer to artificial computer-generated environments in which users act and interact as in a familiar environment through the use of figurative virtual individuals (avatars). Virtual environments will be the technology of the early twenty-first century that will most dramatically change the way we live, particularly in the areas of training and education, product development, and entertainment (Schmorrow, 2009). The usefulness of immersive worlds has been proven in different fields. They reduce geographic and social barriers between different stakeholders and create virtual social spaces which can positively impact learning and discussion outcomes (Lorenzo et al. 2012). In this work we present a series of interactive meetings in a virtual building to celebrate the International Year of Soils and to promote the importance of soil functions and soil conservation. In a virtual room, the avatars of different senior researchers will meet young scientist avatars to talk about: 1) what remains to be done in Soil Sciences; 2) what their main current limitations and difficulties are; and 3) what the future hot research lines are. Interactive participation does not require physical attendance at the EGU Assembly 2015. In addition, this virtual building inspired by the Soil Sciences can be complemented with teaching resources from different locations around the world, and it will be used to improve the learning of Soil Sciences in a multicultural context. REFERENCES: Lorenzo C.M., Sicilia, M.A., Sánchez S. 2012. Studying the effectiveness of multi-user immersive environments for collaborative evaluation tasks. Computers & Education 59 (2012) 1361-1376. Schmorrow D.D. 2009. "Why virtual?" Theoretical Issues in Ergonomics Science 10(3): 279-282.

  3. Eclecticism as the foundation of meta-theoretical, mixed methods and interdisciplinary research in social sciences.

    PubMed

    Kroos, Karmo

    2012-03-01

    This article examines the value of "eclecticism" as the foundation of meta-theoretical, mixed methods and interdisciplinary research in social sciences. On the basis of the analysis of the historical background of the concept, it is first suggested that eclecticism-based theoretical scholarship in social sciences could benefit from the more systematic research method that has been developed for synthesizing theoretical works under the name metatheorizing. Second, it is suggested that the mixed methods community could base its research approach on philosophical eclecticism instead of pragmatism because the basic idea of eclecticism is much more in sync with the nature of the combined research tradition. Finally, the Kuhnian frame is used to support the argument for interdisciplinary research and, hence, eclecticism in social sciences (rather than making an argument against multiple paradigms). More particularly, it is suggested that integrating the different (inter)disciplinary traditions and schools into one is not necessarily desirable at all in social sciences because of the complexity and openness of the research field. If it is nevertheless attempted, experience in economics suggests that paradigmatic unification comes at a high price.

  4. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  5. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  6. Blind quantum computing with weak coherent pulses.

    PubMed

    Dunjko, Vedran; Kashefi, Elham; Leverrier, Anthony

    2012-05-18

    The universal blind quantum computation (UBQC) protocol [A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual IEEE Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, USA, 2009), pp. 517-526] allows a client to perform quantum computation on a remote server. In an ideal setting, perfect privacy is guaranteed if the client is capable of producing specific, randomly chosen single-qubit states. While from a theoretical point of view this may constitute the lowest possible quantum requirement, from a pragmatic point of view the generation of such states to be sent over long distances can never be achieved perfectly. We introduce the concept of ϵ-blindness for UBQC, in analogy to the concept of ϵ-security developed for other cryptographic protocols, allowing us to characterize the robustness and security properties of the protocol under possible imperfections. We also present a remote blind single-qubit preparation protocol with weak coherent pulses for the client to prepare, in a delegated fashion, quantum states arbitrarily close to perfect random single-qubit states. This allows us to efficiently achieve ϵ-blind UBQC for any ϵ>0, even if the channel between the client and the server is arbitrarily lossy.

  7. Blind Quantum Computing with Weak Coherent Pulses

    NASA Astrophysics Data System (ADS)

    Dunjko, Vedran; Kashefi, Elham; Leverrier, Anthony

    2012-05-01

    The universal blind quantum computation (UBQC) protocol [A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual IEEE Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, USA, 2009), pp. 517-526] allows a client to perform quantum computation on a remote server. In an ideal setting, perfect privacy is guaranteed if the client is capable of producing specific, randomly chosen single-qubit states. While from a theoretical point of view this may constitute the lowest possible quantum requirement, from a pragmatic point of view the generation of such states to be sent over long distances can never be achieved perfectly. We introduce the concept of ɛ-blindness for UBQC, in analogy to the concept of ɛ-security developed for other cryptographic protocols, allowing us to characterize the robustness and security properties of the protocol under possible imperfections. We also present a remote blind single-qubit preparation protocol with weak coherent pulses for the client to prepare, in a delegated fashion, quantum states arbitrarily close to perfect random single-qubit states. This allows us to efficiently achieve ɛ-blind UBQC for any ɛ>0, even if the channel between the client and the server is arbitrarily lossy.

  8. Data-driven discovery of new Dirac semimetal materials

    NASA Astrophysics Data System (ADS)

    Yan, Qimin; Chen, Ru; Neaton, Jeffrey

    In recent years, a significant amount of materials property data from high-throughput computations based on density functional theory (DFT), together with the application of database technologies, has enabled the rise of data-driven materials discovery. In this work, we extend the data-driven materials discovery framework to the realm of topological semimetals to accelerate the discovery of novel Dirac semimetals. We implement currently available workflows, and develop new ones, to data-mine the Materials Project database for novel Dirac semimetals with desirable band structures and symmetry-protected topological properties. This data-driven effort relies on the successful development of several automatic data generation and analysis tools, including a workflow for the automatic identification of topological invariants and pattern recognition techniques to find specific features in a massive number of computed band structures. Using this approach, we identified more than 15 novel Dirac point and Dirac nodal line systems that had not previously been theoretically predicted or experimentally identified. This work is supported by the Materials Project Predictive Modeling Center through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231.
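
    As a schematic illustration of the band-structure pattern recognition such a workflow implies (a minimal sketch, not the authors' pipeline; the bands array and both tolerances are hypothetical), one simple screening step is to flag k-points where the two bands nearest the Fermi level nearly touch:

        import numpy as np

        def candidate_dirac_points(bands, degeneracy_tol=0.05, fermi_tol=0.2):
            # bands: (n_kpoints, n_bands) eigenvalues in eV, referenced to E_F = 0;
            # assumes every k-point has states both below and above E_F.
            hits = []
            for ik, eigs in enumerate(bands):
                below = eigs[eigs <= 0].max()   # highest occupied eigenvalue
                above = eigs[eigs > 0].min()    # lowest unoccupied eigenvalue
                gap = above - below
                midpoint = 0.5 * (above + below)
                if gap < degeneracy_tol and abs(midpoint) < fermi_tol:
                    hits.append(ik)             # near-degenerate crossing near E_F
            return hits

        bands = np.array([[-1.0, -0.5, 0.4, 1.1],
                          [-0.9, -0.02, 0.01, 1.0],   # bands nearly touch at E_F
                          [-1.1, -0.4, 0.5, 1.2]])
        print(candidate_dirac_points(bands))  # -> [1]

    A production workflow would additionally verify linear dispersion around each hit and compute the symmetry and topological invariants the abstract describes.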

  9. "Bulk" Nanocrystalline Metals: Review of the Current State of the Art and Future Opportunities for Copper and Copper Alloys

    NASA Astrophysics Data System (ADS)

    Tschopp, M. A.; Murdoch, H. A.; Kecskes, L. J.; Darling, K. A.

    2014-06-01

    It is a new beginning for innovative fundamental and applied science in nanocrystalline materials. Many of the processing and consolidation challenges that have haunted nanocrystalline materials are now more fully understood, opening the doors for bulk nanocrystalline materials and parts to be produced. While challenges remain, recent advances in experimental, computational, and theoretical capability have allowed for bulk specimens that have heretofore been pursued only on a limited basis. This article discusses the methodology for synthesis and consolidation of bulk nanocrystalline materials using mechanical alloying, the alloy development and synthesis process for stabilizing these materials at elevated temperatures, and the physical and mechanical properties of nanocrystalline materials, with a focus throughout on nanocrystalline copper and a nanocrystalline Cu-Ta system, consolidated via equal channel angular extrusion, with properties rivaling those of nanocrystalline pure Ta. Moreover, modeling and simulation approaches as well as experimental results for grain growth, grain boundary processes, and deformation mechanisms in nanocrystalline copper are briefly reviewed and discussed. Integrating experiments and computational materials science for synthesizing bulk nanocrystalline materials can bring about the next generation of ultrahigh strength materials for defense and energy applications.

  10. PREFACE: 3rd International Conference on Mathematical Modeling in Physical Sciences (IC-MSQUARE 2014)

    NASA Astrophysics Data System (ADS)

    2015-01-01

    The third International Conference on Mathematical Modeling in Physical Sciences (IC-MSQUARE) took place in Madrid, Spain, from Thursday 28 to Sunday 31 August 2014. The Conference was attended by more than 200 participants and hosted about 350 oral, poster, and virtual presentations; more than 600 authors pre-registered. The third IC-MSQUARE consisted of diverse workshops and thus covered the many research fields where mathematical modeling is used, such as Theoretical/Mathematical Physics, Neutrino Physics, Non-Integrable Systems, Dynamical Systems, Computational Nanoscience, Biological Physics, Computational Biomechanics, Complex Networks, Stochastic Modeling, Fractional Statistics, DNA Dynamics, Macroeconomics, etc. The scientific program was demanding: after the keynote and invited talks each morning, three parallel oral sessions and one poster session ran every day. Nevertheless, according to all attendees, the talks were of high quality and the scientific environment was fruitful, so all attendees had a creative time. We would like to thank the Keynote Speaker and the Invited Speakers for their significant contribution to IC-MSQUARE. We also would like to thank the Members of the International Advisory and Scientific Committees as well as the Members of the Organizing Committee.

  11. PREFACE: 4th International Conference on Mathematical Modeling in Physical Sciences (IC-MSquare2015)

    NASA Astrophysics Data System (ADS)

    Vlachos, Dimitrios; Vagenas, Elias C.

    2015-09-01

    The 4th International Conference on Mathematical Modeling in Physical Sciences (IC-MSQUARE) took place in Mykonos, Greece, from Friday 5 June to Monday 8 June 2015. The Conference was attended by more than 150 participants and hosted about 200 oral, poster, and virtual presentations; there were more than 600 pre-registered authors. The 4th IC-MSQUARE consisted of diverse workshops and thus covered the many research fields where mathematical modeling is used, such as Theoretical/Mathematical Physics, Neutrino Physics, Non-Integrable Systems, Dynamical Systems, Computational Nanoscience, Biological Physics, Computational Biomechanics, Complex Networks, Stochastic Modeling, Fractional Statistics, DNA Dynamics, Macroeconomics, etc. The scientific program was intense: after the keynote and invited talks each morning, three parallel oral sessions and one poster session ran every day. Nevertheless, according to all attendees, the high quality of the talks created an innovative and productive scientific environment. We would like to thank the Keynote Speaker and the Invited Speakers for their significant contribution to IC-MSQUARE. We also would like to thank the Members of the International Advisory and Scientific Committees as well as the Members of the Organizing Committee.

  12. A Financial Technology Entrepreneurship Program for Computer Science Students

    ERIC Educational Resources Information Center

    Lawler, James P.; Joseph, Anthony

    2011-01-01

    Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…

  13. Computer Science Teacher Professional Development in the United States: A Review of Studies Published between 2004 and 2014

    ERIC Educational Resources Information Center

    Menekse, Muhsin

    2015-01-01

    While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…

  14. Theoretical Approaches to Political Communication.

    ERIC Educational Resources Information Center

    Chesebro, James W.

    Political communication appears to be emerging as a theoretical and methodological academic area of research within both speech-communication and political science. Five complementary approaches to political science (Machiavellian, iconic, ritualistic, confirmational, and dramatistic) may be viewed as a series of variations which emphasize the…

  15. PandExo: A Community Tool for Transiting Exoplanet Science with JWST & HST

    NASA Astrophysics Data System (ADS)

    Batalha, Natasha E.; Mandell, Avi; Pontoppidan, Klaus; Stevenson, Kevin B.; Lewis, Nikole K.; Kalirai, Jason; Earl, Nick; Greene, Thomas; Albert, Loïc; Nielsen, Louise D.

    2017-06-01

    As we approach the James Webb Space Telescope (JWST) era, several studies have emerged that aim to (1) characterize how the instruments will perform and (2) determine what atmospheric spectral features could theoretically be detected using transmission and emission spectroscopy. To some degree, all these studies have relied on modeling of JWST's theoretical instrument noise. With under two years left until launch, it is imperative that the exoplanet community begin to digest and integrate these studies into their observing plans, as well as think about how to leverage the Hubble Space Telescope (HST) to optimize JWST observations. To encourage this and to allow all members of the community access to JWST & HST noise simulations, we present here an open-source Python package and online interface for creating observation simulations of all observatory-supported time-series spectroscopy modes. This noise simulator, called PandExo, relies on some aspects of the Space Telescope Science Institute's Exposure Time Calculator, Pandeia. We describe PandExo and the formalism for computing noise sources for JWST. We then benchmark PandExo's performance against each instrument team's independently written noise simulator for JWST, and against previous observations for HST. We find that PandExo agrees to within 10% for HST/WFC3 and for all JWST instruments.
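
    For orientation, the floor that any such noise simulator is ultimately benchmarked against is photon (shot) noise. The following minimal sketch (not PandExo itself; the photon rate and durations are made-up numbers) estimates the photon-limited transit-depth uncertainty for one spectral bin:

        import math

        def photon_limited_depth_error(photon_rate, bin_hours):
            # photon_rate: detected photons/s in the bin (hypothetical value);
            # bin_hours: in-transit integration time, assumed matched out of transit.
            n = photon_rate * bin_hours * 3600.0
            sigma_flux = 1.0 / math.sqrt(n)           # fractional error of one sample
            return math.sqrt(2.0) * sigma_flux * 1e6  # depth error in ppm (in - out)

        # e.g. 1e4 photons/s over a 3-hour transit -> roughly 136 ppm
        print(round(photon_limited_depth_error(1e4, 3.0), 1))

    Instrument simulators such as PandExo add detector and systematic noise terms on top of this limit, which is why benchmarking against independent simulators and real HST observations matters.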

  16. Designing for students' science learning using argumentation and classroom debate

    NASA Astrophysics Data System (ADS)

    Bell, Philip Laverne

    1998-12-01

    This research investigates how to design and introduce an educational innovation into a classroom setting to support learning. The research yields cognitive design principles for instruction involving scientific argumentation and debate. Specifically, eighth-grade students used a computer learning environment to construct scientific arguments and to participate in a classroom debate. The instruction was designed to help students integrate their science understanding by debating the question: How far does light go? Does light die out over distance, or does it travel forever until absorbed? This research explores the tension between focusing students' conceptual change on specific scientific phenomena and their development of integrated understanding. I focus on the importance of connecting students' everyday experiences and intuitions to their science learning. The work reported here characterizes how students see the world through a filter of their own understanding. It explores how individual and social mechanisms in instruction support students as they expand the range of ideas under consideration and distinguish between these ideas using scientific criteria. Instruction supported students as they engaged in argumentation and debate on a set of multimedia evidence items from the World-Wide-Web. An argument editor called SenseMaker was designed and studied with the intent of making individual and group thinking visible during instruction. Over multiple classroom trials, different student cohorts were increasingly supported in scientific argumentation involving systematic coordination of evidence with theoretical ideas about light. Students' knowledge representations were used as mediating "learning artifacts" during classroom debate. Two argumentation conditions were investigated. The Full Scope group prepared to defend either theoretical position in the debate. These students created arguments that included more theoretical conjectures and made more conceptual progress in understanding light. The Personal Scope group prepared to defend their original opinion about the debate. These students produced more acausal descriptions of evidence and theorized less in their arguments. Regardless of students' prior knowledge of light, the Full Scope condition resulted in a more integrated understanding. Results from the research were synthesized in design principles geared towards helping future designers. Sharing and refining cognitive design principles offers a productive focus for developing a design science for education.

  17. Computer Science | Classification | College of Engineering & Applied

    Science.gov Websites

    Adrian Dumitrescu, Ph.D., Professor, Computer Science, (414) 229-4265, Eng & Math Sciences 919; Hossein Hosseini, Ph.D., Professor, Computer Science, (414) 229-5184, hosseini@uwm.edu, Eng & Math Sciences 1091; Amol Mali, Ph.D., Associate Professor, Computer Science.

  18. Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?

    ERIC Educational Resources Information Center

    Schrock, John Richard

    1984-01-01

    Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" related to laboratory computer and "extended…

  19. Toward benchmarking in catalysis science: Best practices, challenges, and opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bligaard, Thomas; Bullock, R. Morris; Campbell, Charles T.

    Benchmarking is a community-based and (preferably) community-driven activity involving consensus-based decisions on how to make reproducible, fair, and relevant assessments. In catalysis science, important catalyst performance metrics include activity, selectivity, and the deactivation profile, which enable comparisons between new and standard catalysts. Benchmarking also requires careful documentation, archiving, and sharing of methods and measurements, to ensure that the full value of research data can be realized. Beyond these goals, benchmarking presents unique opportunities to advance and accelerate understanding of complex reaction systems by combining and comparing experimental information from multiple, in situ and operando techniques with theoretical insights derived from calculations characterizing model systems. This Perspective describes the origins and uses of benchmarking and its applications in computational catalysis, heterogeneous catalysis, molecular catalysis, and electrocatalysis. As a result, it also discusses opportunities and challenges for future developments in these fields.

  20. Hill's Heuristics and Explanatory Coherentism in Epidemiology.

    PubMed

    Dammann, Olaf

    2018-01-01

    In this essay, I argue that Ted Poston's theory of explanatory coherentism is well-suited as a tool for causal explanation in the health sciences, particularly in epidemiology. Coherence has not only played a role in epidemiology for more than half a century as one of Hill's viewpoints, it can also provide background theory for the development of explanatory systems by integrating epidemiologic evidence with a diversity of other error-independent data. I propose that computational formalization of Hill's viewpoints in an explanatory coherentist framework would provide an excellent starting point for a formal epistemological (knowledge-theoretical) project designed to improve causal explanation in the health sciences. As an example, I briefly introduce Paul Thagard's ECHO system and offer my responses to possible objections to my proposal.

  1. Connections Matter: Social Networks and Lifespan Health in Primate Translational Models

    PubMed Central

    McCowan, Brenda; Beisner, Brianne; Bliss-Moreau, Eliza; Vandeleest, Jessica; Jin, Jian; Hannibal, Darcy; Hsieh, Fushing

    2016-01-01

    Humans live in societies full of rich and complex relationships that influence health. The ability to improve human health requires a detailed understanding of the complex interplay of biological systems that contribute to disease processes, including the mechanisms underlying the influence of social contexts on these biological systems. A longitudinal computational systems science approach provides methods uniquely suited to elucidate the mechanisms by which social systems influence health and well-being by investigating how they modulate the interplay among biological systems across the lifespan. In the present report, we argue that nonhuman primate social systems are sufficiently complex to serve as model systems allowing for the development and refinement of both analytical and theoretical frameworks linking social life to health. Ultimately, developing systems science frameworks in nonhuman primate models will speed discovery of the mechanisms that subserve the relationship between social life and human health. PMID:27148103

  2. High-fidelity plasma codes for burn physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooley, James; Graziani, Frank; Marinak, Marty

    Accurate predictions of equation of state (EOS) and of ionic and electronic transport properties are of critical importance for high-energy-density plasma science. Transport coefficients inform radiation-hydrodynamic codes and impact diagnostic interpretation, which in turn impacts our understanding of the development of instabilities, the overall energy balance of burning plasmas, and the efficacy of self-heating from charged-particle stopping. Important processes include thermal and electrical conduction, electron-ion coupling, inter-diffusion, ion viscosity, and charged-particle stopping. However, uncertainties in these coefficients are not well established. Fundamental plasma science codes, also called high-fidelity plasma codes, are a relatively recent computational tool that augments both experimental data and the theoretical foundations of transport coefficients. This paper addresses the current status of HFPC codes and their future development, and the potential impact they may have in improving the predictive capability of the multi-physics hydrodynamic codes used in HED design.

  3. Role of theory in space science

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The goal of theory is to understand how the fundamental laws of physics and chemistry give rise to the features of the universe. It is recommended that NASA establish independent theoretical research programs in the planetary sciences and in astrophysics, similar to the solar-system plasma-physics theory program, which is characterized by stable, long-term support for theorists in university departments, NASA centers, and other organizations engaged in research on topics relevant to present and future space-derived data. It is recommended that NASA keep these programs under review to derive full benefit from the resulting research and to assure opportunities for the inflow of new ideas and investigators. Also, provisions should be made by NASA for the computing needs of the theorists in the programs. Finally, it is recommended that NASA involve knowledgeable theorists in mission planning activities at all levels, from the formulation of long-term scientific strategies through the planning and operation of specific missions.

  4. Toward benchmarking in catalysis science: Best practices, challenges, and opportunities

    DOE PAGES

    Bligaard, Thomas; Bullock, R. Morris; Campbell, Charles T.; ...

    2016-03-07

    Benchmarking is a community-based and (preferably) community-driven activity involving consensus-based decisions on how to make reproducible, fair, and relevant assessments. In catalysis science, important catalyst performance metrics include activity, selectivity, and the deactivation profile, which enable comparisons between new and standard catalysts. Benchmarking also requires careful documentation, archiving, and sharing of methods and measurements, to ensure that the full value of research data can be realized. Beyond these goals, benchmarking presents unique opportunities to advance and accelerate understanding of complex reaction systems by combining and comparing experimental information from multiple, in situ and operando techniques with theoretical insights derived from calculations characterizing model systems. This Perspective describes the origins and uses of benchmarking and its applications in computational catalysis, heterogeneous catalysis, molecular catalysis, and electrocatalysis. As a result, it also discusses opportunities and challenges for future developments in these fields.

  5. Final Report on DOE Project entitled Dynamic Optimized Advanced Scheduling of Bandwidth Demands for Large-Scale Science Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramamurthy, Byravamurthy

    2014-05-05

    In this project, we developed scheduling frameworks for dynamic bandwidth demands of large-scale science applications. Apart from theoretical approaches such as Integer Linear Programming, Tabu Search, and Genetic Algorithm heuristics, we utilized practical data from the ESnet OSCARS project (from our DOE lab partners) to conduct realistic simulations of our approaches. We disseminated our work through conference paper presentations, journal papers, and a book chapter. In this project we addressed the problem of scheduling lightpaths over optical wavelength division multiplexed (WDM) networks, publishing several conference and journal papers on this topic. We also addressed the problem of joint allocation of computing, storage, and networking resources in Grid/Cloud networks and proposed energy-efficient mechanisms for operating optical WDM networks.
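
    To make the scheduling problem concrete, the sketch below shows a first-fit heuristic for time-windowed bandwidth (lightpath) demands; it is only an illustration of the problem setting, not one of the project's ILP, Tabu Search, or Genetic Algorithm methods, and the demand format is an assumption:

        def first_fit_schedule(demands, n_wavelengths):
            # demands: list of (start, end, set_of_links); two demands conflict on
            # a wavelength if they share a link and their time windows overlap.
            assigned = [[] for _ in range(n_wavelengths)]  # bookings per wavelength
            result = []
            for start, end, links in demands:
                chosen = None
                for w in range(n_wavelengths):
                    conflict = any(s < end and start < e and links & l
                                   for s, e, l in assigned[w])
                    if not conflict:
                        chosen = w
                        assigned[w].append((start, end, links))
                        break
                result.append(chosen)  # None means the demand is blocked
            return result

        demands = [(0, 10, {'A-B', 'B-C'}), (5, 15, {'B-C'}), (12, 20, {'A-B'})]
        print(first_fit_schedule(demands, n_wavelengths=2))  # -> [0, 1, 0]

    Exact approaches such as Integer Linear Programming optimize over all assignments at once, while heuristics like the one above trade optimality for speed.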

  6. More similarities than differences in contemporary theories of social development?: a plea for theory bridging.

    PubMed

    Leaper, Campbell

    2011-01-01

    Many contemporary theories of social development are similar and/or share complementary constructs. Yet, there have been relatively few efforts toward theoretical integration. The present chapter represents a call for increased theory bridging. The problem of theoretical fragmentation in psychology is reviewed. Seven highlighted reasons for this predicament include differences between behavioral sciences and other sciences, theoretical paradigms as social identities, the uniqueness assumption, information overload, field fixation, linguistic fragmentation, and few incentives for theoretical integration. Afterward, the feasibility of theoretical synthesis is considered. Finally, some possible directions are proposed for theoretical integration among five contemporary theories of social and gender development: social cognitive theory, expectancy-value theory, cognitive-developmental theory, gender schema theory, and self-categorization theory.

  7. Science and Science Fiction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scherrer, Robert

    2006-03-29

    I will explore the similarities and differences between the process of writing science fiction and the process of 'producing' science, specifically theoretical physics. What are the ground rules for introducing unproven new ideas in science fiction, and how do they differ from the corresponding rules in physics? How predictive is science fiction? (For that matter, how predictive is theoretical physics?) I will also contrast the way in which information is presented in science fiction, as opposed to its presentation in scientific papers, and I will examine the relative importance of ideas (as opposed to the importance of the way in which these ideas are presented). Finally, I will discuss whether a background as a research scientist provides any advantage in writing science fiction.

  8. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April, 1986 through September 30, 1986 is summarized.

  9. 78 FR 10180 - Annual Computational Science Symposium; Conference

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-13

    ...] Annual Computational Science Symposium; Conference AGENCY: Food and Drug Administration, HHS. ACTION... Computational Science Symposium.'' The purpose of the conference is to help the broader community align and share experiences to advance computational science. At the conference, which will bring together FDA...

  10. Aqueous Cation-Amide Binding: Free Energies and IR Spectral Signatures by Ab Initio Molecular Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pluharova, Eva; Baer, Marcel D.; Mundy, Christopher J.

    2014-07-03

    Understanding specific ion effects on proteins remains a considerable challenge. N-methylacetamide serves as a useful proxy for the protein backbone that can be well characterized both experimentally and theoretically. The spectroscopic signatures in the amide I band reflecting the strength of the interaction of alkali cations and alkali earth dications with the carbonyl group remain difficult to assign and controversial to interpret. Herein, we directly compute the IR shifts corresponding to the binding of either sodium or calcium to aqueous N-methylacetamide using ab initio molecular dynamics simulations. We show that the two cations interact with aqueous N-methylacetamide with different affinities and in different geometries. Since sodium exhibits a weak interaction with the carbonyl group, the resulting amide I band is similar to that of an unperturbed carbonyl group undergoing aqueous solvation. In contrast, the stronger calcium binding results in a clear IR shift with respect to N-methylacetamide in pure water. Support from the Czech Ministry of Education (grant LH12001) is gratefully acknowledged. EP thanks the International Max-Planck Research School for support and the Alternative Sponsored Fellowship program at Pacific Northwest National Laboratory (PNNL). PJ acknowledges the Praemium Academie award from the Academy of Sciences. Calculations of the free energy profiles were made possible through a generous allocation of computer time from the North-German Supercomputing Alliance (HLRN). Calculations of vibrational spectra were performed in part using the computational resources of the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. This work was supported by National Science Foundation grant CHE-0431312. CJM is supported by the U.S. Department of Energy's (DOE) Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. PNNL is operated for the Department of Energy by Battelle. MDB is grateful for the support of the Linus Pauling Distinguished Postdoctoral Fellowship Program at PNNL.

  11. Participatory Design of Human-Centered Cyberinfrastructure (Invited)

    NASA Astrophysics Data System (ADS)

    Pennington, D. D.; Gates, A. Q.

    2010-12-01

    Cyberinfrastructure, by definition, is about people sharing resources to achieve outcomes that cannot be reached independently. CI depends not just on creating discoverable resources, or tools that allow those resources to be processed, integrated, and visualized -- but on human activation of flows of information across those resources. CI must be centered on human activities. Yet for those CI projects that are directed towards observational science, there are few models for organizing collaborative research in ways that align individual research interests into a collective vision of CI-enabled science. Given that the emerging technologies are themselves expected to change the way science is conducted, it is not simply a matter of conducting requirements analysis on how scientists currently work, or building consensus among the scientists on what is needed. Developing effective CI depends on generating a new, creative vision of problem solving within a community based on computational concepts that are, in some cases, still very abstract and theoretical. The computer science theory may (or may not) be well formalized, but the potential for impact on any particular domain is typically ill-defined. In this presentation we will describe approaches being developed and tested at the CyberShARE Center of Excellence at the University of Texas at El Paso for ill-structured problem solving within cross-disciplinary teams of scientists and computer scientists working on data-intensive environmental science and geoscience. These approaches deal with the challenges associated with sharing and integrating knowledge across disciplines; the challenges of developing effective teamwork skills in a culture that favors independent effort; and the challenges of evolving shared, focused research goals from ill-structured, vague starting points - all issues that must be confronted by every interdisciplinary CI project. We will introduce visual and semantic-based tools that can enable the collaborative research design process and illustrate their application in designing and developing useful end-to-end data solutions for scientists. Lastly, we will outline areas of future investigation within CyberShARE that we believe have the potential for high impact.

  12. Computational design of materials for solar hydrogen generation

    NASA Astrophysics Data System (ADS)

    Umezawa, Naoto

    Photocatalysis has great potential for the production of hydrogen from aqueous solution under solar light. In this talk, two different approaches toward the computational design of materials for solar hydrogen generation will be presented. Tin (Sn), which has two major oxidation states, Sn2+ and Sn4+, is abundant in the earth's crust. Recently, a visible-light-responsive photocatalytic H2 evolution reaction was identified over the mixed-valence tin oxide Sn3O4. We have carried out crystal structure prediction for mixed-valence tin oxides of different atomic compositions under ambient pressure conditions using advanced computational methods based on evolutionary crystal-structure search and density-functional theory. The predicted novel crystal structures realize the desirable band gaps and band edge positions for H2 evolution under visible light irradiation. It is concluded that multivalent tin oxides have great potential as abundant, cheap, and environmentally benign photofunctional materials for solar-energy conversion. Transition metal doping is effective for sensitizing SrTiO3 (STO) under visible light. We have theoretically investigated the roles of doped Cr in STO based on hybrid density-functional calculations. Cr atoms preferentially substitute for Ti under any equilibrium growth conditions. The lower oxidation state Cr3+, which is stabilized under n-type conditions in STO, is found to be advantageous for photocatalytic performance. It is further predicted that lanthanum is the best codopant for stabilizing the favorable oxidation state, Cr3+. The prediction was validated by our experiments: La and Cr co-doped STO shows the best performance among the examined samples. This work was supported by the Japan Science and Technology Agency (JST) Precursory Research for Embryonic Science and Technology (PRESTO) and the International Research Fellow program of the Japan Society for the Promotion of Science (JSPS) through project P14207.
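
    The screening criterion implied here can be written down compactly: a photocatalyst for overall water splitting needs its conduction-band minimum above the H+/H2 level and its valence-band maximum below the O2/H2O level, with a band gap small enough for visible-light absorption. A minimal filter in that spirit is sketched below (the candidate band edges are invented numbers, not results from the talk):

        # Standard water redox levels at pH 0 on an absolute (vacuum) scale.
        H2_LEVEL = -4.44   # eV, H+/H2 reduction potential
        O2_LEVEL = -5.67   # eV, O2/H2O oxidation potential

        def suitable_for_water_splitting(vbm, cbm, max_gap=3.0):
            # Band edges must straddle both redox levels; the gap must exceed the
            # 1.23 eV water-splitting minimum yet stay small enough (<= max_gap)
            # for a visible-light response.
            gap = cbm - vbm
            return cbm > H2_LEVEL and vbm < O2_LEVEL and 1.23 <= gap <= max_gap

        # Hypothetical candidates: name -> (VBM, CBM) in eV vs vacuum.
        candidates = {'oxide-A': (-6.1, -4.0), 'oxide-B': (-5.2, -3.9),
                      'oxide-C': (-7.5, -3.2)}
        for name, (vbm, cbm) in candidates.items():
            print(name, suitable_for_water_splitting(vbm, cbm))  # True only for oxide-A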

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hules, John

    This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) reviews the year's work in the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.

  14. ISTP Science Data Systems and Products

    NASA Astrophysics Data System (ADS)

    Mish, William H.; Green, James L.; Reph, Mary G.; Peredo, Mauricio

    1995-02-01

    The International Solar-Terrestrial Physics (ISTP) program will provide simultaneous coordinated scientific measurements from most of the major areas of geospace, including specific locations on the Earth's surface. This paper describes the comprehensive ISTP ground science data handling system, which has been developed to promote optimal mission planning and efficient data processing, analysis, and distribution. The essential components of this ground system are the ISTP Central Data Handling Facility (CDHF), the Information Processing Division's Data Distribution Facility (DDF), the ISTP/Global Geospace Science (GGS) Science Planning and Operations Facility (SPOF), and the NASA Data Archive and Distribution Service (NDADS). The ISTP CDHF is the one place in the program where measurements from this wide variety of geospace and ground-based instrumentation and theoretical studies are brought together. Subsequently, these data will be distributed, along with ancillary data, in a unified fashion to the ISTP Principal Investigator (PI) and Co-Investigator (CoI) teams for analysis on their local systems. The CDHF ingests the telemetry streams, orbit, attitude, and command history from the GEOTAIL, WIND, POLAR, SOHO, and IMP-8 spacecraft; computes summary data sets, called Key Parameters (KPs), for each scientific instrument; ingests pre-computed KPs from other spacecraft and ground-based investigations; provides a computational platform for parameterized modeling; and provides a number of "data services" for the ISTP community of investigators. The DDF organizes the KPs, decommutated telemetry, and associated ancillary data into products for distribution to the ISTP community on CD-ROMs. The SPOF is the component of the GGS program responsible for the development and coordination of ISTP science planning operations. The SPOF operates under the direction of the ISTP Project Scientist and is responsible for the development and coordination of the science plan for ISTP spacecraft. Instrument command requests for the WIND and POLAR investigations are submitted by the PIs to the SPOF, where they are checked for science conflicts, forwarded to the GSFC Command Management System/Payload Operations Control Center (CMS/POCC) for engineering conflict validation, and finally incorporated into the conflict-free science operations plan. Conflict resolution is accomplished through iteration between the PIs, SPOF, and CMS, and in consultation with the Project Scientist when necessary. The long-term archiving of ISTP KP and level-zero data will be undertaken by NASA's National Space Science Data Center using the NASA Data Archive and Distribution Service (NDADS). This on-line archive facility will provide rapid access to archived KPs and event data and includes security features to restrict access to the data during the time they are proprietary.

  15. Citizen social science: a methodology to facilitate and evaluate workplace learning in continuing interprofessional education.

    PubMed

    Dadich, Ann

    2014-05-01

    Workplace learning in continuing interprofessional education (CIPE) can be difficult to facilitate and evaluate, which can create a number of challenges for this type of learning. This article presents an innovative method to foster and investigate workplace learning in CIPE - citizen social science. Citizen social science involves clinicians as co-researchers in the systematic examination of social phenomena. When facilitated by an open-source online social networking platform, clinicians can participate via computer, smartphone, or tablet in ways that suit their needs and preferences. Furthermore, as co-researchers they can help to reveal the dynamic interplay that facilitates workplace learning in CIPE. Although yet to be tested, citizen social science offers four potential benefits: it recognises and accommodates the complexity of workplace learning in CIPE; it has the capacity to both foster and evaluate the phenomena; it can be used in situ, capturing and having direct relevance to the complexity of the workplace; and by advancing both theoretical and methodological debates on CIPE, it may reveal opportunities to improve and sustain workplace learning. By describing an example situated in the youth health sector, this article demonstrates how these benefits might be realised.

  16. Theoretical Problems in Materials Science

    NASA Technical Reports Server (NTRS)

    Langer, J. S.; Glicksman, M. E.

    1985-01-01

    Interactions between theoretical physics and materials science are presented, with the aim of identifying problems of common interest in which some of the powerful theoretical approaches developed for other branches of physics may be applied to problems in materials science. A unique structure was identified in rapidly quenched Al-14% Mn. The material has long-range directed bonds with icosahedral symmetry which does not form a regular structure but instead forms an amorphous-like quasiperiodic structure. A theory of finite volume fractions of second-phase material is advanced and coupled with nucleation theory to describe the formation and structure of precipitating phases in alloys. Application of the theory of pattern formation to the problem of dendrite formation is studied.

  17. Numerical characteristics of quantum computer simulation

    NASA Astrophysics Data System (ADS)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

    The simulation of quantum circuits is significantly important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus, the use of modern high-performance parallel computation is essential. As is well known, arbitrary quantum computation in the circuit model can be performed using only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate how the unique properties of quantum mechanics lead to the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while, on the other hand, quantum entanglement leads to a problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual information graph) and experimental (locality and memory access, scalability, and more specific dynamic characteristics) parts. The experimental part was carried out on the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for the research and testing of development methods for data-intensive parallel software, and the considered methodology of analysis can be successfully used for the improvement of algorithms in quantum information science.
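
    As a minimal sketch of the state-vector update whose structure the paper analyzes (an illustration of the standard technique, not the authors' code; the qubit-ordering convention is an assumption), applying a single-qubit gate updates independent pairs of amplitudes, but the two amplitudes of a pair lie 2**target positions apart in memory, which is the locality problem mentioned above:

        import numpy as np

        def apply_single_qubit_gate(state, gate, target, n_qubits):
            # View the state vector so that axis 1 indexes the target qubit
            # (qubit 0 is taken as the least-significant bit of the index).
            psi = state.reshape((2**(n_qubits - target - 1), 2, 2**target))
            # Every (i, :, k) amplitude pair is updated independently (the source
            # of the "quantum parallelism" in the simulation), while large values
            # of `target` give a large memory stride (poor locality).
            psi[:] = np.einsum('ab,ibk->iak', gate, psi)
            return state

        # Hadamard on qubit 0 of a 3-qubit register initialized to |000>.
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        state = np.zeros(2**3, dtype=complex)
        state[0] = 1.0
        apply_single_qubit_gate(state, H, target=0, n_qubits=3)
        print(state.round(3))  # amplitude ~0.707 on |000> and |001>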

  18. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    ERIC Educational Resources Information Center

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  19. A Web of Resources for Introductory Computer Science.

    ERIC Educational Resources Information Center

    Rebelsky, Samuel A.

    As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…

  20. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April l, 1988 through September 30, 1988.

  1. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  2. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.

  3. High school computer science education paves the way for higher education: the Israeli case

    NASA Astrophysics Data System (ADS)

    Armoni, Michal; Gal-Ezer, Judith

    2014-07-01

    The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to computer science in high school and pursuing computing in higher education. We also examine the gender gap, in the context of high school computer science education. We show that in Israel, students who took the high-level computer science matriculation exam were more likely to pursue computing in higher education. Regarding the issue of gender, we will show that, in general, in Israel the difference between males and females who take computer science in high school is relatively small, and a larger, though still not very large difference exists only for the highest exam level. In addition, exposing females to high-level computer science in high school has more relative impact on pursuing higher education in computing.

  4. Exploring How Globalization Shapes Education: Methodology and Theoretical Framework

    ERIC Educational Resources Information Center

    Pan, Su-Yan

    2010-01-01

    This is a commentary on some major issues raised in Carter and Dediwalage's "Globalisation and science education: The case of "Sustainability by the bay"" (this issue), particularly their methodology and theoretical framework for understanding how globalisation shapes education (including science education). While acknowledging the authors'…

  5. Van Kampen Colimits as Bicolimits in Span

    NASA Astrophysics Data System (ADS)

    Heindel, Tobias; Sobociński, Paweł

    The exactness properties of coproducts in extensive categories and pushouts along monos in adhesive categories have found various applications in theoretical computer science, e.g. in program semantics, data type theory and rewriting. We show that these properties can be understood as a single universal property in the associated bicategory of spans. To this end, we first provide a general notion of Van Kampen cocone that specialises to the above colimits. The main result states that Van Kampen cocones can be characterised as exactly those diagrams in ℂ that induce bicolimit diagrams in the bicategory of spans Span_ℂ, provided that ℂ has pullbacks and enough colimits.
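    For readers less familiar with the construction, the bicategory of spans can be sketched in standard notation (ours, not the authors'): a span is a pair of arrows out of a common apex, and spans compose by pullback, which is associative only up to isomorphism.

    \[ A \xleftarrow{\;f\;} S \xrightarrow{\;g\;} B \qquad \text{(a span from } A \text{ to } B \text{ in } \mathbb{C}\text{)} \]
    \[ (A \xleftarrow{f} S \xrightarrow{g} B) \,;\, (B \xleftarrow{h} T \xrightarrow{k} C) \;=\; A \xleftarrow{f \circ p} S \times_B T \xrightarrow{k \circ q} C , \]
    \[ \text{where } p : S \times_B T \to S \text{ and } q : S \times_B T \to T \text{ are the pullback projections of } S \xrightarrow{g} B \xleftarrow{h} T . \]

    This up-to-isomorphism composition is exactly why Span_ℂ is a bicategory rather than an ordinary category, and why the theorem is phrased in terms of bicolimits.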

  6. The Science and Technology of the US National Missile Defense System

    NASA Astrophysics Data System (ADS)

    Postol, Theodore A.

    2010-03-01

    The National Missile Defense System utilizes UHF and X-band radars for search, track and discrimination, and interceptors that use long-wave infrared sensors to identify and home on attacking warheads. The radars and infrared sensors in the missile defense system perform near the theoretical limits predicted by physics. However, in spite of these fantastic technical advances in sensor technology, signal processing, and computational support functions, the National Missile Defense System cannot be expected to ever work in realistic combat environments. This talk will describe why these impressive technologies can never deliver on the promise of a credible defense against long-range ballistic missiles.

  7. A practice course to cultivate students' comprehensive ability of photoelectricity

    NASA Astrophysics Data System (ADS)

    Lv, Yong; Liu, Yang; Niu, Chunhui; Liu, Lishuang

    2017-08-01

    After completing numerous theoretical courses, it is important and urgent for students majoring in optoelectronic information science and engineering to cultivate a comprehensive ability in photoelectricity. We set up a comprehensive practice course named "Integrated Design of Optoelectronic Information System" (IDOIS) so that students can integrate their knowledge of optics, electronics and computer programming to design, install and debug an optoelectronic system with independent functions. Eight years of practice show that this course trains students' ability to analyze, design/develop and debug photoelectric systems, and improves their abilities in document retrieval, design-proposal and summary-report writing, teamwork, innovation consciousness and skill.

  8. Response function of modulated grid Faraday cup plasma instruments

    NASA Technical Reports Server (NTRS)

    Barnett, A.; Olbert, S.

    1986-01-01

    Modulated grid Faraday cup plasma analyzers are a very useful tool for making in situ measurements of space plasmas. One of their great attributes is that their simplicity permits their angular response function to be calculated theoretically. An expression is derived for this response function by computing the trajectories of the charged particles inside the cup. The Voyager plasma science experiment is used as a specific example. Two approximations to the rigorous response function useful for data analysis are discussed. Multisensor analysis of solar wind data indicates that the formulas represent the true cup response function for all angles of incidence with a maximum error of only a few percent.
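    The trajectory-based derivation can be illustrated with a toy geometric model (a sketch of ours, ignoring the modulated grid and all field effects, not the authors' rigorous calculation): a parallel beam entering a cylindrical cup at angle theta travels in straight lines, and the angular response is the fraction of entering particles that land on the collector.

    import numpy as np

    def angular_response(theta_deg, r_aperture=1.0, r_collector=1.5,
                         depth=2.0, n_particles=100_000, seed=0):
        """Monte Carlo estimate of a Faraday cup's geometric angular response.

        Particles arrive as a parallel beam at angle theta from the cup axis,
        enter uniformly over the circular aperture, and drift in straight
        lines to the collector plane (no grids, no fields).
        """
        rng = np.random.default_rng(seed)
        # Sample entry points uniformly over the aperture disk.
        r = r_aperture * np.sqrt(rng.random(n_particles))
        phi = 2 * np.pi * rng.random(n_particles)
        x0, y0 = r * np.cos(phi), r * np.sin(phi)
        # Straight-line drift shifts x by depth * tan(theta).
        x1, y1 = x0 + depth * np.tan(np.radians(theta_deg)), y0
        # A particle is collected if it lands on the collector disk.
        collected = x1**2 + y1**2 <= r_collector**2
        return collected.mean()

    for theta in (0, 10, 20, 30, 40):
        print(f"theta = {theta:2d} deg -> response ~ {angular_response(theta):.3f}")

    All dimensions here are made-up parameters; the point is only that the response falls off geometrically with angle, which is the quantity the paper derives rigorously.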

  9. Nicholas Brunhart-Lupo | NREL

    Science.gov Websites

    Nicholas Brunhart-Lupo, Computational Science, NREL. Education: Ph.D., Computer Science, Colorado School of Mines; M.S., Computer Science, University of Queensland; B.S., Computer Science, Colorado School of Mines. Contact: Nicholas.Brunhart-Lupo@nrel.gov

  10. The Need for Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Bernier, David

    2011-01-01

    Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…

  11. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  12. Proof test of the computer program BUCKY for plasticity problems

    NASA Technical Reports Server (NTRS)

    Smith, James P.

    1994-01-01

    A theoretical equation describing the elastic-plastic deformation of a cantilever beam subject to a constant pressure is developed. The theoretical result is compared numerically to the computer program BUCKY for the case of an elastic-perfectly plastic specimen. It is shown that the theoretical and numerical results compare favorably in the plastic range. Comparisons are made to another research code to further validate the BUCKY results. This paper serves as a quality test for the computer program BUCKY developed at NASA Johnson Space Center.
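    As background for the comparison, the standard elastic-perfectly-plastic relations for a rectangular cross-section (textbook formulas, not necessarily the exact equation developed in the paper) are:

    \[ M_y = \frac{\sigma_y b h^2}{6}, \qquad M_p = \frac{\sigma_y b h^2}{4} = \tfrac{3}{2} M_y , \]
    \[ M(x) = \frac{q x^2}{2}, \qquad 0 \le x \le L , \]

    where b and h are the section width and height, sigma_y is the yield stress, q is the load per unit length (pressure times beam width), and x is measured from the free end of the cantilever; first yield occurs at the root when qL^2/2 = M_y, and a plastic hinge forms there when qL^2/2 reaches M_p.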

  13. Computer program for assessing the theoretical performance of a three dimensional inlet

    NASA Technical Reports Server (NTRS)

    Agnone, A. M.; Kung, F.

    1972-01-01

    A computer program for determining the theoretical performance of a three dimensional inlet is presented. An analysis for determining the capture area, ram force, spillage force, and surface pressure force is presented, along with the necessary computer program. A sample calculation is also included.

  14. Alliance for Computational Science Collaboration HBCU Partnership at Fisk University. Final Report 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, W. E.

    2004-08-16

    Computational Science plays a big role in research and development in mathematics, science, engineering and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCU) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduate and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one to a master's degree program and two to doctoral degree programs).

  15. PREFACE: 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics & 38th National Conference on Theoretical Physics

    NASA Astrophysics Data System (ADS)

    2014-09-01

    This volume contains selected papers presented at the 38th National Conference on Theoretical Physics (NCTP-38) and the 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics (IWTCP-1). Both the conference and the workshop were held from 29 July to 1 August 2013 at the Pullman Hotel, Da Nang, Vietnam. The IWTCP-1 was a new activity of the Vietnamese Theoretical Physics Society (VTPS), organized in association with NCTP-38, the most well-known annual scientific forum dedicated to the dissemination of the latest developments in the field of theoretical physics within the country. The IWTCP-1 was also an External Activity of the Asia Pacific Center for Theoretical Physics (APCTP). The overriding goal of the IWTCP is to provide an international forum for scientists and engineers from academia to share ideas, problems and solutions relating to recent advances in theoretical physics as well as in computational physics. The main IWTCP motivation is to foster scientific exchange between the Vietnamese theoretical and computational physics community and worldwide scientists, as well as to promote a high standard of research and education activities for young physicists in the country. About 110 participants from 10 countries attended the conference and the workshop. Four invited talks, 18 oral contributions and 46 posters were presented at the conference. In the workshop we had one keynote lecture and nine invited talks presented by international experts in the fields of theoretical and computational physics, together with 14 oral and 33 poster contributions. The proceedings were edited by Nguyen Tri Lan, Trinh Xuan Hoang, and Nguyen Ai Viet. We would like to thank all invited speakers, participants and sponsors for making the conference and the workshop successful. Nguyen Ai Viet, Chair of NCTP-38 and IWTCP-1

  16. Sequential visibility-graph motifs

    NASA Astrophysics Data System (ADS)

    Iacovacci, Jacopo; Lacasa, Lucas

    2016-04-01

    Visibility algorithms transform time series into graphs and encode dynamical information in their topology, paving the way for graph-theoretical time series analysis as well as building a bridge between nonlinear dynamics and network science. In this work we introduce and study the concept of sequential visibility-graph motifs, smaller substructures of n consecutive nodes that appear with characteristic frequencies. We develop a theory to compute in an exact way the motif profiles associated with general classes of deterministic and stochastic dynamics. We find that this simple property is indeed a highly informative and computationally efficient feature capable of distinguishing among different dynamics and robust against noise contamination. We finally confirm that it can be used in practice to perform unsupervised learning, by extracting motif profiles from experimental heart-rate series and being able, accordingly, to disentangle meditative from other relaxation states. Applications of this general theory include the automatic classification and description of physical, biological, and financial time series.
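    The construction is concrete enough to state in a few lines. Under the natural visibility criterion, samples i < j see each other when every intermediate sample lies strictly below the straight line joining them, and a sequential motif of size n is the visibility pattern inside a window of n consecutive samples. A minimal sketch of the motif-profile computation (our illustration of the published definition; function names are ours):

    import itertools
    import numpy as np

    def visible(x, i, j):
        """Natural visibility criterion between samples i < j of series x."""
        return all(
            x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i)
            for k in range(i + 1, j)
        )

    def motif_profile(x, n=4):
        """Frequency profile of sequential visibility-graph motifs of size n.

        Consecutive samples always see each other, so a motif is determined
        by the visibility of the non-adjacent pairs inside the window.
        """
        pairs = [(a, b) for a, b in itertools.combinations(range(n), 2) if b - a > 1]
        counts = {}
        for start in range(len(x) - n + 1):
            key = tuple(visible(x, start + a, start + b) for a, b in pairs)
            counts[key] = counts.get(key, 0) + 1
        total = sum(counts.values())
        return {k: v / total for k, v in counts.items()}

    # White noise and a pure linear trend yield very different profiles.
    rng = np.random.default_rng(1)
    print(motif_profile(rng.random(2000)))
    print(motif_profile(np.arange(100.0)))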

  17. Multi-scale computation methods: Their applications in lithium-ion battery research and development

    NASA Astrophysics Data System (ADS)

    Siqi, Shi; Jian, Gao; Yue, Liu; Yan, Zhao; Qu, Wu; Wangwei, Ju; Chuying, Ouyang; Ruijuan, Xiao

    2016-01-01

    Based upon advances in theoretical algorithms, modeling and simulation, and computer technologies, the rational design of materials, cells, devices, and packs in the field of lithium-ion batteries is being realized incrementally; by combining calculations and experiments linked through a large shared database, it will at some point trigger a paradigm shift, enabling accelerated development of the whole industrial chain. Theory and multi-scale modeling and simulation, as supplements to experimental efforts, can help greatly to close some of the current experimental and technological gaps, as well as to predict path-independent properties and to build fundamental understanding of path-independent performance across multiple spatial and temporal scales. Project supported by the National Natural Science Foundation of China (Grant Nos. 51372228 and 11234013), the National High Technology Research and Development Program of China (Grant No. 2015AA034201), and the Shanghai Pujiang Program, China (Grant No. 14PJ1403900).

  18. Network neuroscience

    PubMed Central

    Bassett, Danielle S; Sporns, Olaf

    2017-01-01

    Despite substantial recent progress, our understanding of the principles and mechanisms underlying complex brain function and cognition remains incomplete. Network neuroscience proposes to tackle these enduring challenges. Approaching brain structure and function from an explicitly integrative perspective, network neuroscience pursues new ways to map, record, analyze and model the elements and interactions of neurobiological systems. Two parallel trends drive the approach: the availability of new empirical tools to create comprehensive maps and record dynamic patterns among molecules, neurons, brain areas and social systems; and the theoretical framework and computational tools of modern network science. The convergence of empirical and computational advances opens new frontiers of scientific inquiry, including network dynamics, manipulation and control of brain networks, and integration of network processes across spatiotemporal domains. We review emerging trends in network neuroscience and attempt to chart a path toward a better understanding of the brain as a multiscale networked system. PMID:28230844

  19. REVIEW ARTICLE: The next 50 years of the SI: a review of the opportunities for the e-Science age

    NASA Astrophysics Data System (ADS)

    Foster, Marcus P.

    2010-12-01

    The International System of Units (SI) was declared as a practical and evolving system in 1960 and is now 50 years old. A large amount of theoretical and experimental work has been conducted to change the standards for the base units from artefacts to physical constants, to improve their stability and reproducibility. Less attention, however, has been paid to improving the SI definitions, utility and usability, which suffer from contradictions, ambiguities and inconsistencies. While humans can often resolve these issues contextually, computers cannot. As an ever-increasing volume and proportion of data about physical quantities is collected, exchanged, processed and rendered by computers, this paper argues that the SI definitions, symbols and syntax should be made more rigorous, so they can be represented wholly and unambiguously in ontologies, programs, data and text, and so the SI notation can be rendered faithfully in print and on screen.
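    One concrete step in the direction the review argues for is to attach the seven SI base-unit exponents to every stored value, so that incommensurable operations become machine-detectable instead of relying on human context. A minimal sketch (our illustration; the paper argues for rigorous definitions and ontologies, not this particular encoding):

    from dataclasses import dataclass

    # Exponents of the seven SI base units: (m, kg, s, A, K, mol, cd).
    @dataclass(frozen=True)
    class Quantity:
        value: float
        dims: tuple

        def __mul__(self, other):
            return Quantity(self.value * other.value,
                            tuple(a + b for a, b in zip(self.dims, other.dims)))

        def __truediv__(self, other):
            return Quantity(self.value / other.value,
                            tuple(a - b for a, b in zip(self.dims, other.dims)))

        def __add__(self, other):
            # Only commensurable quantities may be added.
            if self.dims != other.dims:
                raise ValueError("incommensurable quantities")
            return Quantity(self.value + other.value, self.dims)

    METRE = Quantity(1.0, (1, 0, 0, 0, 0, 0, 0))
    SECOND = Quantity(1.0, (0, 0, 1, 0, 0, 0, 0))

    speed = Quantity(3.0, (0,) * 7) * METRE / SECOND   # 3 m/s
    print(speed.dims)           # (1, 0, -1, 0, 0, 0, 0)
    # speed + SECOND            # would raise ValueError: incommensurable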

  20. Curricular Influences on Female Afterschool Facilitators' Computer Science Interests and Career Choices

    NASA Astrophysics Data System (ADS)

    Koch, Melissa; Gorges, Torie

    2016-10-01

    Underrepresented populations such as women, African-Americans, and Latinos/as often come to STEM (science, technology, engineering, and mathematics) careers by less traditional paths than White and Asian males. To better understand how and why women might shift toward STEM, particularly computer science, careers, we investigated the education and career direction of afterschool facilitators, primarily women of color in their twenties and thirties, who taught Build IT, an afterschool computer science curriculum for middle school girls. Many of these women indicated that implementing Build IT had influenced their own interest in technology and computer science and in some cases had resulted in their intent to pursue technology and computer science education. We wanted to explore the role that teaching Build IT may have played in activating or reactivating interest in careers in computer science and to see whether in the years following implementation of Build IT, these women pursued STEM education and/or careers. We reached nine facilitators who implemented the program in 2011-12 or shortly after. Many indicated that while facilitating Build IT, they learned along with the participants, increasing their interest in and confidence with technology and computer science. Seven of the nine participants pursued further STEM or computer science learning or modified their career paths to include more of a STEM or computer science focus. Through interviews, we explored what aspects of Build IT influenced these facilitators' interest and confidence in STEM and when relevant their pursuit of technology and computer science education and careers.

  1. The NASA computer science research program plan

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R&D, Space R&T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified as key areas.

  2. Measuring Primary Teachers' Attitudes Toward Teaching Science: Development of the Dimensions of Attitude Toward Science (DAS) Instrument

    NASA Astrophysics Data System (ADS)

    van Aalderen-Smeets, Sandra; Walma van der Molen, Juliette

    2013-03-01

    In this article, we present a valid and reliable instrument which measures the attitude of in-service and pre-service primary teachers toward teaching science, called the Dimensions of Attitude Toward Science (DAS) Instrument. Attention to the attitudes of primary teachers toward teaching science is of fundamental importance to the professionalization of these teachers in the field of primary science education. With the development of this instrument, we sought to fulfill the need for a statistically and theoretically valid and reliable instrument to measure pre-service and in-service teachers' attitudes. The DAS Instrument is based on a comprehensive theoretical framework for attitude toward (teaching) science. After pilot testing, the DAS was revised and subsequently validated using a large group of respondents (pre-service and in-service primary teachers) (N = 556). The theoretical underpinning of the DAS combined with the statistical data indicate that the DAS possesses good construct validity and that it proves to be a promising instrument that can be utilized for research purposes, and also as a teacher training and coaching tool. This instrument can therefore make a valuable contribution to progress within the field of science education.

  3. On teaching computer ethics within a computer science department.

    PubMed

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  4. Computational Science News | Computational Science | NREL

    Science.gov Websites

    News headlines from this page include "…-Cooled High-Performance Computing Technology at the ESIF" (February 28, 2018) and "NREL Launches New Website for High-Performance Computing System Users": the National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) system.

  5. Tilted Axis Rotation of 57Mn in Covariant Density Functional Theory

    NASA Astrophysics Data System (ADS)

    Peng, Jing; Xu, Wen-Qiang

    2016-01-01

    Supported by the National Natural Science Foundation of China under Grant No 11461141002, and the Open Project Program of State Key Laboratory of Theoretical Physics of Institute of Theoretical Physics of Chinese Academy of Sciences under Grant No Y4KF041CJ1.

  6. Initiation Mechanism of Kinesin’s Neck Linker Docking Process

    NASA Astrophysics Data System (ADS)

    Geng, Yi-Zhao; Zhang, Hui; Lyu, Gang; Ji, Qing

    2017-02-01

    Supported by the National Natural Science Foundation of China under Grant Nos 11545014 and 11605038, and the Open Project Program of State Key Laboratory of Theoretical Physics of Institute of Theoretical Physics of Chinese Academy of Sciences under Grant No Y5KF211CJ1.

  7. Stability Analysis of Finite Difference Approximations to Hyperbolic Systems, and Problems in Applied and Computational Matrix Theory

    DTIC Science & Technology

    1988-07-08

    Publication list (fragment): An Introduction to Pascal and Precalculus (Marcus and C. Baczynski), Computer Science Press, Rockville, Maryland, 1986.

  8. Empirical Determination of Competence Areas to Computer Science Education

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter; Seitz, Cornelia

    2014-01-01

    The authors discuss empirically determined competence areas for K-12 computer science education, emphasizing the cognitive level of competence. The results of a questionnaire with 120 professors of computer science serve as a database. By using multi-dimensional scaling and cluster analysis, four competence areas for computer science education…

  9. Factors Influencing Exemplary Science Teachers' Levels of Computer Use

    ERIC Educational Resources Information Center

    Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen

    2011-01-01

    The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…

  10. Preparing Future Secondary Computer Science Educators

    ERIC Educational Resources Information Center

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  11. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples of this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual-investigator approach has evolved into teams of scientists from different disciplines working side by side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for computational science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers, while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. The disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics, to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a baseline employing Common Component Architectures, and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by the TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20-year facilities plan is driven by new science. High-performance computing is placed among the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have petaflop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal: that of scientific discovery. Science does not stand still, and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science-based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.

  12. Use and Acceptance of Information and Communication Technology Among Laboratory Science Students

    NASA Astrophysics Data System (ADS)

    Barnes, Brenda C.

    Online and blended learning platforms are being promoted within laboratory science education under the assumption that students have the necessary skills to navigate online and blended learning environments. Yet little research has examined the use of information and communication technology (ICT) among the laboratory science student population. The purpose of this correlational survey research study was to explore factors that affect use and acceptance of ICT among laboratory science students through the theoretical lens of the unified theory of acceptance and use of technology (UTAUT) model. An electronically delivered survey drew upon current students and recent graduates (within 2 years) of accredited laboratory science training programs. During the 4-week data collection period, 168 responses were received. Results showed that the UTAUT model did not perform well within this study, explaining 25.2% of the variance in use behavior. A new model, significantly different from the original UTAUT and incorporating attitudes toward technology and computer anxiety as two of its top variables, was developed; it explained 37.0% of the variance in use behavior. The significance of this study is that it may inform the curriculum design of laboratory science training programs that want to incorporate more ICT-based educational delivery, providing more options for potential students who may not currently have access to this type of training.
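    The "variance explained" figures reported here are coefficients of determination (R squared) from regression modeling. As a generic reminder of how such a number is produced (simulated data and hypothetical predictor names; this is not the study's instrument or dataset):

    import numpy as np

    def r_squared(X, y):
        """R^2 of an ordinary least squares fit of y on X, with intercept."""
        X1 = np.column_stack([np.ones(len(X)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        residuals = y - X1 @ beta
        return 1 - residuals.var() / y.var()

    # Hypothetical predictors echoing the study's top variables.
    rng = np.random.default_rng(0)
    attitude = rng.normal(size=168)   # attitude toward technology
    anxiety = rng.normal(size=168)    # computer anxiety
    use = 0.5 * attitude - 0.3 * anxiety + rng.normal(size=168)
    print(f"R^2 = {r_squared(np.column_stack([attitude, anxiety]), use):.3f}")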

  13. Creating the brain and interacting with the brain: an integrated approach to understanding the brain.

    PubMed

    Morimoto, Jun; Kawato, Mitsuo

    2015-03-06

    In the past two decades, brain science and robotics have made gigantic advances in their own fields, and their interactions have generated several interdisciplinary research fields. First, in the 'understanding the brain by creating the brain' approach, computational neuroscience models have been applied to many robotics problems. Second, such brain-motivated fields as cognitive robotics and developmental robotics have emerged as interdisciplinary areas among robotics, neuroscience and cognitive science with special emphasis on humanoid robots. Third, in brain-machine interface research, a brain and a robot are mutually connected within a closed loop. In this paper, we review the theoretical backgrounds of these three interdisciplinary fields and their recent progress. Then, we introduce recent efforts to reintegrate these research fields into a coherent perspective and propose a new direction that integrates brain science and robotics where the decoding of information from the brain, robot control based on the decoded information and multimodal feedback to the brain from the robot are carried out in real time and in a closed loop. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  14. PREFACE: VI Scientific Technical Conference on "Low-temperature plasma during the deposition of functional coatings"

    NASA Astrophysics Data System (ADS)

    2014-11-01

    The VI Republican Scientific Technical Conference "Low-temperature plasma during the deposition of functional coatings" took place from 4 to 7 November 2014 at the Academy of Sciences of the Republic of Tatarstan and the Kazan Federal University. The conference was chaired by Nail Kashapov, Member of the Academy of Sciences of the Republic of Tatarstan, Professor, Doctor of Technical Sciences, and a member of the Scientific and Technical Council of the Ministry of Economy of the Republic of Tatarstan. At the conference, the participants discussed a wide range of issues affecting the theoretical and computational aspects of research problems in the physics and technology of low-temperature plasma. A series of works was devoted to the study of thin films obtained by low-temperature plasma. This year, work dedicated to the related field of heat and mass transfer in multiphase media and low-temperature plasma was also presented. Of special interest were reports on the exploration of gas discharges with liquid electrolytic electrodes and the study of dusty plasmas. Kashapov Nail, D.Sc., Professor (Kazan Federal University)

  15. Significant Advances in the AIRS Science Team Version-6 Retrieval Algorithm

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Blaisdell, John; Iredell, Lena; Molnar, Gyula

    2012-01-01

    AIRS/AMSU is the state-of-the-art infrared and microwave atmospheric sounding system flying aboard EOS Aqua. The Goddard DISC has analyzed AIRS/AMSU observations, covering the period September 2002 until the present, using the AIRS Science Team Version-5 retrieval algorithm. These products have been used by many researchers to make significant advances in both climate and weather applications. The AIRS Science Team Version-6 retrieval, which will become operational in mid-2012, contains many significant theoretical and practical improvements compared to Version-5 which should further enhance the utility of AIRS products for both climate and weather applications. In particular, major changes have been made with regard to the algorithms used to 1) derive surface skin temperature and surface spectral emissivity; 2) generate the initial state used to start the retrieval procedure; 3) compute Outgoing Longwave Radiation; and 4) determine Quality Control. This paper will describe these advances found in the AIRS Version-6 retrieval algorithm and demonstrate the improvement of AIRS Version-6 products compared to those obtained using Version-5.

  17. Coherent Rayleigh-Brillouin scattering for in situ detection of nanoparticles and large molecules in gas and plasma

    NASA Astrophysics Data System (ADS)

    Gerakis, A.; Shneider, M. N.; Stratton, B. C.; Santra, B.; Car, R.; Raitses, Y.

    2016-09-01

    Laser-based diagnostic methods, such as spontaneous and coherent Rayleigh-Brillouin scattering (SRBS and CRBS), can be used for in situ detection and characterization of nanoparticle shape and size, as well as their concentration, in an inert gas atmosphere. We recently developed and tested this advanced diagnostic at PPPL. It was shown that the intensity of the CRBS signal depends on the composition and density of the gas-nanoparticle mixture and on the polarizabilities of the mixture components. The measured results agree well with published theoretical predictions. In this work, we report the application of this diagnostic to monitor nucleation and growth of nanoparticles in a carbon arc discharge. In support of these measurements, time-dependent density functional theory was used to compute the frequency-dependent polarizabilities of various nanostructures in order to predict the corresponding Rayleigh scattering intensities as well as light depolarization. Preliminary measurements demonstrate that CRBS is capable of detecting nanoparticles in volume. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division.

  18. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
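    The flavor of such static analysis can be conveyed with a toy model in which a Scratch script is a nested list of blocks and an analysis "pass" walks the tree; this sketch is ours and does not reproduce Hairball's actual API:

    # Blocks are tuples; nested block lists represent C-shaped blocks.
    SCRIPT = [
        ("whenGreenFlag",),              # event "hat" block starting the script
        ("forever", [
            ("moveSteps", 10),
            ("ifOnEdgeBounce",),
        ]),
    ]

    HAT_BLOCKS = {"whenGreenFlag", "whenKeyPressed", "whenIReceive"}

    def starts_with_hat(script):
        """A common beginner-correctness check: is the script event-triggered?"""
        return bool(script) and script[0][0] in HAT_BLOCKS

    def count_blocks(script):
        """Recursively count blocks, descending into nested block lists."""
        total = 0
        for block in script:
            total += 1
            for arg in block[1:]:
                if isinstance(arg, list):
                    total += count_blocks(arg)
        return total

    print(starts_with_hat(SCRIPT))  # True
    print(count_blocks(SCRIPT))     # 4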

  19. A Game Theoretical Approach to Hacktivism: Is Attack Likelihood a Product of Risks and Payoffs?

    PubMed

    Bodford, Jessica E; Kwan, Virginia S Y

    2018-02-01

    The current study examines hacktivism (i.e., hacking to convey a moral, ethical, or social justice message) through a general game-theoretic framework, that is, as a product of costs and benefits. Given the inherent risk of carrying out a hacktivist attack (e.g., legal action, imprisonment), it would be rational for the user to weigh these risks against the perceived benefits of carrying out the attack. As such, we examined computer science students' estimations of risks, payoffs, and attack likelihood through a game-theoretic design. Furthermore, this study aims at constructing a descriptive profile of potential hacktivists, exploring two predicted covariates of attack decision making, namely, peer prevalence of hacking and sex differences. Contrary to expectations, results suggest that participants' estimations of attack likelihood stemmed solely from expected payoffs, rather than subjective risks. Peer prevalence significantly predicted increased payoffs and attack likelihood, suggesting an underlying descriptive norm in social networks. Notably, we observed no sex differences in the decision to attack, nor in the factors predicting attack likelihood. Implications for policymakers and for the understanding and prevention of hacktivism are discussed, as are the possible ramifications of widely communicated payoffs over potential risks in hacking communities.
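    The cost-benefit framing can be written down directly. In a generic rational-actor model (our sketch, with hypothetical numbers rather than the study's materials), an attack is worthwhile when expected benefit exceeds expected cost; the study's finding that likelihood tracked payoffs alone corresponds to respondents effectively dropping the cost term:

    def attack_utility(p_success, payoff, p_caught, penalty):
        """Expected utility of an attack: expected benefit minus expected cost."""
        return p_success * payoff - p_caught * penalty

    # Full cost-benefit model: utility is negative, so a rational actor abstains.
    print(attack_utility(p_success=0.6, payoff=10.0, p_caught=0.2, penalty=50.0))
    # Risk-blind weighting, as participants appeared to use (cost term dropped):
    print(attack_utility(p_success=0.6, payoff=10.0, p_caught=0.2, penalty=0.0))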

  20. Close contacts at the interface: Experimental-computational synergies for solving complexity problems

    NASA Astrophysics Data System (ADS)

    Torras, Juan; Zanuy, David; Bertran, Oscar; Alemán, Carlos; Puiggalí, Jordi; Turón, Pau; Revilla-López, Guillem

    2018-02-01

    The study of materials science has long been devoted to the disentanglement of bulk structures, which mainly entails finding the inner structure of materials. That structure accounts for a major portion of a material's properties. Yet as our knowledge of these "backbones" grew, so did interest in the properties of materials' boundaries, that is, the properties at the frontier with the surrounding environment, called the interface. The interface is thus to be understood as the sum of the material's surface plus the surrounding environment, be it in solid, liquid or gas phase. The study of phenomena at this interface requires both experimental and theoretical techniques and, above all, a wise combination of them in order to shed light on the most intimate details at the atomic, molecular and mesostructure levels. Here, we report several cases as proof of concept of the results achieved when studying interface phenomena by combining a myriad of experimental and theoretical tools to overcome the usual limitations regarding atomic detail, size and time scales, and systems of complex composition. Real-world examples of this combined experimental-theoretical work, together with new software tools, are offered to the reader.

  1. A Science Cloud: OneSpaceNet

    NASA Astrophysics Data System (ADS)

    Morikawa, Y.; Murata, K. T.; Watari, S.; Kato, H.; Yamamoto, K.; Inoue, S.; Tsubouchi, K.; Fukazawa, K.; Kimura, E.; Tatebe, O.; Shimojo, S.

    2010-12-01

    The main methodologies of Solar-Terrestrial Physics (STP) to date have been theoretical, experimental/observational, and computer simulation approaches. Recently, "informatics" has been anticipated as a new (fourth) approach to STP studies. Informatics is a methodology for analyzing large-scale data (observation data and computer simulation data) to obtain new findings using a variety of data processing techniques. At NICT (National Institute of Information and Communications Technology, Japan) we are now developing a new research environment named "OneSpaceNet". The OneSpaceNet is a cloud-computing environment specialized for scientific work, which connects many researchers through a high-speed network (JGN: Japan Gigabit Network). The JGN is a wide-area backbone network operated by NICT; it provides a 10G network and many access points (APs) across Japan. The OneSpaceNet also provides rich computing resources for research, such as supercomputers, large-scale data storage, licensed applications, visualization devices (like a tiled display wall: TDW), databases/DBMSs, cluster computers (4-8 nodes) for data processing, and communication devices. What is remarkable about the science cloud is that a user needs only to prepare a terminal (a low-cost PC). Once the PC is connected to JGN2plus, the user can make full use of the rich resources of the science cloud. Using communication devices such as a video-conference system, streaming and reflector servers, and media players, users on the OneSpaceNet can communicate as if they belonged to the same (one) laboratory: they are members of a virtual laboratory. The specifications of the computer resources on the OneSpaceNet are as follows. The data storage we have developed so far totals almost 1 PB. The number of data files managed on the cloud storage keeps growing and is now more than 40,000,000. Notably, the disks forming the large-scale storage are distributed across 5 data centers over Japan, yet the storage system performs as one disk. There are three supercomputers allocated on the cloud: one in Tokyo, one in Osaka and one in Nagoya. Simulation job data from any of the supercomputers are saved to the cloud data storage (in the same directory); it is a kind of virtual computing environment. The tiled display wall has 36 panels acting as one display, with a resolution as large as 18000x4300 pixels. This size is enough to preview or analyze large-scale computer simulation data; it also allows many researchers together to view multiple images (e.g., 100 pictures) on one screen. In our talk we also present a brief report of initial results of using the OneSpaceNet for global MHD simulations as an example of successful use of our science cloud: (i) ultra-high time-resolution visualization of global MHD simulations using the large-scale storage and parallel processing system on the cloud; (ii) a database of real-time global MHD simulation and statistical analyses of the data; and (iii) a 3D Web service for global MHD simulations.

  2. Programmers, professors, and parasites: credit and co-authorship in computer science.

    PubMed

    Solomon, Justin

    2009-12-01

    This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.

  3. Conceptualizing, Designing, and Investigating Locative Media Use in Urban Space

    NASA Astrophysics Data System (ADS)

    Diamantaki, Katerina; Rizopoulos, Charalampos; Charitos, Dimitris; Kaimakamis, Nikos

    This chapter investigates the social implications of locative media (LM) use and attempts to outline a theoretical framework that may support the design and implementation of location-based applications. Furthermore, it stresses the significance of physical space and location awareness as important factors that influence both human-computer interaction and computer-mediated communication. The chapter documents part of the theoretical aspect of the research undertaken as part of LOcation-based Communication Urban NETwork (LOCUNET), a project that aims to investigate the way users interact with one another (human-computer-human interaction aspect) and with the location-based system itself (human-computer interaction aspect). A number of relevant theoretical approaches are discussed in an attempt to provide a holistic theoretical background for LM use. Additionally, the actual implementation of the LOCUNET system is described and some of the findings are discussed.

  4. Increasing Diversity in Computer Science: Acknowledging, yet Moving Beyond, Gender

    NASA Astrophysics Data System (ADS)

    Larsen, Elizabeth A.; Stubbs, Margaret L.

    Lack of diversity within the computer science field has, thus far, been examined most fully through the lens of gender. This article is based on a follow-on to Margolis and Fisher's (2002) study and includes interviews with 33 Carnegie Mellon University students from the undergraduate senior class of 2002 in the School of Computer Science. We found evidence of similarities among the perceptions of these women and men on definitions of computer science, explanations for the notoriously low proportion of women in the field, characterizations of a typical computer science student, impressions of recent curricular changes, a sense of the atmosphere/culture in the program, views of the Women@SCS campus organization, and suggestions for attracting and retaining well-rounded students in computer science. We conclude that efforts to increase diversity in the computer science field will benefit from a more broad-based approach that considers, but is not limited to, notions of gender difference.

  5. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  6. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    ERIC Educational Resources Information Center

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  7. Computer Science and the Liberal Arts

    ERIC Educational Resources Information Center

    Shannon, Christine

    2010-01-01

    Computer science and the liberal arts have much to offer each other. Yet liberal arts colleges, in particular, have been slow to recognize the opportunity that the study of computer science provides for achieving the goals of a liberal education. After the precipitous drop in computer science enrollments during the first decade of this century,…

  8. Marrying Content and Process in Computer Science Education

    ERIC Educational Resources Information Center

    Zendler, A.; Spannagel, C.; Klaudt, D.

    2011-01-01

    Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…

  9. Computing Whether She Belongs: Stereotypes Undermine Girls' Interest and Sense of Belonging in Computer Science

    ERIC Educational Resources Information Center

    Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.

    2016-01-01

    Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…

  10. Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University

    ERIC Educational Resources Information Center

    Plane, Jandelyn

    2010-01-01

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…

  11. Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.

    ERIC Educational Resources Information Center

    Turner, Judith Axler

    1987-01-01

    Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)

  12. African-American males in computer science---Examining the pipeline for clogs

    NASA Astrophysics Data System (ADS)

    Stone, Daryl Bryant

    The literature on African-American males (AAM) begins with a statement to the effect that "Today young Black men are more likely to be killed or sent to prison than to graduate from college." Why are the numbers of African-American male college graduates decreasing? Why are those enrolled in college not majoring in the science, technology, engineering, and mathematics (STEM) disciplines? This research explored why African-American males are not filling the well-recognized industry need for computer scientists/technologists by choosing college tracks to these careers. The literature on the STEM disciplines focuses largely on women in STEM, as opposed to minorities, and within minorities there is a noticeable research gap in addressing the needs and opportunities available to African-American males. The primary goal of this study was therefore to examine the computer science "pipeline" from the African-American male perspective. The method included distributing a "Computer Science Degree Self-Efficacy Scale" to five groups of African-American male students: (1) fourth graders, (2) eighth graders, (3) eleventh graders, (4) underclass undergraduate computer science majors, and (5) upperclass undergraduate computer science majors. In addition to the 30-question self-efficacy test, subjects from each group were asked to participate in a group discussion about "African-American males in computer science." The audio record of each group meeting provides qualitative data for the study. The hypotheses include the following: (1) There is no significant difference in "Computer Science Degree" self-efficacy between fourth and eighth graders. (2) There is no significant difference in "Computer Science Degree" self-efficacy between eighth and eleventh graders. (3) There is no significant difference in "Computer Science Degree" self-efficacy between eleventh graders and lower-level computer science majors. (4) There is no significant difference in "Computer Science Degree" self-efficacy between lower-level computer science majors and upper-level computer science majors. (5) There is no significant difference in "Computer Science Degree" self-efficacy between each of the five groups of students. Finally, the researcher selected African-American male students attending six schools, including the predominantly African-American elementary, middle and high school that the researcher attended during his own academic career. Additionally, a racially mixed elementary, middle and high school was selected from the same county in Maryland. Bowie State University provided both the underclass and upperclass computer science majors surveyed in this study. Of the five hypotheses, the sample provided enough evidence to support the claim that there are significant differences in "Computer Science Degree" self-efficacy between each of the five groups of students. ANOVA analysis by question and by total self-efficacy score provided more results of statistical significance. Additionally, factor analysis and review of the qualitative data provided further insight. Overall, the data suggest a 'clog' may exist at the middle school level, and students attending racially mixed schools were more confident in their computer, math and science skills. African-American males admit to spending much time on social networking websites and emailing, but are unaware of the skills and knowledge needed to study in the computing disciplines. The majority of the subjects knew few, if any, AAMs in the 'computing discipline pipeline'. The collegiate African-American males in this study agree that computer programming is a difficult area and serves as a 'major clog in the pipeline'.

  13. Girls in computer science: A female only introduction class in high school

    NASA Astrophysics Data System (ADS)

    Drobnis, Ann W.

    This study examined the impact of an all-girls classroom environment in a high school introductory computer science class on students' attitudes towards computer science and their thoughts on future involvement with computer science. It was determined that an all-girls introductory class could affect declining female enrollment and female students' self-efficacy towards computer science. This research was conducted in a summer school program through a regional magnet school for science and technology which these students attend during the school year. Three different groupings of students were examined for the research: female students in an all-girls class, female students in mixed-gender classes, and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to obtain an understanding of the students' thoughts, preconceptions, attitudes, knowledge of computer science, and future intentions around computer science, both in education and career. Students in all three groups were administered the ACCS prior to taking the class and upon completion of the class. In addition, students in the all-girls class wrote in a journal throughout the course, and some of those students were also interviewed upon completion of the course. The data were analyzed using quantitative and qualitative techniques. While no major differences were found in the quantitative data, it was determined that girls in the all-girls class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.

  14. Comparison of real and computer-simulated outcomes of LASIK refractive surgery

    NASA Astrophysics Data System (ADS)

    Cano, Daniel; Barbero, Sergio; Marcos, Susana

    2004-06-01

    Computer simulations of alternative LASIK ablation patterns were performed for corneal elevation maps of 13 real myopic corneas (range of myopia, -2.0 to -11.5 D). The simulated ablation patterns were designed analytically (standard Munnerlyn pattern, parabolic pattern, and biconic pattern) or with aberrometry measurements (customized pattern). Simulated results were compared with real postoperative outcomes. Standard LASIK refractive surgery for myopia increased corneal asphericity and spherical aberration. Computations with the theoretical Munnerlyn ablation pattern did not increase corneal asphericity or spherical aberration. The theoretical parabolic pattern induced a slight increase of asphericity and spherical aberration, explaining only 40% of the clinically found increase. The theoretical biconic pattern controlled corneal spherical aberration. Computations showed that the theoretical customized pattern can correct high-order asymmetric aberrations. Simulations of changes in efficiency due to reflection and non-normal incidence of the laser light showed a further increase in corneal asphericity. Consideration of these effects with a parabolic pattern accounts for 70% of the clinical increase in asphericity.
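
    The Munnerlyn pattern referenced above has a standard closed form: the lens subtraction between pre- and postoperative corneal spheres. A minimal sketch, assuming illustrative values for the preoperative radius, correction, and optical zone (none taken from the study):

```python
# Hedged sketch of a Munnerlyn-style myopic ablation profile (lens subtraction)
# and its parabolic approximation. R1, D, and S are illustrative values;
# n = 1.376 is the standard keratometric refractive index.
import numpy as np

n = 1.376          # corneal refractive index
R1 = 7.8e-3        # preoperative apical radius of curvature (m)
D = -5.0           # intended correction (diopters, negative for myopia)
S = 6.0e-3         # optical zone diameter (m)

# Postoperative radius: corneal power (n - 1)/R is reduced by |D|.
R2 = (n - 1.0) / ((n - 1.0) / R1 + D)

y = np.linspace(0.0, S / 2.0, 50)   # radial coordinate across the optical zone

# Exact lens-subtraction depth, zero at the optical-zone edge.
exact = (np.sqrt(R1**2 - y**2) - np.sqrt(R2**2 - y**2)
         + np.sqrt(R2**2 - (S / 2)**2) - np.sqrt(R1**2 - (S / 2)**2))

# Parabolic approximation: central depth ~ S^2 |D| / (8 (n - 1)).
t0 = S**2 * abs(D) / (8.0 * (n - 1.0))
parabolic = t0 * (1.0 - (2.0 * y / S)**2)

print(f"central depth: exact {exact[0]*1e6:.1f} um, parabolic {t0*1e6:.1f} um")
```

    With S = 6 mm and D = -5 D this reproduces the familiar rule of thumb of roughly S²|D|/3 microns of central ablation depth.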

  15. Bringing computational science to the public.

    PubMed

    McDonagh, James L; Barker, Daniel; Alderson, Rosanna G

    2016-01-01

    The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands-on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open-source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback was predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest the methods were generally well received among the participants: "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.

  16. Computer Science and Telecommunications Board summary of activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumenthal, M.S.

    1992-03-27

    The Computer Science and Telecommunications Board (CSTB) considers technical and policy issues pertaining to computer science, telecommunications, and associated technologies. CSTB actively disseminates the results of its completed projects to those in a position to help implement their recommendations or otherwise use their insights. It provides a forum for the exchange of information on computer science, computing technology, and telecommunications. This report discusses the major accomplishments of CSTB.

  17. Hispanic women overcoming deterrents to computer science: A phenomenological study

    NASA Astrophysics Data System (ADS)

    Herling, Lourdes

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage of qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline, with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: computing disciplines specifically, rather than computing embedded within the STEM disciplines generally; what attracts women and minorities to computer science; and race/ethnicity and gender considered in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender, as previous research has suggested. This study therefore examined what attracted Hispanic women specifically to computer science, and determined whether being subject to multiple marginalizations---female and Hispanic---played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but to persist as well. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. The aptitudes participants commonly believed are needed for success in computer science are the twenty-first-century skills of problem solving, creativity, and critical thinking. While not all the participants had experience with computers or programming prior to attending college, experience played a role in the self-confidence of those who did.

  18. Shock waves and shock tubes; Proceedings of the Fifteenth International Symposium, Berkeley, CA, July 28-August 2, 1985

    NASA Technical Reports Server (NTRS)

    Bershader, D. (Editor); Hanson, R. (Editor)

    1986-01-01

    A detailed survey is presented of shock tube experiments, theoretical developments, and applications being carried out worldwide. The discussions explore shock tube physics and the related chemical, physical and biological science and technology. Extensive attention is devoted to shock wave phenomena in dusty gases and other multiphase and heterogeneous systems, including chemically reactive mixtures. Consideration is given to techniques for measuring, visualizing and theoretically modeling flowfield, shock wave and rarefaction wave characteristics. Numerical modeling is explored in terms of the application of computational fluid dynamics techniques to describing flowfields in shock tubes. Shock interactions and propagation in solids, fluids, gases, and mixed media are investigated, along with the behavior of shocks in condensed matter. Finally, chemical reactions that are initiated as the result of passage of a shock wave are discussed, together with methods of controlling the evolution of laminar separated flows at concave corners on advanced reentry vehicles.

  19. Shock waves and shock tubes; Proceedings of the Fifteenth International Symposium, Berkeley, CA, July 28-August 2, 1985

    NASA Astrophysics Data System (ADS)

    Bershader, D.; Hanson, R.

    A detailed survey is presented of shock tube experiments, theoretical developments, and applications being carried out worldwide. The discussions explore shock tube physics and the related chemical, physical and biological science and technology. Extensive attention is devoted to shock wave phenomena in dusty gases and other multiphase and heterogeneous systems, including chemically reactive mixtures. Consideration is given to techniques for measuring, visualizing and theoretically modeling flowfield, shock wave and rarefaction wave characteristics. Numerical modeling is explored in terms of the application of computational fluid dynamics techniques to describing flowfields in shock tubes. Shock interactions and propagation in solids, fluids, gases, and mixed media are investigated, along with the behavior of shocks in condensed matter. Finally, chemical reactions that are initiated as the result of passage of a shock wave are discussed, together with methods of controlling the evolution of laminar separated flows at concave corners on advanced reentry vehicles.

  20. Directional Hearing and Sound Source Localization in Fishes.

    PubMed

    Sisneros, Joseph A; Rogers, Peter H

    2016-01-01

    Evidence suggests that the capacity for sound source localization is common to mammals, birds, reptiles, and amphibians, but surprisingly it is not known whether fish locate sound sources in the same manner (e.g., combining binaural and monaural cues) or what computational strategies they use for successful source localization. Directional hearing and sound source localization in fishes continue to be important topics in neuroethology and in the hearing sciences, but the empirical and theoretical work on these topics has been contradictory and obscure for decades. This chapter reviews the previous behavioral work on directional hearing and sound source localization in fishes, including the most recent experiments on sound source localization by the plainfin midshipman fish (Porichthys notatus), which has proven to be an exceptional species for fish studies of sound localization. In addition, the theoretical models of directional hearing and sound source localization for fishes are reviewed, including a new model that uses a time-averaged intensity approach for source localization that has wide applicability with regard to source type, acoustic environment, and time waveform.
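
    One way to make the time-averaged intensity approach mentioned above concrete: for a propagating plane wave, the time average of pressure times particle velocity points along the direction of propagation. A minimal sketch on synthetic signals (the model in the chapter is considerably more general):

```python
# Hedged sketch of the time-averaged acoustic intensity idea: the bearing to a
# source lies along I = <p(t) v(t)>, the time average of pressure times
# particle velocity. Signals here are synthetic plane-wave stand-ins.
import numpy as np

fs, f, seconds = 48_000, 200.0, 0.5            # sample rate, tone, duration
t = np.arange(int(fs * seconds)) / fs
true_bearing = np.deg2rad(35.0)

p = np.cos(2 * np.pi * f * t)                  # pressure (arbitrary units)
# For a propagating plane wave, particle velocity is in phase with pressure
# and points along the propagation direction.
vx = np.cos(true_bearing) * p
vy = np.sin(true_bearing) * p

Ix, Iy = np.mean(p * vx), np.mean(p * vy)      # time-averaged intensity components
print(f"estimated bearing: {np.degrees(np.arctan2(Iy, Ix)):.1f} deg")
```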

  1. Seismic tomography; theory and practice

    USGS Publications Warehouse

    Iyer, H.M.; Hirahara, Kazuro

    1993-01-01

    Although highly theoretical and computer-orientated, seismic tomography has created spectacular images of anomalies within the Earth with dimensions of thousands of kilometers to a few tens of meters. These images have enabled Earth scientists working in diverse areas to attack fundamental problems relating to the deep dynamical processes within our planet. Additionally, this technique is being used extensively to study the Earth's hazardous regions such as earthquake fault zones and volcanoes, as well as features beneficial to man such as oil- or mineral-bearing structures. This book has been written by world experts and describes the theories, experimental and analytical procedures and results of applying seismic tomography from global to purely local scale. It represents the collective global perspective on the state of the art and focusses not only on the theoretical and practical aspects, but also on the uses for hydrocarbon, mineral and geothermal exploitation. Students and researchers in the Earth sciences, and research and exploration geophysicists, should find this a useful, practical reference book for all aspects of their work.

  2. PREFACE: 3rd Workshop on Theory, Modelling and Computational Methods for Semiconductors (TMCSIII)

    NASA Astrophysics Data System (ADS)

    Califano, Marco; Migliorato, Max; Probert, Matt

    2012-05-01

    These conference proceedings contain the written papers of the contributions presented at the 3rd International Conference on Theory, Modelling and Computational Methods for Semiconductor materials and nanostructures. The conference was held at the School of Electronic and Electrical Engineering, University of Leeds, Leeds, UK on 18-20 January 2012. The previous conferences in this series took place in 2010 at St William's College, York and in 2008 at the University of Manchester, UK. The development of high-speed computer architectures is finally allowing the routine use of accurate methods for calculating the structural, thermodynamic, vibrational, optical and electronic properties of semiconductors and their hetero- and nano-structures. The scope of this conference embraces modelling, theory and the use of sophisticated computational tools in semiconductor science and technology, where there is substantial potential for time-saving in R&D. Theoretical approaches represented in this meeting included: Density Functional Theory, Tight Binding, Semiempirical Pseudopotential Methods, Effective Mass Models, Empirical Potential Methods and Multiscale Approaches. Topics included, but were not limited to: Optical and Transport Properties of Quantum Nanostructures including Colloids and Nanotubes, Plasmonics, Magnetic Semiconductors, Graphene, Lasers, Photonic Structures, Photovoltaic and Electronic Devices. This workshop ran for three days, with the objective of bringing together UK and international leading experts in the theoretical modelling of Group IV, III-V and II-VI semiconductors, as well as students, postdocs and early-career researchers. The first day focused on providing an introduction and overview of this vast field, aimed particularly at students, with several lectures given by recognised experts in various theoretical approaches. The following two days showcased some of the best theoretical research carried out in the UK in this field, with several contributions also from representatives of renowned theoretical groups from many European countries (Spain, France, Ireland, Germany, Italy, Poland, Denmark, Sweden, Serbia, Greece, etc.), as well as Asia (India) and Africa (Algeria, Tunisia and South Africa). We would like to thank all participants for making this a very successful meeting and for their contribution to the conference programme and these proceedings. We would also like to acknowledge the financial support from the Institute of Physics (Computational Physics group and Semiconductor Physics group), and QuantumWise (distributors of Atomistix). The Editors Acknowledgments Conference Organising Committee: Marco Califano (University of Leeds) Max Migliorato (University of Manchester) Matt Probert (University of York) Programme Committee: Stewart Clark (University of Durham) Aldo Di Carlo (University of Rome 'Tor Vergata', Italy) Ben Hourahine (University of Strathclyde) Lev Kantorovich (King's College London) Risto Nieminen (Helsinki University of Technology, Finland) Eoin O'Reilly (Tyndall Institute Cork, Republic of Ireland) Mauro Pereira (Sheffield Hallam University) John Robertson (University of Cambridge) Mervin Roy (University of Leicester) Stanko Tomic (University of Salford) David Whittaker (University of Sheffield) The proceedings were edited and compiled by Marco Califano, Max Migliorato and Matt Probert.

  3. PREFACE: Euro-TMCS I: Theory, Modelling and Computational Methods for Semiconductors

    NASA Astrophysics Data System (ADS)

    Gómez-Campos, F. M.; Rodríguez-Bolívar, S.; Tomić, S.

    2015-05-01

    The present issue contains a selection of the best contributed works presented at the first Euro-TMCS conference (Theory, Modelling and Computational Methods for Semiconductors, European Session). The conference was held at the Faculty of Sciences, Universidad de Granada, Spain, on 28-30 January 2015. This conference is the first European edition of the TMCS conference series, which started in 2008 at the University of Manchester and had previously always been held in the United Kingdom. Four previous conferences have been held (Manchester 2008, York 2010, Leeds 2012 and Salford 2014). Euro-TMCS ran for three days; the first was devoted to invited tutorials, aimed particularly at students, on recent developments of theoretical methods. On this occasion the session focused on the presentation of widely used computational methods for the modelling of physical processes in semiconductor materials. Freely available simulation software (SIESTA, Quantum Espresso and Yambo) as well as commercial software (TiberCad and MedeA) was presented in the conference by members of the respective development teams, offering the audience an overview of their capabilities for research. The second part of the conference showcased prestigious invited and contributed oral presentations, alongside poster sessions, in which direct discussion with authors was promoted. The scope of this conference embraces modelling, theory and the use of sophisticated computational tools in semiconductor science and technology. Theoretical approaches represented in this meeting included: Density Functional Theory, Semi-empirical Electronic Structure Methods, Multi-scale Approaches, Modelling of PV devices, Electron Transport, and Graphene. Topics included, but were not limited to: Optical Properties of Quantum Nanostructures including Colloids and Nanotubes, Plasmonics, Magnetic Semiconductors, Photonic Structures, and Electronic Devices. The Editors Acknowledgments: We would like to thank all participants for making this a very successful meeting and for their contribution to the conference programme and these proceedings. We would also like to acknowledge the financial support from Universidad de Granada, the CECAM UK-Hartree Node, project TEC2013-47283-R of Ministerio de Economía y Competitividad, and the company Materials Design (distributors of MedeA Software). Conference Organising Committee: Francisco M. Gómez-Campos (Co-chair, Universidad de Granada) Salvador Rodríguez-Bolívar (Co-chair, Universidad de Granada) Stanko Tomić (Co-chair, University of Salford)

  4. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  5. Vague Concepts in the Educational Sciences: Implications for Researchers

    ERIC Educational Resources Information Center

    Blikstad-Balas, Marte

    2014-01-01

    This article argues that many key theoretical concepts and core areas of study in the educational sciences are couched in paradigmatically vague terms. The shared features of vague terms and two different readings of vagueness are discussed. "Practice", which is widely used both as a theoretical and an empirical term in the field of…

  6. How Do Business and Government Interact? Combining Perspectives from Economics, Political Science, Public Administration, and Practitioners

    ERIC Educational Resources Information Center

    O'Neill, Patrick B.; Harsell, Dana Michael

    2015-01-01

    The authors describe the theoretical preparation provided to students in advance of a limited-duration experiential learning experience in Washington DC in a Master's level course for students in Business or Public Administration. The students consider theoretical perspectives from economics, political science, and public administration with…

  7. Defining and Measuring Engagement and Learning in Science: Conceptual, Theoretical, Methodological, and Analytical Issues

    ERIC Educational Resources Information Center

    Azevedo, Roger

    2015-01-01

    Engagement is one of the most widely misused and overgeneralized constructs found in the educational, learning, instructional, and psychological sciences. The articles in this special issue represent a wide range of traditions and highlight several key conceptual, theoretical, methodological, and analytical issues related to defining and measuring…

  8. The Public Library User and the Charter Tourist: Two Travellers, One Analogy

    ERIC Educational Resources Information Center

    Eriksson, Catarina A. M.; Michnik, Katarina E.; Nordeborg, Yoshiko

    2013-01-01

    Introduction: A new theoretical model, relevant to library and information science, is implemented in this paper. The aim of this study is to contribute to the theoretical concepts of library and information science by introducing an ethnological model developed for investigating charter tourist styles thereby increasing our knowledge of users'…

  9. Protection-against-water-attack determined difference between strengths of backbone hydrogen bonds in kinesin’s neck zipper region

    NASA Astrophysics Data System (ADS)

    Qin, Jing-Yu; Geng, Yi-Zhao; Lü, Gang; Ji, Qing; Fang, Hai-Ping

    2018-02-01

    Abstract not available. Project supported by the National Natural Science Foundation of China (Grant No. 11605038) and the Open Project Program of State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, Chinese Academy of Sciences, China (Grant No. Y5KF211CJ1).

  10. Analysing Theoretical Frameworks of Moral Education through Lakatos's Philosophy of Science

    ERIC Educational Resources Information Center

    Han, Hyemin

    2014-01-01

    The structure of studies of moral education is basically interdisciplinary; it includes moral philosophy, psychology, and educational research. This article systematically analyses the structure of studies of moral education from the vantage point of the philosophy of science. Among the various theoretical frameworks in the field of philosophy of…

  11. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  12. Professional Development and Use of Digital Technologies by Science Teachers: a Review of Theoretical Frameworks

    NASA Astrophysics Data System (ADS)

    Fernandes, Geraldo W. Rocha; Rodrigues, António M.; Ferreira, Carlos Alberto

    2018-03-01

    This article aims to characterise research on science teachers' professional development programs that support the use of Information and Communication Technologies (ICTs), and the main trends in the theoretical frameworks (theoretical foundation, literature review or background) that underpin these studies. Through a systematic review of the literature, 76 articles were found and organised along two axes concerning the training of science teachers and the use of digital technologies, each with its own categories. The first axis (characterisation of articles) presents the key features of the selected articles (major subjects, training and actions for professional development, and major ICT tools and digital resources). The second axis (trends in theoretical frameworks) has three categories, covering theoretical frameworks that emphasise (a) digital technologies, (b) prospects of curricular renewal, and (c) cognitive processes. It also identified a group of articles whose theoretical frameworks contain multiple elements without deepening them, or that even lack a theoretical framework supporting the study. In this review, we found that many professional development programs for teachers still use inadequate strategies for bringing about change in teacher practices. New professional development proposals are emerging with the objective of minimising such difficulties, and this analysis could be a helpful tool for restructuring those proposals.

  13. Gender Differences in the Use of Computers, Programming, and Peer Interactions in Computer Science Classrooms

    ERIC Educational Resources Information Center

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-01-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…

  14. Opportunities for Computational Discovery in Basic Energy Sciences

    NASA Astrophysics Data System (ADS)

    Pederson, Mark

    2011-03-01

    An overview of the broad-ranging support of computational physics and computational science within the Department of Energy Office of Science will be provided. Computation as the third branch of physics is supported by all six offices (Advanced Scientific Computing, Basic Energy, Biological and Environmental, Fusion Energy, High-Energy Physics, and Nuclear Physics). Support focuses on hardware, software, and applications. Most opportunities within the fields of condensed-matter physics, chemical physics, and materials sciences are supported by the Office of Basic Energy Sciences (BES) or through partnerships between BES and the Office of Advanced Scientific Computing Research. Activities include radiation sciences, catalysis, combustion, materials in extreme environments, energy-storage materials, light harvesting and photovoltaics, solid-state lighting, and superconductivity. A summary of two recent reports by the computational materials and chemical communities on the role of computation during the next decade will be provided. In addition to materials and chemistry challenges specific to energy sciences, the issues identified include a focus on the role of the domain scientist in integrating, expanding and sustaining applications-oriented capabilities on evolving high-performance computing platforms, and on the role of computation in accelerating the development of innovative technologies.

  15. Research | Computational Science | NREL

    Science.gov Websites

    NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. NREL's computational science capabilities enable high-impact research.

  16. Comparing the Pre- and Posttest Scores in Relations to the Emporium and the Hands-on Instructional Approaches of Teaching Science in Prekindergarten

    NASA Astrophysics Data System (ADS)

    Headen, Patricia Ann

    This quantitative, quasi-experimental research investigated whether two instructional approaches, the Emporium Computer-Based approach (Group 2) versus the hands-on approach (Group 1), resulted in any difference in student achievement in science for four-year-old prekindergarten students at a private childcare facility in North Carolina. Three research questions hypothesized these relationships: (a) whether student achievement, assessed within Piaget's and Vygotsky's perspectives of child development, differed between Group 2 and Group 1; (b) whether the instructional approaches related to gender; and (c) whether the instructional approaches interrelated with ethnicity. Two-factor ANOVA and ANCOVA techniques were applied to a convenience sample of 126 four-year-old prekindergarten students. Pretest and posttest scores on the Assessment of Measurements for Pre-K (AMP-K) for each group of 63 students measured student achievement. t tests determined whether a significant difference in student achievement (the dependent variable) existed between the Emporium Computer-Based and hands-on instructional approaches (the independent variable). The posttest scores of Group 2 (p = 0.00) indicated a significant difference in student achievement. However, gender and ethnicity had no effect on student achievement: male (M = 36.14, SD = 19.61) versus female (M = 42.91, SD = 18.99), p = 0.49, and for ethnicity, F(1, 125) = 1.65, p = 0.20. These results suggested that further research on the Emporium Computer-Based instructional approach could improve students' intellectual abilities through more innovative practices.
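
    A minimal sketch of the kind of two-factor ANOVA/ANCOVA reported here, on simulated data with illustrative column names (not the study's dataset):

```python
# Hedged sketch: ANCOVA on posttest scores with pretest as covariate and
# instructional approach and gender as factors. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 126
df = pd.DataFrame({
    "group": rng.choice(["emporium", "hands_on"], n),
    "gender": rng.choice(["male", "female"], n),
    "pre": rng.normal(30, 10, n),
})
# Simulate a posttest advantage for the Emporium group, as the study reports.
df["post"] = df["pre"] + np.where(df["group"] == "emporium", 12, 4) + rng.normal(0, 8, n)

# ANCOVA: posttest ~ instructional approach + gender, adjusting for pretest.
model = smf.ols("post ~ C(group) + C(gender) + pre", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```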

  17. Hands-on Approach to Prepare Specialists in Climate Changes Modeling and Analysis Using an Information-Computational Web-GIS Portal "Climate"

    NASA Astrophysics Data System (ADS)

    Shulgina, T. M.; Gordova, Y. E.; Martynova, Y. V.

    2014-12-01

    Making education relevant to workplace tasks is a key problem of higher education in the environmental sciences. To answer this challenge, several new courses for students in the "Climatology" and "Meteorology" specialties were developed and implemented at Tomsk State University, combining theoretical knowledge from up-to-date environmental sciences with computational tasks. To organize the educational process we use the open-source course management system Moodle (www.moodle.org), which allows us to combine text and multimedia in the theoretical part of the courses. The hands-on approach is realized through innovative trainings performed within the information-computational web-GIS platform "Climate" (http://climate.scert.ru/). The platform has a set of tools and databases allowing a researcher to perform analyses of climate change over a selected territory. The tools are also used for student trainings, which contain practical tasks on climate modeling and climate change assessment and analysis. Laboratory exercises cover three topics: "Analysis of regional climate changes"; "Analysis of climate extreme indices on the regional scale"; and "Analysis of future climate". They are designed to consolidate students' knowledge of the discipline, to instill the skills to work independently with large amounts of geophysical data using the modern processing and analysis tools of the web-GIS platform "Climate", and to train students to present the results of laboratory work as reports with a statement of the problem, the results of calculations, and a logically justified conclusion. Students are thus engaged in the use of modern tools for geophysical data analysis, which strengthens their professional learning. The approach helps to fill this gap because it offers experience, increases student involvement, and advances the use of modern information and communication tools. Financial support for this research from the RFBR (13-05-12034, 14-05-00502), SB RAS project VIII.80.2.1 and a grant of the President of RF (№ 181) is acknowledged.
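
    As an illustration of the "climate extreme indices" topic, a minimal sketch computing one standard index, the annual count of frost days, from a synthetic daily minimum-temperature series (the actual trainings use the platform's datasets and tools):

```python
# Hedged sketch: the frost-days (FD) extreme index, the number of days per year
# with daily minimum temperature below 0 degrees C, on synthetic data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
days = pd.date_range("2000-01-01", "2009-12-31", freq="D")
seasonal = 10 * np.sin(2 * np.pi * (days.dayofyear - 100) / 365.25)
tmin = seasonal + rng.normal(-2, 4, len(days))     # synthetic daily Tmin (deg C)

series = pd.Series(tmin, index=days)
frost_days = (series < 0).groupby(series.index.year).sum()  # FD index per year
print(frost_days)
```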

  18. Detection and Characterisation of Meteors as a Big Data Citizen Science project

    NASA Astrophysics Data System (ADS)

    Gritsevich, M.

    2017-12-01

    Of the roughly 50,000 meteorites currently known to science, atmospheric passage was recorded instrumentally in only 30 cases with the potential to derive their atmospheric trajectories and pre-impact heliocentric orbits. Similarly, while observations of meteors add thousands of new entries per month to existing databases, it is extremely rare that they lead to meteorite recovery. Meteor studies thus represent an excellent example of a Big Data citizen science project, where progress in the field largely depends on the prompt identification and characterisation of meteor events as well as on extensive and valuable contributions by amateur observers. Over the last couple of decades, technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to broaden its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era, which requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of the micro-physics of meteor plasma and its interaction with the atmosphere. The recent increase in interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvement and promote an opportunity for collaboration in meteor science within the currently established EU COST BigSkyEarth network (http://bigskyearth.eu/).

  19. Virtual deposition plant

    NASA Astrophysics Data System (ADS)

    Tikhonravov, Alexander

    2005-09-01

    A general structure of the software for computational manufacturing experiments is discussed. It is shown that computational experiments can be useful for checking feasibility properties of theoretical designs and for finding the most practical theoretical design for a given production environment.

  20. The Health Needs of Young Women: Applying a Feminist Philosophical Lens to Nursing Science and Practice.

    PubMed

    Burton, Candace W

    2016-01-01

    Ongoing development of nursing science requires attention to the philosophical and theoretical bases upon which the science is built. A feminist theoretical perspective offers a useful lens for understanding the needs of both nurses and their clients. Adolescent and young adult women are an underserved and understudied population for whom nursing care can be especially beneficial. Considering the needs of this population from a philosophical perspective, through a feminist lens, is one effective means of developing nursing science approaches that contribute to and ultimately improve care for adolescent and young adult women.

  1. The Health Needs of Young Women: Applying a feminist philosophical lens to nursing science and practice

    PubMed Central

    Burton, Candace W.

    2016-01-01

    Ongoing development of nursing science requires attention to the philosophical and theoretical basis upon which the science is built. A feminist theoretical perspective offers a useful lens for understanding the needs of both nurses and their clients. Adolescent and young adult women are an underserved and understudied population for whom nursing care can be especially beneficial. Considering the needs of this population from a philosophical perspective, through a feminist lens, is one effective means of developing nursing science approaches that contribute to and ultimately improve care for adolescent and young adult women. PMID:27149225

  2. Introduction to bioinformatics.

    PubMed

    Can, Tolga

    2014-01-01

    Bioinformatics is an interdisciplinary field mainly involving molecular biology and genetics, computer science, mathematics, and statistics. Data-intensive, large-scale biological problems are addressed from a computational point of view. The most common problems are modeling biological processes at the molecular level and making inferences from collected data. A bioinformatics solution usually involves the following steps: collect statistics from biological data; build a computational model; solve a computational modeling problem; test and evaluate a computational algorithm. This chapter gives a brief introduction to bioinformatics by first providing an introduction to biological terminology and then discussing some classical bioinformatics problems organized by the types of data sources. Sequence analysis is the analysis of DNA and protein sequences for clues regarding function and includes subproblems such as identification of homologs, multiple sequence alignment, searching sequence patterns, and evolutionary analyses. Protein structures are three-dimensional data and the associated problems are structure prediction (secondary and tertiary), analysis of protein structures for clues regarding function, and structural alignment. Gene expression data is usually represented as matrices, and analysis of microarray data mostly involves statistical analysis, classification, and clustering approaches. Biological networks such as gene regulatory networks, metabolic pathways, and protein-protein interaction networks are usually modeled as graphs, and graph-theoretic approaches are used to solve associated problems such as construction and analysis of large-scale networks.
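
    As a concrete instance of the sequence-analysis problems listed above, a minimal sketch of global pairwise alignment scoring with the Needleman-Wunsch dynamic program (scoring parameters are illustrative):

```python
# Hedged sketch: Needleman-Wunsch global alignment score via dynamic
# programming. Match/mismatch/gap values are illustrative choices.
def nw_score(a: str, b: str, match=1, mismatch=-1, gap=-2) -> int:
    rows, cols = len(a) + 1, len(b) + 1
    F = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):          # aligning a prefix against nothing
        F[i][0] = i * gap
    for j in range(1, cols):
        F[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = F[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            F[i][j] = max(diag, F[i - 1][j] + gap, F[i][j - 1] + gap)
    return F[-1][-1]                  # optimal global alignment score

print(nw_score("GATTACA", "GCATGCU"))
```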

  3. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the 'matlab-dataone' library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software used to execute analyses and models, and derived data and products that arise from these computations. This provenance is vital to interpretation and understanding of science, and provides an audit trail that researchers can use to understand and replicate computational workflows in the geosciences.
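
    A conceptual sketch of what such provenance capture automates; this is not the recordr or matlab-dataone API, just the general pattern of recording inputs, the computation, and outputs as a trace alongside the derived product:

```python
# Conceptual sketch only -- NOT the recordr/matlab-dataone API. It illustrates
# the general idea those tools automate: recording input hashes, the activity,
# and outputs of an analysis step as a provenance trace next to the product.
import hashlib
import json
import time
from pathlib import Path

def run_with_provenance(func, inputs, output):
    """Run func(inputs, output) and write a provenance record next to the output."""
    start = time.time()
    func(inputs, output)
    record = {
        "activity": func.__name__,
        "used": {p: hashlib.sha256(Path(p).read_bytes()).hexdigest() for p in inputs},
        "generated": output,
        "started": start,
        "ended": time.time(),
    }
    Path(output + ".prov.json").write_text(json.dumps(record, indent=2))

def summarize(inputs, output):          # a stand-in analysis step
    text = "".join(Path(p).read_text() for p in inputs)
    Path(output).write_text(f"{len(text)} characters\n")

Path("data.csv").write_text("a,b\n1,2\n")
run_with_provenance(summarize, ["data.csv"], "summary.txt")
```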

  4. Accelerating Science with Generative Adversarial Networks: An Application to 3D Particle Showers in Multilayer Calorimeters

    NASA Astrophysics Data System (ADS)

    Paganini, Michela; de Oliveira, Luke; Nachman, Benjamin

    2018-01-01

    Physicists at the Large Hadron Collider (LHC) rely on detailed simulations of particle collisions to build expectations of what experimental data may look like under different theoretical modeling assumptions. Petabytes of simulated data are needed to develop analysis techniques, though they are expensive to generate using existing algorithms and computing resources. The modeling of detectors and the precise description of particle cascades as they interact with the material in the calorimeter are the most computationally demanding steps in the simulation pipeline. We therefore introduce a deep neural network-based generative model to enable fast, high-fidelity electromagnetic calorimeter simulation. There are still challenges for achieving precision across the entire phase space, but our current solution can reproduce a variety of particle shower properties while achieving speedup factors of up to 100,000×. This opens the door to a new era of fast simulation that could save significant computing time and disk space, while extending the reach of physics searches and precision measurements at the LHC and beyond.
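
    A minimal sketch of the adversarial training loop underlying such generative models, applied to toy one-dimensional "energy deposit" vectors rather than real calorimeter showers (architecture and data are illustrative only):

```python
# Hedged sketch: a generic GAN loop on toy peaked, positive deposit vectors.
# This is not the paper's architecture; it only shows the adversarial setup.
import torch
import torch.nn as nn

latent, dim = 16, 32
G = nn.Sequential(nn.Linear(latent, 64), nn.ReLU(), nn.Linear(64, dim), nn.Softplus())
D = nn.Sequential(nn.Linear(dim, 64), nn.LeakyReLU(0.2), nn.Linear(64, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
loss = nn.BCEWithLogitsLoss()

def real_batch(n):  # toy stand-in for simulated showers: peaked positive deposits
    profile = torch.exp(-0.5 * ((torch.arange(dim) - dim / 2) / 4.0) ** 2)
    return profile * torch.rand(n, 1) * 10

for step in range(200):
    x = real_batch(64)
    z = torch.randn(64, latent)
    # Discriminator: label real samples 1, generated samples 0.
    d_loss = loss(D(x), torch.ones(64, 1)) + loss(D(G(z).detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator: try to fool the discriminator.
    g_loss = loss(D(G(z)), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(1, latent)).detach().numpy().round(2))
```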

  5. AstrodyToolsWeb an e-Science project in Astrodynamics and Celestial Mechanics fields

    NASA Astrophysics Data System (ADS)

    López, R.; San-Juan, J. F.

    2013-05-01

    Astrodynamics Web Tools, AstrodyToolsWeb (http://tastrody.unirioja.es), is an ongoing collaborative computing infrastructure project which has been specially designed to support scientific computation. AstrodyToolsWeb provides project collaborators with all the technical and human facilities needed to wrap, manage, and use specialized noncommercial software tools in the Astrodynamics and Celestial Mechanics fields, with the aim of optimizing the use of resources, both human and material. However, this project is open to collaboration from the whole scientific community in order to create a library of useful tools and their corresponding theoretical backgrounds. AstrodyToolsWeb offers a user-friendly web interface to choose applications, introduce data, and select appropriate constraints in an intuitive way. After that, the application is executed in real time whenever possible; then the critical information about program behavior (errors and logs) and output, including the postprocessing and interpretation of its results (graphical representation of data, statistical analysis or other manipulations), are shown via the same web interface or can be downloaded to the user's computer.

  6. Multilayer modeling and analysis of human brain networks

    PubMed Central

    2017-01-01

    Understanding how the human brain is structured, and how its architecture is related to function, is of paramount importance for a variety of applications, including but not limited to new ways to prevent, deal with, and cure brain diseases, such as Alzheimer’s or Parkinson’s, and psychiatric disorders, such as schizophrenia. The recent advances in structural and functional neuroimaging, together with the increasing openness toward interdisciplinary approaches involving computer science, mathematics, and physics, are fostering interesting results in computational neuroscience, quite often based on the analysis of complex network representations of the human brain. In recent years, this representation has experienced a theoretical and computational revolution that is now reaching neuroscience, allowing us to cope with the increasing complexity of the human brain across multiple scales and in multiple dimensions and to model structural and functional connectivity from new perspectives, often combined with each other. In this work, we review the main achievements obtained from interdisciplinary research based on magnetic resonance imaging and establish, de facto, the birth of multilayer network analysis and modeling of the human brain. PMID:28327916
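
    A minimal sketch of one basic multilayer construct, the supra-adjacency matrix, coupling two toy layers (e.g., structural and functional connectivity over the same regions) with uniform interlayer links; the matrices here are random placeholders:

```python
# Hedged sketch: build a supra-adjacency matrix for a two-layer multilayer
# network. Layers are random toy adjacencies; omega couples node replicas.
import numpy as np

rng = np.random.default_rng(3)
N, omega = 8, 0.5                      # regions per layer, interlayer coupling

def toy_layer():
    a = rng.random((N, N)) < 0.3       # sparse random adjacency
    a = np.triu(a, 1)
    return (a | a.T).astype(float)     # undirected, no self-loops

structural, functional = toy_layer(), toy_layer()

# Block structure: layers on the diagonal, omega*I linking each node's replicas.
supra = np.block([
    [structural, omega * np.eye(N)],
    [omega * np.eye(N), functional],
])
print(supra.shape)                      # (2N, 2N)
```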

  7. Acoustic backscatter models of fish: Gradual or punctuated evolution

    NASA Astrophysics Data System (ADS)

    Horne, John K.

    2004-05-01

    Sound-scattering characteristics of aquatic organisms are routinely investigated using theoretical and numerical models. Development of the inverse approach by Van Holliday and colleagues in the 1970s catalyzed the development and validation of backscatter models for fish and zooplankton. As the understanding of biological scattering properties increased, so did the number and computational sophistication of backscatter models. The complexity of the data used to represent modeled organisms has evolved in parallel with model development. Simple geometric shapes representing body components or the whole organism have been replaced by anatomically accurate representations derived from imaging sensors such as computer-aided tomography (CAT) scans. In contrast, Medwin and Clay (1998) recommend that fish and zooplankton should be described by simple theories and models, without acoustically superfluous extensions. Since Van Holliday's early work, how have data and computational complexity influenced the accuracy and precision of model predictions? How has the understanding of aquatic organism scattering properties increased? Significant steps in the history of model development will be identified, and changes in model results will be characterized and compared. [Work supported by ONR and the Alaska Fisheries Science Center.]

  8. Assessing the Progress of Trapped-Ion Processors Towards Fault-Tolerant Quantum Computation

    NASA Astrophysics Data System (ADS)

    Bermudez, A.; Xu, X.; Nigmatullin, R.; O'Gorman, J.; Negnevitsky, V.; Schindler, P.; Monz, T.; Poschinger, U. G.; Hempel, C.; Home, J.; Schmidt-Kaler, F.; Biercuk, M.; Blatt, R.; Benjamin, S.; Müller, M.

    2017-10-01

    A quantitative assessment of the progress of small prototype quantum processors towards fault-tolerant quantum computation is a problem of current interest in experimental and theoretical quantum information science. We introduce a necessary and fair criterion for quantum error correction (QEC), which must be achieved in the development of these quantum processors before their sizes are sufficiently big to consider the well-known QEC threshold. We apply this criterion to benchmark the ongoing effort in implementing QEC with topological color codes using trapped-ion quantum processors and, more importantly, to guide the future hardware developments that will be required in order to demonstrate beneficial QEC with small topological quantum codes. In doing so, we present a thorough description of a realistic trapped-ion toolbox for QEC and a physically motivated error model that goes beyond standard simplifications in the QEC literature. We focus on laser-based quantum gates realized in two-species trapped-ion crystals in high-optical aperture segmented traps. Our large-scale numerical analysis shows that, with the foreseen technological improvements described here, this platform is a very promising candidate for fault-tolerant quantum computation.

  9. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  10. Girls Save the World through Computer Science

    ERIC Educational Resources Information Center

    Murakami, Christine

    2011-01-01

    It's no secret that fewer and fewer women are entering computer science fields. Attracting high school girls to computer science is only part of the solution. Retaining them while they are in higher education or the workforce is also a challenge. To solve this, there is a need to show girls that computer science is a wide-open field that offers…

  11. The Assessment of Taiwanese College Students' Conceptions of and Approaches to Learning Computer Science and Their Relationships

    ERIC Educational Resources Information Center

    Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung

    2015-01-01

    The aim of this study was to explore Taiwanese college students' conceptions of and approaches to learning computer science and then explore the relationships between the two. Two surveys, Conceptions of Learning Computer Science (COLCS) and Approaches to Learning Computer Science (ALCS), were administered to 421 college students majoring in…

  12. Hispanic Women Overcoming Deterrents to Computer Science: A Phenomenological Study

    ERIC Educational Resources Information Center

    Herling, Lourdes

    2011-01-01

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the…

  13. The Effects of Integrating Service Learning into Computer Science: An Inter-Institutional Longitudinal Study

    ERIC Educational Resources Information Center

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-01-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…

  14. Non-Determinism: An Abstract Concept in Computer Science Studies

    ERIC Educational Resources Information Center

    Armoni, Michal; Gal-Ezer, Judith

    2007-01-01

    Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…

  15. An Investigation of Primary School Science Teachers' Use of Computer Applications

    ERIC Educational Resources Information Center

    Ocak, Mehmet Akif; Akdemir, Omur

    2008-01-01

    This study investigated the level and frequency of science teachers' use of computer applications as an instructional tool in the classroom. The manner and frequency of science teachers' use of computer, their perceptions about integration of computer applications, and other factors contributed to changes in their computer literacy are…

  16. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    ERIC Educational Resources Information Center

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming representations of modeling methodology in computer science lessons. Computer modeling must be studied because current trends toward strengthening the general-educational and worldview functions of computer science define the necessity of additional research of the…

  17. Climate Modeling Computing Needs Assessment

    NASA Astrophysics Data System (ADS)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  18. Design and validation of a standards-based science teacher efficacy instrument

    NASA Astrophysics Data System (ADS)

    Kerr, Patricia Reda

    National standards for K--12 science education address all aspects of science education, with their main emphasis on curriculum---both science subject matter and the process involved in doing science. Standards for science teacher education programs have been developing along a parallel plane, as has self-efficacy research involving classroom teachers. Generally, studies about efficacy have been dichotomous---basing their theoretical underpinnings either on Rotter's locus-of-control theory or on Bandura's explanations of efficacy beliefs and outcome expectancy. This study brings all three threads together---K--12 science standards, teacher education standards, and efficacy beliefs---in an instrument designed to measure science teacher efficacy with items based on identified critical attributes of standards-based science teaching and learning. Based on Bandura's explanation of efficacy being task-specific and having outcome expectancy, a developmental, systematic progression from standards-based strategies and activities to tasks to critical attributes was used to craft items for a standards-based science teacher efficacy instrument. Demographic questions related to school characteristics, teacher characteristics, preservice background, science teaching experience, and post-certification professional development were included in the instrument. The instrument was completed by 102 middle-level science teachers, with complete data for 87 teachers. A principal components analysis of the science teachers' responses to the instrument resulted in two components: Standards-Based Science Teacher Efficacy: Beliefs About Teaching (BAT, reliability = .92) and Standards-Based Science Teacher Efficacy: Beliefs About Student Achievement (BASA, reliability = .82). Variables characteristic of professional development activities, science content preparation, and school environment were identified as members of the sets of variables predicting the BAT and BASA subscales. Correlations were computed for BAT, BASA, and demographic variables to identify relationships between teacher efficacy, teacher characteristics, and school characteristics. Further research is recommended to refine the instrument and apply it to a larger sample of science teachers. Its further development also has significance for the enhancement of science teacher education programs.
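
    The reported reliabilities are internal-consistency estimates; a minimal sketch of Cronbach's alpha on simulated item responses (not the instrument's data):

```python
# Hedged sketch: Cronbach's alpha for a subscale, computed on simulated
# item responses driven by a single latent trait.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(4)
latent = rng.normal(size=(102, 1))                  # one underlying trait
responses = latent + rng.normal(0, 0.7, (102, 10))  # 10 correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```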

  19. PREFACE: 2nd International Conference on Mathematical Modeling in Physical Sciences 2013 (IC-MSQUARE 2013)

    NASA Astrophysics Data System (ADS)

    2014-03-01

The second International Conference on Mathematical Modeling in Physical Sciences (IC-MSQUARE) took place in Prague, Czech Republic, from Sunday 1 September to Thursday 5 September 2013. The Conference was attended by more than 280 participants and hosted about 400 oral, poster, and virtual presentations, with more than 600 pre-registered authors. The second IC-MSQUARE comprised a diverse set of workshops and thus covered the various research fields where Mathematical Modeling is used, such as Theoretical/Mathematical Physics, Neutrino Physics, Non-Integrable Systems, Dynamical Systems, Computational Nanoscience, Biological Physics, Computational Biomechanics, Complex Networks, Stochastic Modeling, Fractional Statistics, DNA Dynamics, and Macroeconomics. The scientific program was full: after the keynote and invited talks each morning, three parallel sessions ran every day. Nevertheless, attendees reported that the program was excellent, with a high level of talks and a fruitful scientific environment. We would like to thank the Keynote Speaker and the Invited Speakers for their significant contributions to IC-MSQUARE. We also thank the Members of the International Advisory and Scientific Committees as well as the Members of the Organizing Committee. Further information on the editors, speakers, and committees is available in the attached PDF.

  20. Kenny Gruchalla | NREL

    Science.gov Websites

Research interests include feature extraction, human-computer interaction, and physics-based modeling. Ph.D., computer science, University of Colorado at Boulder; M.S., computer science, University of Colorado at Boulder; B.S., computer science, New Mexico Institute of Mining and Technology.

  1. How to integrate biological research into society and exclude errors in biomedical publications? Progress in theoretical and systems biology releases pressure on experimental research.

    PubMed

    Volkov, Vadim

    2014-01-01

This brief opinion proposes measures to increase efficiency and exclude errors in biomedical research under the existing dynamic situation. Rapid changes in biology began with the description of the three-dimensional structure of DNA 60 years ago; today biology progresses by interacting with computer science and nanoscience, together with the introduction of robotic stations for the acquisition of large-scale arrays of data. These changes have had an increasing influence on the entire research and scientific community. Future advances demand short-term measures to ensure error-proof and efficient development. These can include fast publication of negative results, publication of detailed methods papers, and removing the strict link between career progression and publication activity, especially for younger researchers. Further development of theoretical and systems biology, together with the use of multiple experimental methods for biological experiments, could also be helpful over the coming years and decades. With regard to the links between science and society, it is reasonable to compare the two systems, to identify and describe the features specific to biology, and to integrate biology into the existing streams of social life and financial flows. This will raise the level of scientific research and have mutually positive effects for both biology and society. Several examples are given for further discussion.

  2. Computer-aided design and computer science technology

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  3. Library and Information Professionals as Knowledge Engagement Specialists. Theories, Competencies and Current Educational Possibilities in Accredited Graduate Programmes

    ERIC Educational Resources Information Center

    Prado, Javier Calzada; Marzal, Miguel Angel

    2013-01-01

    Introduction: The role of library and information science professionals as knowledge facilitators is solidly grounded in the profession's theoretical foundations as much as connected with its social relevance. Knowledge science is presented in this paper as a convenient theoretical framework for this mission, and knowledge engagement…

  4. Josh Frieman elected to the American Academy of Arts and Sciences | News

    Science.gov Websites

April 20, 2016. Josh Frieman, director of the Dark Energy Survey and a member of the Fermilab Theoretical Astrophysics Group, was elected to the American Academy of Arts and Sciences.

  5. Sociocultural Influences on Science Education: Innovation for Contemporary Times

    ERIC Educational Resources Information Center

    Carter, Lyn

    2008-01-01

    This paper reviews the significant sociocultural literatures on science studies, cultural diversity, and sustainability science to develop theoretical perspectives for science education more suitable to the challenges of contemporaneity. While the influences of science studies and cultural diversity are not uncommon within the science education…

  6. The effects of integrating service learning into computer science: an inter-institutional longitudinal study

    NASA Astrophysics Data System (ADS)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-07-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.

  7. ICASE Computer Science Program

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  8. Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State

    ERIC Educational Resources Information Center

    Lewis, Colleen Marie

    2012-01-01

    To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…

  9. Game-Based Learning in Science Education: A Review of Relevant Research

    NASA Astrophysics Data System (ADS)

    Li, Ming-Chaun; Tsai, Chin-Chung

    2013-12-01

    The purpose of this study is to review empirical research articles regarding game-based science learning (GBSL) published from 2000 to 2011. Thirty-one articles were identified through the Web of Science and SCOPUS databases. A qualitative content analysis technique was adopted to analyze the research purposes and designs, game design and implementation, theoretical backgrounds and learning foci of these reviewed studies. The theories and models employed by these studies were classified into four theoretical foundations including cognitivism, constructivism, the socio-cultural perspective, and enactivism. The results indicate that cognitivism and constructivism were the major theoretical foundations employed by the GBSL researchers and that the socio-cultural perspective and enactivism are two emerging theoretical paradigms that have started to draw attention from GBSL researchers in recent years. The analysis of the learning foci showed that most of the digital games were utilized to promote scientific knowledge/concept learning, while less than one-third were implemented to facilitate the students' problem-solving skills. Only a few studies explored the GBSL outcomes from the aspects of scientific processes, affect, engagement, and socio-contextual learning. Suggestions are made to extend the current GBSL research to address the affective and socio-contextual aspects of science learning. The roles of digital games as tutor, tool, and tutee for science education are discussed, while the potentials of digital games to bridge science learning between real and virtual worlds, to promote collaborative problem-solving, to provide affective learning environments, and to facilitate science learning for younger students are also addressed.

  10. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the door to many opportunities that would not otherwise be possible.

  11. Once upon anion: a tale of photodetachment.

    PubMed

    Lineberger, W Carl

    2013-01-01

    This contribution is very much a personal history of a journey through the wonderful world of anion chemistry, and a tale of how advances in laser technologies, theoretical methods, and computational capabilities continuously enabled advances in our understanding. It is a story of the excitement and joy that come from the opportunity to add to the fabric of science, and to do so by working as a group of excited explorers with common goals. The participants in this journey include me, my students and postdoctoral associates, my collaborators, and our many generous colleagues. It all happened, in the words of the Beatles, "with a little help from my friends." Actually, it was so much more than a little help!

  12. Cellular Automata

    NASA Astrophysics Data System (ADS)

    Gutowitz, Howard

    1991-08-01

Cellular automata, dynamical systems in which space and time are discrete, are yielding interesting applications in both the physical and natural sciences. The thirty-four contributions in this book cover many aspects of contemporary studies on cellular automata and include reviews, research reports, and guides to recent literature and available software. Chapters cover mathematical analysis; the structure of the space of cellular automata; learning rules with specified properties; cellular automata in biology, physics, chemistry, and computation theory; and generalizations of cellular automata in neural nets, Boolean nets, and coupled map lattices. Current work on cellular automata may be viewed as revolving around two central and closely related problems: the forward problem and the inverse problem. The forward problem concerns the description of properties of given cellular automata. Properties considered include reversibility, invariants, criticality, fractal dimension, and computational power. The role of cellular automata in computation theory is seen as a particularly exciting venue for exploring parallel computers as theoretical and practical tools in mathematical physics. The inverse problem, an area of study gaining prominence particularly in the natural sciences, involves designing rules that possess specified properties or perform specified tasks. A long-term goal is to develop a set of techniques that can find a rule or set of rules able to reproduce quantitative observations of a physical system. Studies of the inverse problem take up the organization and structure of the set of automata, in particular the parameterization of the space of cellular automata. Optimization and learning techniques, such as the genetic algorithm and adaptive stochastic cellular automata, are applied to find cellular automaton rules that model such physical phenomena as crystal growth or perform such adaptive-learning tasks as balancing an inverted pole. Howard Gutowitz is Collaborateur in the Service de Physique du Solide et Résonance Magnétique, Commissariat à l'Energie Atomique, Saclay, France.
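    The forward problem described above can be made concrete in a few lines of code. The sketch below (an illustrative choice, not drawn from the book) evolves a one-dimensional elementary cellular automaton, where the bits of the rule number encode the output for each of the eight three-cell neighborhoods; rule 110 is a classic example with rich, computation-capable behavior.

```python
# Forward problem in miniature: evolve an elementary cellular automaton
# and print its space-time diagram.
import numpy as np

def step(cells: np.ndarray, rule: int) -> np.ndarray:
    """One synchronous update with periodic boundaries; the rule number's
    8 bits give the next state for each 3-cell neighborhood pattern."""
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    idx = 4 * left + 2 * cells + right      # neighborhood as a 3-bit index
    table = (rule >> np.arange(8)) & 1      # Wolfram rule lookup table
    return table[idx]

cells = np.zeros(64, dtype=int)
cells[32] = 1                               # single seed cell
for _ in range(16):
    print("".join(".#"[c] for c in cells))
    cells = step(cells, rule=110)
```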

  13. Extensible Computational Chemistry Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-09

ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework, enabling scientists to efficiently set up calculations and to store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of enabling researchers to effectively utilize complex computational chemistry codes and massively parallel high-performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world-class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository in which the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem-solving environment for all phases of computational chemistry research: setting up calculations with a sophisticated GUI and direct-manipulation visualization tools; submitting and monitoring calculations on remote high-performance supercomputers without having to be familiar with the details of using these compute resources; and performing results visualization and analysis, including creating publication-quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  14. A Cognitive Model for Problem Solving in Computer Science

    ERIC Educational Resources Information Center

    Parham, Jennifer R.

    2009-01-01

    According to industry representatives, computer science education needs to emphasize the processes involved in solving computing problems rather than their solutions. Most of the current assessment tools used by universities and computer science departments analyze student answers to problems rather than investigating the processes involved in…

  15. Approaches to Classroom-Based Computational Science.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…

  16. Defining Computational Thinking for Mathematics and Science Classrooms

    ERIC Educational Resources Information Center

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-01-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new…

  17. Neuroengineering control and regulation of behavior

    NASA Astrophysics Data System (ADS)

    Wróbel, A.; Radzewicz, C.; Mankiewicz, L.; Hottowy, P.; Knapska, E.; Konopka, W.; Kublik, E.; Radwańska, K.; Waleszczyk, W. J.; Wójcik, D. K.

    2014-11-01

To monitor neuronal circuits involved in emotional modulation of sensory processing, we proposed a plan to establish novel research techniques combining recent biological, technical, and analytical discoveries. The project was funded by the National Science Center, and we have started to build a new experimental model for studying selected circuits of genetically marked and behaviorally activated neurons. To achieve this goal we will combine the pioneering, interdisciplinary expertise of four Polish institutions: (i) the Nencki Institute of Experimental Biology (Polish Academy of Sciences) will deliver expertise on genetically modified mice and rats, mapping of the neuronal circuits activated by behavior, monitoring of complex behaviors measured in the IntelliCage system, electrophysiological brain activity recordings by multielectrodes in behaving animals, and analysis and modeling of behavioral and electrophysiological data; (ii) the AGH University of Science and Technology (Faculty of Physics and Applied Computer Sciences) will use its experience in high-throughput electronics to build multichannel systems for recording the brain activity of behaving animals; (iii) the University of Warsaw (Faculty of Physics) and (iv) the Center for Theoretical Physics (Polish Academy of Sciences) will construct an optoelectronic device for remote control of the opto-animals produced in the Nencki Institute, based on their unique experience in laser sources, studies of light propagation and its interaction with condensed media, wireless medical robotic systems, fast-readout optoelectronics with control software, and micromechanics.

  18. NASA Center for Computational Sciences: History and Resources

    NASA Technical Reports Server (NTRS)

    2000-01-01

The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  19. Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period April 1, 1983 through September 30, 1983 is summarized.

  20. Computers in Science: Thinking Outside the Discipline.

    ERIC Educational Resources Information Center

    Hamilton, Todd M.

    2003-01-01

    Describes the Computers in Science course which integrates computer-related techniques into the science disciplines of chemistry, physics, biology, and Earth science. Uses a team teaching approach and teaches students how to solve chemistry problems with spreadsheets, identify minerals with X-rays, and chemical and force analysis. (Contains 14…

  1. 78 FR 64255 - Advisory Committee for Computer and Information Science and Engineering; Cancellation of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-28

    ... NATIONAL SCIENCE FOUNDATION Advisory Committee for Computer and Information Science and Engineering; Cancellation of Meeting SUMMARY: As a result of the impact of the recent government shutdown, the... Committee for Computer and Information Science and Engineering meeting. The public notice for this committee...

  2. Exemplary Science Teachers' Use of Technology

    ERIC Educational Resources Information Center

    Hakverdi-Can, Meral; Dana, Thomas M.

    2012-01-01

    The purpose of this study is to examine exemplary science teachers' level of computer use, their knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, how often they required their students to use those applications in or for their science class…

  3. Application of theoretical methods to increase succinate production in engineered strains.

    PubMed

    Valderrama-Gomez, M A; Kreitmayer, D; Wolf, S; Marin-Sanguino, A; Kremling, A

    2017-04-01

    Computational methods have enabled the discovery of non-intuitive strategies to enhance the production of a variety of target molecules. In the case of succinate production, reviews covering the topic have not yet analyzed the impact and future potential that such methods may have. In this work, we review the application of computational methods to the production of succinic acid. We found that while a total of 26 theoretical studies were published between 2002 and 2016, only 10 studies reported the successful experimental implementation of any kind of theoretical knowledge. None of the experimental studies reported an exact application of the computational predictions. However, the combination of computational analysis with complementary strategies, such as directed evolution and comparative genome analysis, serves as a proof of concept and demonstrates that successful metabolic engineering can be guided by rational computational methods.
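    The abstract does not name specific algorithms, but much of this literature builds on constraint-based models such as flux balance analysis (FBA), in which candidate knockouts are screened by re-solving a linear program. The sketch below is a hedged illustration of that computation; the three-reaction network, its bounds, and the reaction names are hypothetical, chosen only to show the shape of the method.

```python
# Toy flux balance analysis (FBA) of the kind many computational
# strain-design methods build on. Hypothetical network:
#   R1: (uptake) -> A,   R2: A -> B,   R3: B -> succinate (export)
import numpy as np
from scipy.optimize import linprog

S = np.array([[1.0, -1.0,  0.0],    # metabolite A balance
              [0.0,  1.0, -1.0]])   # metabolite B balance

c = [0.0, 0.0, -1.0]                # maximize v3 (linprog minimizes)
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 flux units

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("max succinate flux:", res.x[2])     # 10.0, limited by uptake

# A gene knockout is modeled by clamping the reaction's flux to zero;
# here removing R2 severs the only path and production drops to 0.
ko = linprog(c, A_eq=S, b_eq=np.zeros(2),
             bounds=[(0, 10), (0, 0), (0, None)])
print("succinate flux with R2 knocked out:", ko.x[2])
```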

  4. Analysis and logical modeling of biological signaling transduction networks

    NASA Astrophysics Data System (ADS)

    Sun, Zhongyao

The study of network theory and its applications spans a multitude of seemingly disparate fields of science and technology: computer science, biology, social science, linguistics, etc. It is the intrinsic similarities embedded in the entities and the ways they interact with one another that link these systems together. In this dissertation, I present three projects, from both the aspect of theoretical analysis and the aspect of application, which primarily focus on signal transduction networks in biology. In these projects, I assembled a network model through extensive perusal of the literature, performed model-based simulations and validation, analyzed network topology, and proposed a novel network measure. The application of network modeling to the system of stomatal opening in plants revealed a fundamental question about the process that had been left unanswered for decades. The novel measure of the redundancy of signal transduction networks with Boolean dynamics, obtained by calculating the maximum node-independent elementary signaling mode set, accurately predicts the effect of single-node knockouts in such signaling processes. The three projects as an organic whole advance the understanding of real systems as well as the behavior of such network models, giving me an opportunity to glimpse the dazzling facets of the immense world of network science.
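    As a toy illustration of the ideas above (not the dissertation's actual model), the sketch below updates a small Boolean signaling network synchronously and checks how single-node knockouts affect the output. Because the two kinase branches form node-independent paths from signal to response, neither single knockout silences the output; this is the kind of redundancy a node-independent signaling-mode count quantifies.

```python
# Toy Boolean signal transduction network with two redundant branches;
# node names and update rules are illustrative only.

def update(state, knockout=None):
    """One synchronous Boolean update step."""
    new = {
        "signal":   state["signal"],                        # external input, held fixed
        "kinaseA":  state["signal"],                        # branch 1
        "kinaseB":  state["signal"],                        # branch 2 (parallel)
        "response": state["kinaseA"] or state["kinaseB"],   # either branch suffices
    }
    if knockout is not None:
        new[knockout] = False                               # clamp knocked-out node OFF
    return new

def steady_response(knockout=None, steps=10):
    state = {"signal": True, "kinaseA": False, "kinaseB": False, "response": False}
    for _ in range(steps):
        state = update(state, knockout)
    return state["response"]

for ko in (None, "kinaseA", "kinaseB"):
    print(f"knockout={ko}: response={steady_response(ko)}")
# Neither single-branch knockout blocks the response: the two branches are
# node-independent signaling paths, so the redundancy count here is 2.
```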

  5. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor";…

  6. Computational manufacturing as a bridge between design and production.

    PubMed

    Tikhonravov, Alexander V; Trubetskov, Michael K

    2005-11-10

    Computational manufacturing of optical coatings is a research area that can be placed between theoretical designing and practical manufacturing in the same way that computational physics can be placed between theoretical and experimental physics. Investigations in this area have been performed for more than 30 years under the name of computer simulation of manufacturing and monitoring processes. Our goal is to attract attention to the increasing importance of computational manufacturing at the current state of the art in the design and manufacture of optical coatings and to demonstrate possible applications of this research tool.
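    As background (standard thin-film theory, not a formula given in the abstract), both coating design and simulated manufacturing rest on the characteristic-matrix description of each layer:

```latex
M_j=\begin{pmatrix}\cos\delta_j & (i/\eta_j)\sin\delta_j\\ i\,\eta_j\sin\delta_j & \cos\delta_j\end{pmatrix},
\qquad
\delta_j=\frac{2\pi n_j d_j\cos\theta_j}{\lambda}
```

    Here n_j and d_j are the layer's refractive index and thickness and eta_j its optical admittance; the ordered product of the M_j determines the stack's spectral response. Computational manufacturing, in essence, propagates simulated deposition and monitoring errors through the d_j (and n_j) and re-evaluates this product, which is why it sits between design and production.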

  7. Computational manufacturing as a bridge between design and production

    NASA Astrophysics Data System (ADS)

    Tikhonravov, Alexander V.; Trubetskov, Michael K.

    2005-11-01

    Computational manufacturing of optical coatings is a research area that can be placed between theoretical designing and practical manufacturing in the same way that computational physics can be placed between theoretical and experimental physics. Investigations in this area have been performed for more than 30 years under the name of computer simulation of manufacturing and monitoring processes. Our goal is to attract attention to the increasing importance of computational manufacturing at the current state of the art in the design and manufacture of optical coatings and to demonstrate possible applications of this research tool.

  8. PREFACE: Proceedings of the First International Workshop on the Theoretical Calculation of ELNES and XANES (TEX2008) (Nagoya, Japan, 2-4 July 2008) Proceedings of the First International Workshop on the Theoretical Calculation of ELNES and XANES (TEX2008) (Nagoya, Japan, 2-4 July 2008)

    NASA Astrophysics Data System (ADS)

    Tanaka, Isao; Mizoguchi, Teruyasu; Yamamoto, Tomoyuki

    2009-03-01

Both electron energy loss near edge structure (ELNES) spectroscopy and x-ray absorption near edge structure (XANES) spectroscopy provide information on the local structural and chemical environments of selected elements of interest. Recent technological progress in scanning transmission electron microscopy has enabled ELNES measurements with atomic-column spatial resolution. Very dilute concentrations (nanograms per milliliter or ppb level) of dopants can be observed using third-generation synchrotron facilities when x-ray fluorescence is measured with highly efficient detectors. With such technical developments, ELNES and XANES have become established as essential tools in a large number of fields of natural science, including condensed matter physics, chemistry, mineralogy, and materials science. In addition to these developments in experimental methodology, notable progress in reproducing spectra using theoretical methods has recently been made. Using first-principles methods, one can analyze and interpret spectra without reference to experiment. This is quite important since we are often interested in the analysis of exotic materials or specific atoms located at lattice discontinuities such as surfaces and interfaces, where appropriate experimental data are difficult to obtain. Using the structures predicted by reliable first-principles calculations, one can calculate theoretical ELNES and XANES spectra without too much difficulty even in such cases. Despite the fact that ELNES and XANES probe the same phenomenon—essentially the electric dipole transition from a core orbital to an unoccupied band—there have not been many opportunities for researchers in the two areas to meet and discuss. Theoretical calculations of ELNES spectra have been mainly confined to the electron microscopy community. On the other hand, the theory of XANES has been developed principally by researchers in the x-ray community. Publications describing the methods have been written more-or-less independently by the two communities. The three-day workshop on the Theoretical Calculation of ELNES and XANES (TEX2008) was planned to help remedy this situation. It aimed to demonstrate the capability of state-of-the-art theoretical techniques to explain and predict ELNES and XANES spectra, and to allow deep discussion between scientists in the two communities. It also provided an excellent opportunity to introduce experimentalists to the computational techniques available. Invited talks and poster presentations by leading scientists were given on the first day, which was followed by tutorial sessions for five computer programs on the second and third days. Excellent lectures were given by Peter Blaha (Vienna, Austria) on the WIEN2k code, Chris J Pickard (St Andrews, UK) on the CASTEP code, John J Rehr (Seattle, USA) on the FEFF8 code, Frank de Groot (Utrecht, The Netherlands) on the CTM4XAS code, and Hidekazu Ikeno (Kyoto, Japan) on the first-principles CI-multiplet code. Thanks to the enthusiastic participation of more than 100 scientists from around the world, the workshop was a complete success. The aim of this special issue in Journal of Physics: Condensed Matter is to share with the readers the most up-to-date knowledge presented at the workshop. We believe this will prove useful as a reference for researchers in many different fields, as well as an overview of the current status and future directions of theoretical calculations for ELNES and XANES.
TEX2008 was a satellite meeting of the First International Symposium on Advanced Microscopy and Theoretical Calculations (AMTC1) (Nagoya, Japan, 29-30 June 2008), which was held in commemoration of the establishment of the Nanostructures Research Laboratory (NSRL) at the Japan Fine Ceramics Center (JFCC) and as a daughter event of EXPO 2005, Aichi, Japan. A Grant-in-Aid for Scientific Research on Priority Areas 'Nano Materials Science for Atomic-Scale Modification' from the Ministry of Education, Culture, Sports, Science and Technology (MEXT) and support from the Chubu Economic Federation for the workshop are gratefully acknowledged.
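    The shared physics the preface points to, the electric dipole transition from a core orbital to unoccupied states, can be written compactly (a standard textbook form, not taken from the proceedings):

```latex
\sigma(\omega)\;\propto\;\sum_{f}\bigl|\langle f\,|\,\hat{\boldsymbol{\varepsilon}}\cdot\mathbf{r}\,|\,i\rangle\bigr|^{2}\,
\delta\!\left(E_{f}-E_{i}-\hbar\omega\right)
```

    Here |i> is the core level, the |f> run over unoccupied states, and the unit vector epsilon is the photon polarization; for ELNES in the small-angle limit, the momentum-transfer direction plays the role of the polarization vector. The codes taught at the workshop all target this quantity in some form, differing mainly in how the final states and the core hole are treated.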

  9. An Overview of NASA's Intelligent Systems Program

    NASA Technical Reports Server (NTRS)

    Cooke, Daniel E.; Norvig, Peter (Technical Monitor)

    2001-01-01

NASA and the computer science research community are poised to enter a critical era, one in which, it seems, each needs the other. Market forces, driven by the immediate economic viability of computer science research results, place computer science in a relatively novel position. These forces affect how research is done and could, in the worst case, drive the field away from significant innovation, opting instead for incremental advances that bring greater stability in the marketplace. NASA, however, requires significant advances in computer science research in order to accomplish the exploration and science agenda it has set out for itself. NASA may indeed be poised to advance computer science research in this century much the way it advanced aero-based research in the last.

  10. A Review of Models for Teacher Preparation Programs for Precollege Computer Science Education.

    ERIC Educational Resources Information Center

    Deek, Fadi P.; Kimmel, Howard

    2002-01-01

    Discusses the need for adequate precollege computer science education and focuses on the issues of teacher preparation programs and requirements needed to teach high school computer science. Presents models of teacher preparation programs and compares state requirements with Association for Computing Machinery (ACM) recommendations. (Author/LRW)

  11. A DDC Bibliography on Computers in Information Sciences. Volume II. Information Sciences Series.

    ERIC Educational Resources Information Center

    Defense Documentation Center, Alexandria, VA.

    The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 239 annotated references grouped under three major headings: Artificial and Programming Languages, Computer Processing of Analog Data, and Computer Processing of Digital Data. The references…

  12. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  13. ASCR Workshop on Quantum Computing for Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aspuru-Guzik, Alan; Van Dam, Wim; Farhi, Edward

This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science, which was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE's science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE's mission, as well as the additional research required to bring quantum computing to the point where it can have such impact.

  14. Book Review: Book review

    NASA Astrophysics Data System (ADS)

    Mishchenko, Michael I.

    2017-01-01

The second (revised and enlarged) edition of this popular monograph is co-authored by Michael Kahnert and is published as Volume 145 of the Springer Series in Optical Sciences. As in the first edition, the main emphasis is on the mathematics of electromagnetic scattering and on numerically exact computer solutions of the frequency-domain macroscopic Maxwell equations for particles with complex shapes. The book is largely centered on Green-function solution of relevant boundary value problems and the T-matrix methodology, although other techniques (the method of lines, integral equation methods, and Lippmann-Schwinger equations) are also covered. The first four chapters serve as a thorough overview of key theoretical aspects of electromagnetic scattering intelligible to readers with undergraduate training in mathematics. A separate chapter provides an instructive analysis of the Rayleigh hypothesis which is still viewed by many as a highly controversial aspect of electromagnetic scattering by nonspherical objects. Another dedicated chapter introduces basic quantities serving as optical observables in practical applications. A welcome extension of the first edition is the new chapter on group theoretical aspects of electromagnetic scattering by particles with discrete symmetries. An essential part of the book is the penultimate chapter describing in detail popular public-domain computer programs mieschka and Tsym which can be applied to a wide range of particle shapes. The final chapter provides a general overview of available literature on electromagnetic scattering by particles and gives useful reading advice.
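    For readers new to the book's central formalism, the core T-matrix relation can be stated in one line (standard notation, not quoted from the review): expanding the incident and scattered fields in vector spherical wave functions with coefficients a_n and p_n respectively, linearity of the Maxwell equations gives

```latex
p_{n} \;=\; \sum_{n'} T_{nn'}\, a_{n'}
```

    The matrix T depends only on the particle (its shape, size parameter, refractive index, and orientation), not on the incident field, which is what makes the method efficient for orientation averaging and repeated-illumination problems.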

  15. Advances in theory and their application within the field of zeolite chemistry.

    PubMed

    Van Speybroeck, Veronique; Hemelsoet, Karen; Joos, Lennart; Waroquier, Michel; Bell, Robert G; Catlow, C Richard A

    2015-10-21

    Zeolites are versatile and fascinating materials which are vital for a wide range of industries, due to their unique structural and chemical properties, which are the basis of applications in gas separation, ion exchange and catalysis. Given their economic impact, there is a powerful incentive for smart design of new materials with enhanced functionalities to obtain the best material for a given application. Over the last decades, theoretical modeling has matured to a level that model guided design has become within reach. Major hurdles have been overcome to reach this point and almost all contemporary methods in computational materials chemistry are actively used in the field of modeling zeolite chemistry and applications. Integration of complementary modeling approaches is necessary to obtain reliable predictions and rationalizations from theory. A close synergy between experimentalists and theoreticians has led to a deep understanding of the complexity of the system at hand, but also allowed the identification of shortcomings in current theoretical approaches. Inspired by the importance of zeolite characterization which can now be performed at the single atom and single molecule level from experiment, computational spectroscopy has grown in importance in the last decade. In this review most of the currently available modeling tools are introduced and illustrated on the most challenging problems in zeolite science. Directions for future model developments will be given.

  16. A Case Study of the Introduction of Computer Science in NZ Schools

    ERIC Educational Resources Information Center

    Bell, Tim; Andreae, Peter; Robins, Anthony

    2014-01-01

    For many years computing in New Zealand schools was focused on teaching students how to use computers, and there was little opportunity for students to learn about programming and computer science as formal subjects. In this article we review a series of initiatives that occurred from 2007 to 2009 that led to programming and computer science being…

  17. Making the Implicit Explicit: The Grammar of Inferential Reasoning in the Humanities and Social Sciences

    ERIC Educational Resources Information Center

    Luckett, Kathy

    2016-01-01

    This is a theoretical paper that addresses the challenge of educational access to the Humanities and Social Sciences. It plots a theoretical quest to develop an explicit pedagogy to give "disadvantaged" students in the Humanities ways of working successfully with texts. In doing so it draws on Bernstein, Moore and Maton's work to…

  18. Investigation of Relationship between Theoretical Practice Course Success and Attendance

    ERIC Educational Resources Information Center

    Dalkiran, Oguzhan

    2018-01-01

The aim of the study is to determine the relationship between attendance in theoretical and applied field courses and the success of students attending a Faculty of Sports Sciences. The study data consisted of 68 female and 88 male students in the Faculty of Sports Sciences, two lecture and two practical courses, and 624 grade points and…

  19. Research in Applied Mathematics, Fluid Mechanics and Computer Science

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.

  20. [Research activities in applied mathematics, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.

  1. Activities of the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1985 through October 2, 1985 is summarized.

  2. Theoretical studies on sRNA-mediated regulation in bacteria

    NASA Astrophysics Data System (ADS)

    Chang, Xiao-Xue; Xu, Liu-Fang; Shi, Hua-Lin

    2015-12-01

Small RNA (sRNA)-mediated post-transcriptional regulation differs from protein-mediated regulation. Through base-pairing, an sRNA can regulate its target mRNA in a catalytic or stoichiometric manner. Theoretical models have been built to compare the protein-mediated and sRNA-mediated modes in terms of steady-state behavior and noise properties. Many experiments have demonstrated that a single sRNA can regulate several mRNAs, which causes crosstalk between the targets. Here, we focus on models in which two target mRNAs are silenced by the same sRNA and discuss their crosstalk features. Additionally, the sequence-function relationship of sRNA and its role in the kinetic process of base-pairing have been highlighted in model building. Project supported by the National Basic Research Program of China (Grant No. 2013CB834100), the National Natural Science Foundation of China (Grant Nos. 11121403 and 11274320), the Open Project Program of the State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, Chinese Academy of Sciences, China (Grant No. Y4KF171CJ1), the National Natural Science Foundation for Young Scholars of China (Grant No. 11304115), and the China Postdoctoral Science Foundation (Grant No. 2013M541282).
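    A minimal model of the kind discussed above (a standard form in this literature; the symbols here are generic rather than taken from a specific cited model): for one sRNA s silencing two targets m_1 and m_2 through stoichiometric co-degradation,

```latex
\frac{dm_i}{dt}=\alpha_i-\beta_i m_i-k_i\,m_i s \quad (i=1,2),
\qquad
\frac{ds}{dt}=\alpha_s-\beta_s s-\sum_{i} k_i\,m_i s
```

    The shared loss terms couple the two mRNAs: raising the transcription rate alpha_1 consumes more sRNA and thereby de-represses m_2, which is the crosstalk at issue. A catalytic (recycled) sRNA is recovered by dropping the corresponding co-degradation term from the s equation.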

  3. A Quantitative Model for Assessing Visual Simulation Software Architecture

    DTIC Science & Technology

    2011-09-01

Committee excerpt: Rudy Darken, Professor of Computer Science (dissertation supervisor); Ted Lewis, Professor of Computer Science; Richard Riehle, Professor of Practice in Software Engineering; Arnold Buss, Research Associate Professor, MOVES; LtCol Jeff Boleng, PhD, Associate Professor of Computer Science, U.S. Air Force Academy. Cited works include a computer science title in the operating and programming systems series (New York, NY, USA: Elsevier Science Ltd.) and Henry, S., & Kafura, D. (1984), "The evaluation of software…"

  4. Interactive Synthesis of Code Level Security Rules

    DTIC Science & Technology

    2017-04-01

Interactive Synthesis of Code-Level Security Rules. A thesis presented by Leo St. Amour to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Computer Science, Northeastern University, Boston, Massachusetts, April 2017.

  5. Approaching gender parity: Women in computer science at Afghanistan's Kabul University

    NASA Astrophysics Data System (ADS)

    Plane, Jandelyn

This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University (KU) in Afghanistan. Previous studies have theorized reasons for the underrepresentation of women in computer science, and while many of these reasons are indeed present in Afghanistan, they appear to hinder advancement to degree to a lesser extent. Women comprise at least 36% of each graduating class from KU's Computer Science Department; however, in 2007 women were 25% of the university population. In the US, women comprise over 50% of university populations, while undergraduate computer science programs graduate on average 25% women. Representation of women in computer science in the US is thus 50% below the university rate, while at KU it is 50% above the university rate. This mixed-methods study of KU was conducted in three stages: setting up focus groups with women computer science students, distributing surveys to all students in the CS department, and conducting a series of 22 individual interviews with fourth-year CS students. The analysis of the data collected, and its comparison to the literature on university and departmental retention in science, technology, engineering, and mathematics (STEM) gender representation and on women's education in underdeveloped Islamic countries, illuminates KU's uncharacteristic representation of women in its Computer Science Department. The retention of women in STEM through the education pipeline has several characteristics in Afghanistan that differ from the countries most often studied in the available literature. Few Afghan students have computers in their homes, and few have training beyond secretarial applications before considering studying CS at university. University students in Afghanistan are selected based on placement exams, assigned to an area of study, and financially supported throughout their academic careers, resulting in a low attrition rate from the program. The gender and STEM literature identifies parental encouragement, stereotypes, and employment perceptions as influential characteristics. Afghan women in computer science received significant parental encouragement, even from parents with no computer background. They do not seem to be influenced by any negative "geek" stereotypes, but they do perceive limitations when considering employment after graduation.

  6. Science-Technology Coupling: The Case of Mathematical Logic and Computer Science.

    ERIC Educational Resources Information Center

    Wagner-Dobler, Roland

    1997-01-01

    In the history of science, there have often been periods of sudden rapprochements between pure science and technology-oriented branches of science. Mathematical logic as pure science and computer science as technology-oriented science have experienced such a rapprochement, which is studied in this article in a bibliometric manner. (Author)

  7. Cognitive computing and eScience in health and life science research: artificial intelligence and obesity intervention programs.

    PubMed

    Marshall, Thomas; Champagne-Langabeer, Tiffiany; Castelli, Darla; Hoelscher, Deanna

    2017-12-01

    To present research models based on artificial intelligence and discuss the concept of cognitive computing and eScience as disruptive factors in health and life science research methodologies. The paper identifies big data as a catalyst to innovation and the development of artificial intelligence, presents a framework for computer-supported human problem solving and describes a transformation of research support models. This framework includes traditional computer support; federated cognition using machine learning and cognitive agents to augment human intelligence; and a semi-autonomous/autonomous cognitive model, based on deep machine learning, which supports eScience. The paper provides a forward view of the impact of artificial intelligence on our human-computer support and research methods in health and life science research. By augmenting or amplifying human task performance with artificial intelligence, cognitive computing and eScience research models are discussed as novel and innovative systems for developing more effective adaptive obesity intervention programs.

  8. 78 FR 61870 - Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-04

    ... NATIONAL SCIENCE FOUNDATION Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting In accordance with Federal Advisory Committee Act (Pub. L. 92-463, as amended... Committee for Computer and Information Science and Engineering (1115). Date/Time: Oct 31, 2013: 12:30 p.m...

  9. The Six Core Theories of Modern Physics

    NASA Astrophysics Data System (ADS)

    Stevens, Charles F.

    1996-09-01

    Charles Stevens, a prominent neurobiologist who originally trained as a biophysicist (with George Uhlenbeck and Mark Kac), wrote this book almost by accident. Each summer he found himself reviewing key areas of physics that he had once known and understood well, for use in his present biological research. Since there was no book, he created his own set of notes, which formed the basis for this brief, clear, and self-contained summary of the basic theoretical structures of classical mechanics, electricity and magnetism, quantum mechanics, statistical physics, special relativity, and quantum field theory. The Six Core Theories of Modern Physics can be used by advanced undergraduates or beginning graduate students as a supplement to the standard texts or for an uncluttered, succinct review of the key areas. Professionals in such quantitative sciences as chemistry, engineering, computer science, applied mathematics, and biophysics who need to brush up on the essentials of a particular area will find most of the required background material, including the mathematics.

  10. Determination of neutron capture cross sections of 232Th at 14.1 MeV and 14.8 MeV using the neutron activation method

    NASA Astrophysics Data System (ADS)

    Lan, Chang-Lin; Zhang, Yi; Lv, Tao; Xie, Bao-Lin; Peng, Meng; Yao, Ze-En; Chen, Jin-Gen; Kong, Xiang-Zhong

    2017-04-01

    The 232Th(n, γ)233Th neutron capture reaction cross sections were measured at average neutron energies of 14.1 MeV and 14.8 MeV using the activation method. The neutron flux was determined using the monitor reaction 27Al(n,α)24Na. The induced gamma-ray activities were measured using a low background gamma ray spectrometer equipped with a high resolution HPGe detector. The experimentally determined cross sections were compared with the data in the literature, and the evaluated data of ENDF/B-VII.1, JENDL-4.0u+, and CENDL-3.1. The excitation functions of the 232Th(n,γ)233Th reaction were also calculated theoretically using the TALYS1.6 computer code. Supported by Chinese TMSR Strategic Pioneer Science and Technology Project-The Th-U Fuel Physics Term (XDA02010100) and National Natural Science Foundation of China (11205076, 21327801)
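    For readers unfamiliar with the activation method, the underlying relation (a standard form; the notation here is generic, not reproduced from the paper) connects the measured gamma-peak counts C to the cross section sigma through

```latex
C=\varepsilon\,I_{\gamma}\,N\,\sigma\,\phi\;
\frac{\bigl(1-e^{-\lambda t_{i}}\bigr)\,e^{-\lambda t_{c}}\,\bigl(1-e^{-\lambda t_{m}}\bigr)}{\lambda}
```

    where epsilon is the detector efficiency, I_gamma the gamma-ray emission probability, N the number of target nuclei, phi the neutron flux, lambda the decay constant, and t_i, t_c, t_m the irradiation, cooling, and counting times. Writing the same expression for the 27Al(n,α)24Na monitor reaction and taking the ratio cancels phi, which is how the flux is determined in practice.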

  11. Activities of the Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1984 through March 31, 1985 is summarized.

  12. [Research Conducted at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period 1 Oct. 1996 - 31 Mar. 1997.

  13. Activities of the Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1988-01-01

This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 2, 1987 through March 31, 1988.

  14. [Activities of Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1999-01-01

This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1999 through September 30, 1999.

  15. Practical Measurement of Complexity In Dynamic Systems

    DTIC Science & Technology

    2012-01-01

…policies that produce highly complex behaviors, yet yield no benefit. Jason B. Clark and David R. Jacques, Procedia Computer Science 8 (2012) 14-21, doi:10.1016/j.procs.2012.01.008.

  16. The role of physicality in rich programming environments

    NASA Astrophysics Data System (ADS)

    Liu, Allison S.; Schunn, Christian D.; Flot, Jesse; Shoop, Robin

    2013-12-01

    Computer science proficiency continues to grow in importance, while the number of students entering computer science-related fields declines. Many rich programming environments have been created to motivate student interest and expertise in computer science. In the current study, we investigated whether a recently created environment, Robot Virtual Worlds (RVWs), can be used to teach computer science principles within a robotics context by examining its use in high-school classrooms. We also investigated whether the lack of physicality in these environments impacts student learning by comparing classrooms that used either virtual or physical robots for the RVW curriculum. Results suggest that the RVW environment leads to significant gains in computer science knowledge, that virtual robots lead to faster learning, and that physical robots may have some influence on algorithmic thinking. We discuss the implications of physicality in these programming environments for learning computer science.

  17. Meta-Theoretical Contributions to the Constitution of a Model-Based Didactics of Science

    ERIC Educational Resources Information Center

    Ariza, Yefrin; Lorenzano, Pablo; Adúriz-Bravo, Agustín

    2016-01-01

    There is nowadays consensus in the community of didactics of science (i.e. science education understood as an academic discipline) regarding the need to include the philosophy of science in didactical research, science teacher education, curriculum design, and the practice of science education in all educational levels. Some authors have…

  18. Path Not Found: Disparities in Access to Computer Science Courses in California High Schools

    ERIC Educational Resources Information Center

    Martin, Alexis; McAlear, Frieda; Scott, Allison

    2015-01-01

    "Path Not Found: Disparities in Access to Computer Science Courses in California High Schools" exposes one of the foundational causes of underrepresentation in computing: disparities in access to computer science courses in California's public high schools. This report provides new, detailed data on these disparities by student body…

  19. Developing Oral and Written Communication Skills in Undergraduate Computer Science and Information Systems Curriculum

    ERIC Educational Resources Information Center

    Kortsarts, Yana; Fischbach, Adam; Rufinus, Jeff; Utell, Janine M.; Yoon, Suk-Chung

    2010-01-01

Developing and applying oral and written communication skills in the undergraduate computer science and computer information systems curriculum, one of the ABET accreditation requirements, is a very challenging and, at the same time, rewarding task that provides various opportunities to enrich the undergraduate computer science and computer…

  20. EOS MLS Science Data Processing System: A Description of Architecture and Capabilities

    NASA Technical Reports Server (NTRS)

    Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.

    2006-01-01

    This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.

  1. Urban science classrooms and new possibilities: on intersubjectivity and grammar in the third space

    NASA Astrophysics Data System (ADS)

    Emdin, Christopher

    2009-03-01

    In this article I explore research in urban science education inspired by the work of Kris Gutierrez in a paper based on her 2005 Scribner Award. It addresses key points in Gutierrez's work by exploring theoretical frameworks for research and approaches to teaching and research that expand the discourse on the agency of urban youth in corporate school settings. The work serves as an overview of under-discussed approaches and theoretical frameworks to consider in teaching and conducting research with marginalized urban youth in urban science classrooms.

  2. Perspectives on Policy and the Value of Nursing Science in a Big Data Era.

    PubMed

    Gephart, Sheila M; Davis, Mary; Shea, Kimberly

    2018-01-01

    As data volume explodes, nurse scientists grapple with ways to adapt to the big data movement without jeopardizing the epistemic values and theoretical focus that nursing science celebrates, while acknowledging the authority and unity of its body of knowledge. In this article, the authors describe big data and emphasize ways that nursing science brings value to its study. Collective nursing voices that call for more nursing engagement in the big data era are answered with ways to adapt and integrate theoretical and domain expertise from nursing into data science.

  3. Quantum-Theoretical Methods and Studies Relating to Properties of Materials

    DTIC Science & Technology

    1989-12-19

    particularly sensitive to the behavior of the electron distribution close to the nuclei, which contributes only to E(l). Although the above results were...other condensed phases. So it was a useful test case to test the behavior of the theoretical computations for the gas phase relative to that in the...increasingly complicated and time-consuming electron-correlation approximations should assure a small error in the theoretically computed enthalpy for a

  4. Women in computer science: An interpretative phenomenological analysis exploring common factors contributing to women's selection and persistence in computer science as an academic major

    NASA Astrophysics Data System (ADS)

    Thackeray, Lynn Roy

    The purpose of this study is to understand the meaning that women make of the social and cultural factors that influence their reasons for entering and remaining in the study of computer science. The twenty-first century presents many new challenges in career development and workforce choices for both men and women. Information technology has become the driving force behind many areas of the economy. As this trend continues, it has become essential that U.S. citizens pursue careers in technology, including the computing sciences. Although computer science is a very lucrative profession, many Americans, especially women, are not choosing it as a profession. Recent studies have shown no significant differences in math, technical and science competency between men and women. Therefore, other factors, such as social, cultural, and environmental influences, seem to affect women's decisions in choosing an area of study and career choices. A phenomenological method of qualitative research was used in this study, based on interviews of seven female students who are currently enrolled in a post-secondary computer science program. Their narratives provided insight into the social and cultural environments that contribute to their persistence in their technical studies, as well as identifying the barriers and challenges that are faced by female students who choose to study computer science. It is hoped that the data collected from this study may provide recommendations for the recruitment, retention and support of women in computer science departments of U.S. colleges and universities, and thereby increase the numbers of women computer scientists in industry. Keywords: gender access, self-efficacy, culture, stereotypes, computer education, diversity.

  5. 77 FR 38630 - Open Internet Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-28

    ... Computer Science and Co-Founder of the Berkman Center for Internet and Society, Harvard University, is... of Technology Computer Science and Artificial Intelligence Laboratory, is appointed vice-chairperson... Jennifer Rexford, Professor of Computer Science, Princeton University Dennis Roberson, Vice Provost...

  6. Research in progress at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1987 through October 1, 1987.

  7. A parallel-processing approach to computing for the geographic sciences; applications and systems enhancements

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost, personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
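
    The kind of message-passing program such clusters host can be illustrated with a minimal Python sketch using the mpi4py library (an illustration under stated assumptions, not the USGS project's code, which predates this library): each node sums its stride of a range and the root node combines the partial results.

        from mpi4py import MPI  # message-passing layer typical of Beowulf clusters

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # Each node sums its stride of the range; the root gathers the partial sums.
        n = 10_000_000
        local = sum(range(rank, n, size))
        total = comm.reduce(local, op=MPI.SUM, root=0)
        if rank == 0:
            print(f"sum of 0..{n - 1} across {size} nodes = {total}")

    Launched with, e.g., mpiexec -n 4 python sum.py, the same script scales from a single PC to the whole cluster.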

  8. Enabling Earth Science Through Cloud Computing

    NASA Technical Reports Server (NTRS)

    Hardman, Sean; Riofrio, Andres; Shams, Khawaja; Freeborn, Dana; Springer, Paul; Chafin, Brian

    2012-01-01

    Cloud Computing holds tremendous potential for missions across the National Aeronautics and Space Administration. Several flight missions are already benefiting from an investment in cloud computing for mission critical pipelines and services through faster processing time, higher availability, and drastically lower costs available on cloud systems. However, these processes do not currently extend to general scientific algorithms relevant to earth science missions. The members of the Airborne Cloud Computing Environment task at the Jet Propulsion Laboratory have worked closely with the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to integrate cloud computing into their science data processing pipeline. This paper details the efforts involved in deploying a science data system for the CARVE mission, evaluating and integrating cloud computing solutions with the system and porting their science algorithms for execution in a cloud environment.

  9. PREFACE: Proceedings of the International Workshop on Current Challenges in Liquid and Glass Science (The Cosener's House, Abingdon, 10–12 January 2007)

    NASA Astrophysics Data System (ADS)

    Hannon, Alex C.; Salmon, Philip S.; Soper, Alan K.

    2007-10-01

    The workshop was held to discuss current experimental and theoretical challenges in liquid and glass science and to honour the contribution made by Spencer Howells (ISIS, UK) to the field of neutron scattering from liquids and glasses. The meeting was attended by 70 experimentalists, theorists and computer simulators from Europe, Japan and North America and comprised 34 oral presentations together with two lively poster sessions. Three major themes were discussed, namely (i) the glass transition and properties of liquids and glasses under extreme conditions; (ii) the complementarity of neutron and x-ray scattering techniques with other experimental methods; and (iii) the modelling of liquid and glass structure. These themes served to highlight (a) recent advances in neutron and x-ray instrumentation used to investigate liquid and glassy materials under extreme conditions; (b) the relationship between the results obtained from different experimental and theoretical/computational methods; and (c) the modern methods used to interpret experimental results. The presentations ranged from polyamorphism in liquids and glasses to protein folding in aqueous solution and included the dynamics of fresh and freeze-dried strawberries and red onions. The properties of liquid phosphorus were also memorably demonstrated! The formal highlight was the 'Spencerfest' dinner where Neil Cowlam (Sheffield, UK) gave an excellent after dinner speech. The organisation of the workshop benefited tremendously from the secretarial skills of Carole Denning (ISIS, UK). The financial support of the Council for the Central Laboratory of the Research Councils (CCLRC), the Liquids and Complex Fluids Group of the Institute of Physics, The ISIS Disordered Materials Group, the CCLRC Centre for Materials Physics and Chemistry and the CCLRC Centre for Molecular Structure and Dynamics is gratefully acknowledged. Finally, it is a pleasure to thank all the workshop participants whose lively contributions led to the success of the meeting. The present special issue stems from the interest of many of those present to collect their work into a single volume.

  10. FOREWORD: International Workshop on Theoretical Plasma Physics: Modern Plasma Science. Sponsored by the Abdus Salam ICTP, Trieste, Italy

    NASA Astrophysics Data System (ADS)

    Shukla, P. K.; Stenflo, L.

    2005-01-01

    The "International Workshop on Theoretical Plasma Physics: Modern Plasma Science was held at the Abdus Salam International Centre for Theoretical Physics (Abdus Salam ICTP), Trieste, Italy during the period 5 16 July 2004. The workshop was organized by P K Shukla, R Bingham, S M Mahajan, J T Mendonça, L Stenflo, and others. The workshop enters into a series of previous biennial activities that we have held at the Abdus Salam ICTP since 1989. The scientific program of the workshop was split into two parts. In the first week, most of the lectures dealt with problems concerning astrophysical plasmas, while in the second week, diversity was introduced in order to address the important role of plasma physics in modern areas of science and technology. Here, attention was focused on cross-disciplinary topics including Schrödinger-like models, which are common in plasma physics, nonlinear optics, quantum engineering (Bose-Einstein condensates), and nonlinear fluid mechanics, as well as emerging topics in fundamental theoretical and computational plasma physics, space and dusty plasma physics, laser-plasma interactions, etc. The workshop was attended by approximately hundred-twenty participants from the developing countries, Europe, USA, and Japan. A large number of participants were young researchers from both the developing and industrial countries, as the directors of the workshop tried to keep a good balance in inviting senior and younger generations of theoretical, computational and experimental plasma physicists to our Trieste activities. In the first week, there were extensive discussions on the physics of electromagnetic wave emissions from pulsar magnetospheres, relativistic magnetohydrodynamics of astrophysical objects, different scale sizes turbulence and structures in astrophysics. The scientific program of the second week included five review talks (60 minutes) and about thirty invited topical lectures (30 minutes). In addition, during the two weeks, there were more than seventy poster papers in three sessions. The latter provided opportunities for younger physicists to display the results of their recent work and to obtain comments from the other participants. During the period 11 16 July 2004 at the Abdus Salam ICTP, we focused on nonlinear effects that are common in plasmas, fluids, nonlinear optics, and condensed matter physics. In addition, we concentrated on collective processes in space and dusty plasmas, as well as in astrophysics and intense laser-plasma interactions. Also presented were modern topics of nonlinear neutrino-plasma interactions, nonlinear quantum electrodynamics, quark-gluon plasmas, and high-energy astrophysics. This reflects that plasma physics is a truly cross-disciplinary and very fascinating science with many potential applications. The workshop was attended by several distinguished invited speakers. Most of the contributions from the second week of our Trieste workshop appear in this Topical Issue of Physica Scripta, which will be distributed to all the participants. The organizers are grateful to Professor Katepalli Raju Sreenivasan, the director of the Abdus Salam ICTP, for his generous support and warm hospitality in Trieste. The Editors appreciate their colleagues and co-organizers for their constant and wholehearted support in our endeavours of publishing this Topical Issue of Physica Scripta. We highly value the excellent work of Mrs Ave Lusenti and Dr. Brian Stewart at the Abdus Salam ICTP. 
Thanks are also due to the European Commission for supporting our activity through the Research Training Networks entitled "Complex Plasmas" and "Turbulent Boundary Layers". We would also like to express our gratitude to the Abdus Salam ICTP for providing financial support to our workshop in Trieste. Finally, the workshop directors thank the speakers and the attendees for their contributions, which resulted in the success of our Trieste workshop 2004. Specifically, we appreciate the speakers for delivering excellent talks, supplying well-prepared manuscripts for publication, and enhancing the plasma physics activity at the Abdus Salam ICTP.

  11. System biology of gene regulation.

    PubMed

    Baitaluk, Michael

    2009-01-01

    A famous joke that illustrates the traditionally awkward alliance between theory and experiment, and the differences between experimental biologists and theoretical modelers, is the one in which a University sends a biologist, a mathematician, a physicist, and a computer scientist on a walking trip in an attempt to stimulate interdisciplinary research. During a break, they watch a cow in a field nearby and the leader of the group asks, "I wonder how one could decide on the size of a cow?" Since a cow is a biological object, the biologist responded first: "I have seen many cows in this area and know it is a big cow." The mathematician argued, "The true volume is determined by integrating the mathematical function that describes the outer surface of the cow's body." The physicist suggested: "Let's assume the cow is a sphere...." Finally the computer scientist became nervous and said that he didn't bring his computer because there is no Internet connection up there on the hill. In this humorous but instructive story, the suggestions proposed by the theorists can be taken to reflect the view of many experimental biologists that computer scientists and theorists are too far removed from biological reality and that their theories and approaches are therefore not of much immediate usefulness. Conversely, the statement of the biologist mirrors the view of many traditional theoretical and computational scientists that biological experiments are for the most part simply descriptive, lack rigor, and that much of the resulting biological data are of questionable functional relevance. One of the goals of current biology as a multidisciplinary science is to bring people from different scientific areas together on the same "hill" and teach them to speak the same "language." In fact, of course, when presenting their data, most experimental biologists do provide an interpretation and explanation for the results, and many theorists/computer scientists aim to answer (or at least to fully describe) questions of biological relevance. Thus systems biology can be treated as such a socioscientific phenomenon: a new approach to both experiments and theory that is defined by the strategy of pursuing integration of complex data about the interactions in biological systems from diverse experimental sources, using interdisciplinary tools and personnel.

  12. PREFACE: IC-MSQUARE 2012: International Conference on Mathematical Modelling in Physical Sciences

    NASA Astrophysics Data System (ADS)

    Kosmas, Theocharis; Vagenas, Elias; Vlachos, Dimitrios

    2013-02-01

    The first International Conference on Mathematical Modelling in Physical Sciences (IC-MSQUARE) took place in Budapest, Hungary, from Monday 3 to Friday 7 September 2012. The conference was attended by more than 130 participants, and hosted about 290 oral, poster and virtual papers by more than 460 pre-registered authors. The first IC-MSQUARE consisted of diverse workshops and thus covered various research fields in which mathematical modelling is used, such as theoretical/mathematical physics, neutrino physics, non-integrable systems, dynamical systems, computational nanoscience, biological physics, computational biomechanics, complex networks, stochastic modelling, fractional statistics, DNA dynamics, and macroeconomics. The scientific program was rather full: after the Keynote and Invited Talks each morning, two parallel sessions ran every day. However, according to all attendees, the program was excellent with a high level of talks and the scientific environment was fruitful; thus all attendees had a creative time. The question that arises is whether this occurred accidentally, or whether IC-MSQUARE is a necessity in the field of physical and mathematical modelling. For all of us working in the field, the existing and established conferences in this particular field suffer from two recognized drawbacks: the first is an increasingly narrow orientation, and the second the extreme specialization of the meetings. Therefore, a conference which aims to promote the knowledge and development of high-quality research in mathematical fields concerned with applications in other scientific fields, as well as modern technological trends in physics, chemistry, biology, medicine, economics, sociology, environmental sciences etc., appears to be a necessity. This is the key role that IC-MSQUARE will play. We would like to thank the Keynote Speaker and the Invited Speakers for their significant contributions to IC-MSQUARE. We would also like to thank the members of the International Scientific Committee and the members of the Organizing Committee. Conference Chairmen: Theocharis Kosmas, Department of Physics, University of Ioannina; Elias Vagenas, RCAAM, Academy of Athens; Dimitrios Vlachos, Department of Computer Science and Technology, University of Peloponnese. The PDF also contains a list of members of the International Scientific Committees and details of the Keynote and Invited Speakers.

  13. 76 FR 61118 - Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ... Engineering; Notice of Meeting In accordance with the Federal Advisory Committee Act (Pub. L. 92- 463, as... Computer and Information Science and Engineering (1115). Date and Time: November 1, 2011 from 12 p.m.-5:30... Computer and Information Science and Engineering, National Science Foundation, 4201 Wilson Blvd., Suite...

  14. Computer Science in High School Graduation Requirements. ECS Education Trends (Updated)

    ERIC Educational Resources Information Center

    Zinth, Jennifer

    2016-01-01

    Allowing high school students to fulfill a math or science high school graduation requirement via a computer science credit may encourage more students to pursue computer science coursework. This Education Trends report is an update to the original report released in April 2015 and explores state policies that allow or require districts to apply…

  15. [Theoretical and methodological uses of research in Social and Human Sciences in Health].

    PubMed

    Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein

    2012-12-01

    This article aims to map and critically reflect on current theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from social and human sciences in health. The sample was classified thematically as "theoretical/methodological reference", "study type/methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in the exposition of study methodology and in the instrumental use of qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.

  16. Characteristics of the Navy Laboratory Warfare Center Technical Workforce

    DTIC Science & Technology

    2013-09-29

    Mathematics and Information Science (M&IS): Actuarial Science (1510), Computer Science (1550), Gen. Math & Statistics (1501), Mathematics (1520), Operations...Admin. Network Systems & Data Communication Analysts, Actuaries, Mathematicians, Operations Research Analysts, Statisticians. Social Science (SS...workforce was sub-divided into six broad occupational groups: Life Science, Physical Science, Engineering, Mathematics, Computer Science and Information

  17. DNA, RNA and the Physical Basis of Life

    ERIC Educational Resources Information Center

    Fong, Peter

    1969-01-01

    Presents the application of knowledge in the physical sciences to biological science problems, including those in the behavioral sciences, social sciences, and the humanities. Examples are presented in the areas of molecular psychology and theoretical biology, besides the principal genetic discussion. (RR)

  18. Relationships between the Philosophy of Science and Didactics of Science.

    ERIC Educational Resources Information Center

    Aduriz-Bravo, Agustin; Izquierdo, Merce; Galagovsky, Lydia

    2002-01-01

    Presents a theoretical classification of relationships between the philosophy of science and didactics of science, based on the metadiscursive nature which philosophy and didactics share. Describes five different relationships between the two disciplines: material, instrumental, explanatory, rhetorical, and metatheoretical. (Author/MM)

  19. Darwin's legacy

    NASA Astrophysics Data System (ADS)

    Susskind, Leonard

    2009-07-01

    Charles Darwin was no theoretical physicist, and I am no biologist. Yet, as a theoretical physicist, I have found much to think about in Darwin's legacy - and in that of his fellow naturalist Alfred Russel Wallace. Darwin's style of science is not usually thought of as theoretical and certainly not mathematical: he was a careful observer of nature, kept copious notes, contributed to zoological collections, and eventually from his vast repertoire of observations deduced the idea of natural selection as the origin of species. The value of theorizing is often dismissed in the biological sciences as less important than observation; and Darwin was the master observer.

  20. Mono- and binuclear non-heme iron chemistry from a theoretical perspective.

    PubMed

    Rokob, Tibor András; Chalupský, Jakub; Bím, Daniel; Andrikopoulos, Prokopis C; Srnec, Martin; Rulíšek, Lubomír

    2016-09-01

    In this minireview, we provide an account of the current state-of-the-art developments in the area of mono- and binuclear non-heme enzymes (NHFe and NHFe2) and the smaller NHFe(2) synthetic models, mostly from a theoretical and computational perspective. The sheer complexity, and at the same time the beauty, of the NHFe(2) world represents a challenge for experimental as well as theoretical methods. We emphasize that concerted progress on both the theoretical and experimental sides is a conditio sine qua non for future understanding, exploration and utilization of the NHFe(2) systems. After briefly discussing the current challenges and advances in the computational methodology, we review the recent spectroscopic and computational studies of NHFe(2) enzymatic and inorganic systems and highlight the correlations between various experimental data (spectroscopic, kinetic, thermodynamic, electrochemical) and computations. Throughout, we attempt to keep in mind the most fascinating and attractive phenomenon in NHFe(2) chemistry, which is the fact that despite the strong oxidative power of many reactive intermediates, the NHFe(2) enzymes perform catalysis with high selectivity. We conclude with our personal viewpoint that further developments in quantum chemistry, especially in the field of multireference wave function methods, are needed to provide a solid theoretical basis for NHFe(2) studies, chiefly by providing benchmarking and calibration of the computationally efficient and easy-to-use DFT methods.

  1. A Review of Computer Science Resources for Learning and Teaching with K-12 Computing Curricula: An Australian Case Study

    ERIC Educational Resources Information Center

    Falkner, Katrina; Vivian, Rebecca

    2015-01-01

    To support teachers in implementing Computer Science curricula in classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…

  2. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  3. Prospective Students' Reactions to the Presentation of the Computer Science Major

    ERIC Educational Resources Information Center

    Weaver, Daniel Scott

    2010-01-01

    The number of students enrolling in Computer Science in colleges and universities has declined since its peak in the early 2000s. Some of the claimed contributing factors suggest that prospective students fear a lack of employment opportunities if they study computing in college. However, the lack of understanding of what Computer Science is and…

  4. PREFACE: 4th Workshop on Theory, Modelling and Computational Methods for Semiconductors (TMCSIV)

    NASA Astrophysics Data System (ADS)

    Tomić, Stanko; Probert, Matt; Migliorato, Max; Pal, Joydeep

    2014-06-01

    These conference proceedings contain the written papers of the contributions presented at the 4th International Conference on Theory, Modelling and Computational Methods for Semiconductor materials and nanostructures. The conference was held at the MediaCityUK, University of Salford, Manchester, UK on 22-24 January 2014. The previous conferences in this series took place in 2012 at the University of Leeds, in 2010 at St William's College, York and in 2008 at the University of Manchester, UK. The development of high-performance computer architectures is finally allowing the routine use of accurate methods for calculating the structural, thermodynamic, vibrational, optical and electronic properties of semiconductors and their hetero- and nano-structures. The scope of this conference embraces modelling, theory and the use of sophisticated computational tools in semiconductor science and technology, where there is substantial potential for time-saving in R&D. Theoretical approaches represented in this meeting included: Density Functional Theory, Semi-empirical Electronic Structure Methods, Multi-scale Approaches, Modelling of PV devices, Electron Transport, and Graphene. Topics included, but were not limited to: Optical Properties of Quantum Nanostructures including Colloids and Nanotubes, Plasmonics, Magnetic Semiconductors, Photonic Structures, and Electronic Devices. This workshop ran for three days, with the objective of bringing together UK and international leading experts in the theoretical modelling of Group IV, III-V and II-VI semiconductors, as well as students, postdocs and early-career researchers. The first day focused on providing an introduction and overview of this vast field, aimed particularly at students, with several lectures given by recognized experts in various theoretical approaches. The following two days showcased some of the best theoretical research carried out in the UK in this field, with several contributions also from representatives of renowned theoretical groups from many European countries (Spain, France, Ireland, Germany, Switzerland, Luxembourg, Norway, Italy, Poland, Denmark, Sweden, Serbia, etc.), as well as Asia (Iran, Japan) and the USA. We would like to thank all participants for making this a very successful meeting and for their contribution to the conference programme and these proceedings. We would also like to acknowledge the financial support from the Institute of Physics (Semiconductor Physics Group and Computational Physics Group), EPSRC-UK, the CECAM UK-Hartree Node, CCP9, and Quantum Wise (distributors of Atomistix).
The Editors

Acknowledgments

Conference Organising Committee:
Stanko Tomić (Chair, University of Salford)
Matt Probert (University of York)
Max Migliorato (University of Manchester)
Joydeep Pal (University of Manchester)

Programme Committee:
David Whittaker (University of Sheffield, UK)
John Robertson (University of Cambridge, UK)
Risto Nieminen (Helsinki University of Technology, Finland)
Eoin O'Reilly (Tyndall Institute, Cork, Republic of Ireland)
Marco Califano (University of Leeds, UK)
Stewart Clark (University of Durham, UK)
Stanko Tomić (University of Salford, UK)
Mauro Pereira (Sheffield Hallam University, UK)
Aldo Di Carlo (University of Rome "Tor Vergata", Italy)
Lev Kantorovich (King's College London, UK)
Mervin Roy (University of Leicester, UK)
Ben Hourahine (University of Strathclyde, UK)
Rita Magri (University of Modena and Reggio Emilia, Italy)
Zoran Ikonic (University of Leeds)
John Barker (University of Glasgow)

The proceedings were edited and compiled by Joydeep Pal, Max Migliorato and Stanko Tomić.

  5. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joubert, Wayne; Kothe, Douglas B; Nam, Hai Ah

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption promises to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be advanced on multiple fronts, including peak flops, node memory capacity, interconnect latency, interconnect bandwidth, and memory bandwidth. (2) Effective parallel programming interfaces must be developed to exploit the power of emerging hardware. (3) Science application teams must now begin to adapt and reformulate application codes to the new hardware and software, typified by hierarchical and disparate layers of compute, memory and concurrency. (4) Algorithm research must be realigned to exploit this hierarchy. (5) When possible, mathematical libraries must be used to encapsulate the required operations in an efficient and useful way. (6) Software tools must be developed to make the new hardware more usable. (7) Science application software must be improved to cope with the increasing complexity of computing systems. (8) Data management efforts must be readied for the larger quantities of data generated by larger, more accurate science models. Requirements elicitation, analysis, validation, and management comprise a difficult and inexact process, particularly in periods of technological change. Nonetheless, the OLCF requirements modeling process is becoming increasingly quantitative and actionable, as the process becomes more developed and mature, and the process this year has identified clear and concrete steps to be taken.
This report discloses (1) the fundamental science case driving the need for the next generation of computer hardware, (2) application usage trends that illustrate the science need, (3) application performance characteristics that drive the need for increased hardware capabilities, (4) resource and process requirements that make the development and deployment of science applications on next-generation hardware successful, and (5) summary recommendations for the required next steps within the computer and computational science communities.

  6. Renormalization group analysis of anisotropic diffusion in turbulent shear flows

    NASA Technical Reports Server (NTRS)

    Rubinstein, Robert; Barton, J. Michael

    1991-01-01

    The renormalization group is applied to compute anisotropic corrections to the scalar eddy diffusivity representation of turbulent diffusion of a passive scalar. The corrections are linear in the mean velocity gradients. All model constants are computed theoretically. A form of the theory valid at arbitrary Reynolds number is derived. The theory applies only when convection of the velocity-scalar correlation can be neglected. A ratio of diffusivity components, found experimentally to have a nearly constant value in a variety of shear flows, is computed theoretically for flows in a certain state of equilibrium. The theoretical value is well within the fairly narrow range of experimentally observed values. Theoretical predictions of this diffusivity ratio are also compared with data from experiments and direct numerical simulations of homogeneous shear flows with constant velocity and scalar gradients.
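
    In the notation of standard gradient-diffusion modelling, "corrections linear in the mean velocity gradients" means the scalar flux picks up tensor terms of the schematic form below (a sketch for orientation only; D_0 is the scalar eddy diffusivity, tau a turbulence time scale, and c_1, c_2 stand in for the theoretically computed constants — these symbols are illustrative, not the paper's):

        \[-\overline{u_i \theta} = D_0 \frac{\partial \bar{\Theta}}{\partial x_i}
          + \tau D_0 \left( c_1 \frac{\partial \bar{U}_i}{\partial x_j}
          + c_2 \frac{\partial \bar{U}_j}{\partial x_i} \right)
          \frac{\partial \bar{\Theta}}{\partial x_j}\]

    The bracketed terms make the effective diffusivity a tensor rather than a scalar, which is why a ratio of diffusivity components becomes a meaningful quantity to compare against shear-flow experiments.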

  7. Using the Tower of Hanoi puzzle to infuse your mathematics classroom with computer science concepts

    NASA Astrophysics Data System (ADS)

    Marzocchi, Alison S.

    2016-07-01

    This article suggests that logic puzzles, such as the well-known Tower of Hanoi puzzle, can be used to introduce computer science concepts to mathematics students of all ages. Mathematics teachers introduce their students to computer science concepts that are enacted spontaneously and subconsciously throughout the solution to the Tower of Hanoi puzzle. These concepts include, but are not limited to, conditionals, iteration, and recursion. Lessons, such as the one proposed in this article, are easily implementable in mathematics classrooms and extracurricular programmes as they are good candidates for 'drop in' lessons that do not need to fit into any particular place in the typical curriculum sequence. As an example for readers, the author describes how she used the puzzle in her own Number Sense and Logic course during the federally funded Upward Bound Math/Science summer programme for college-intending low-income high school students. The article explains each computer science term with real-life and mathematical examples, applies each term to the Tower of Hanoi puzzle solution, and describes how students connected the terms to their own solutions of the puzzle. It is timely and important to expose mathematics students to computer science concepts. Given the rate at which technology is currently advancing, and our increased dependence on technology in our daily lives, it has become more important than ever for children to be exposed to computer science. Yet, despite the importance of exposing today's children to computer science, many children are not given adequate opportunity to learn computer science in schools. In the United States, for example, most students finish high school without ever taking a computing course. Mathematics lessons, such as the one described in this article, can help to make computer science more accessible to students who may have otherwise had little opportunity to be introduced to these increasingly important concepts.
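
    To make the mapping concrete, here is a minimal recursive Tower of Hanoi solver in Python (an illustration, not code from the article): the base-case test is a conditional, the two self-calls are recursion, and printing one move per step is the iteration students enact by hand.

        # Move n disks from peg `src` to peg `dst`, using `aux` as scratch space.
        def hanoi(n, src="A", dst="C", aux="B"):
            if n == 0:                       # conditional: base case stops the recursion
                return
            hanoi(n - 1, src, aux, dst)      # recursion: move the top n-1 disks aside
            print(f"move disk {n}: {src} -> {dst}")
            hanoi(n - 1, aux, dst, src)      # recursion: restack them on the target peg

        hanoi(3)  # prints 2**3 - 1 = 7 moves, the provable minimum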

  8. A molecular orbital study of the energy spectrum, exchange interaction and gate crosstalk of a four-quantum-dot system

    NASA Astrophysics Data System (ADS)

    Yang, Xu-Chen; Wang, Xin

    The manipulation of coupled quantum dot devices is crucial to scalable, fault-tolerant quantum computation. We present a theoretical study of a four-electron four-quantum-dot system based on molecular orbital methods, which realizes a pair of singlet-triplet (S-T) qubits. We find that while the two S-T qubits are coupled by the capacitive interaction when they are sufficiently far apart, the admixture of wave functions undergoes a substantial change as the two S-T qubits get closer. We find that in a certain parameter regime the exchange interaction may only be defined in the sense of an effective one, when the computational basis states no longer dominate the eigenstates. We further discuss the gate crosstalk that arises as a consequence of this wave function mixing. This work was supported by the Research Grants Council of the Hong Kong Special Administrative Region, China (No. CityU 21300116) and the National Natural Science Foundation of China (No. 11604277).
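
    For orientation, capacitively coupled singlet-triplet qubits are often summarized by an effective Hamiltonian of the following standard form (a textbook-style sketch, not the paper's own parametrization; J_1 and J_2 denote the single-qubit exchange splittings and J_12 the capacitive two-qubit term):

        \[H_{\mathrm{eff}} = \frac{J_1}{2}\,\sigma_z^{(1)} + \frac{J_2}{2}\,\sigma_z^{(2)}
          + \frac{J_{12}}{4}\,\sigma_z^{(1)}\sigma_z^{(2)}\]

    The paper's point can be read against this form: once the computational basis states no longer dominate the eigenstates, J_1, J_2 and J_12 survive only as effective parameters extracted from the full molecular-orbital spectrum.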

  9. Advanced interdisciplinary undergraduate program: light engineering

    NASA Astrophysics Data System (ADS)

    Bakholdin, Alexey; Bougrov, Vladislav; Voznesenskaya, Anna; Ezhova, Kseniia

    2016-09-01

    The undergraduate educational program "Light Engineering" of an advanced level of studies is focused on the development of scientific learning outcomes and the training of professionals whose activities lie in the interdisciplinary fields of Optical engineering and Technical physics. The program gives practical experience in the transmission, reception, storage, processing and displaying of information using opto-electronic devices, automation of optical systems design, computer image modeling, and automated quality control and characterization of optical devices. The program is implemented in accordance with the Educational standards of the ITMO University. A specific feature of the Program is practice- and problem-based learning, implemented by engaging students in research and projects, and through internships at enterprises and in leading Russian and international research and educational centers. The modular structure of the Program and a significant proportion of variable disciplines provide a concept of individual learning for each student. Learning outcomes of the program's graduates include theoretical knowledge and skills in natural science and core professional disciplines, deep knowledge of modern computer technologies, research expertise, and design skills for optical and optoelectronic systems and devices.

  10. A Novel Interdisciplinary Approach to Socio-Technical Complexity

    NASA Astrophysics Data System (ADS)

    Bassetti, Chiara

    The chapter presents a novel interdisciplinary approach that integrates micro-sociological analysis into computer-vision and pattern-recognition modeling and algorithms, the purpose being to tackle socio-technical complexity at a systemic yet micro-grounded level. The approach is empirically grounded and both theoretically and analytically driven, yet systemic and multidimensional, semi-supervised and computable, and oriented towards large-scale applications. The chapter describes the proposed approach especially as regards its sociological foundations, and as applied to the analysis of a particular setting --i.e. sport-spectator crowds. Crowds, better defined as large gatherings, are almost ever-present in our societies, and capturing their dynamics is crucial. From social sciences to public safety management and emergency response, modeling and predicting large gatherings' presence and dynamics, and thus possibly preventing critical situations and being able to react properly to them, is fundamental. This is where semi-automated technologies can make the difference. The work presented in this chapter is intended as a scientific step towards such an objective.

  11. Accessible Earth: Enhancing diversity in the Geosciences through accessible course design and Experiential Learning Theory

    NASA Astrophysics Data System (ADS)

    Bennett, Rick; Lamb, Diedre

    2017-04-01

    The tradition of field-based instruction in the geoscience curriculum, which culminates in a capstone geological field camp, presents an insurmountable barrier to many disabled students who might otherwise choose to pursue geoscience careers. There is a widespread perception that success as a practicing geoscientist requires direct access to outcrops and vantage points available only to those able to traverse inaccessible terrain. Yet many modern geoscience activities are based on remotely sensed geophysical data, data analysis, and computation that take place entirely within the laboratory. To challenge the perception of geoscience as a career option only for the able-bodied, we have created the capstone Accessible Earth Study Abroad Program, an alternative to geologic field camp with a focus on modern geophysical observation systems, computational thinking, and data science. In this presentation, we will report on the theoretical bases for developing the course, our experiences in teaching the course to date, and our plan for ongoing assessment, refinement, and dissemination of the effectiveness of our efforts.

  12. The Prospects of Whole Brain Emulation within the next Half-Century

    NASA Astrophysics Data System (ADS)

    Eth, Daniel; Foust, Juan-Carlos; Whale, Brandon

    2013-12-01

    Whole Brain Emulation (WBE), the theoretical technology of modeling a human brain in its entirety on a computer (thoughts, feelings, memories, and skills intact), is a staple of science fiction. Recently, proponents of WBE have suggested that it will be realized in the next few decades. In this paper, we investigate the plausibility of WBE being developed in the next 50 years (by 2063). We identify four essential requisite technologies: scanning the brain, translating the scan into a model, running the model on a computer, and simulating an environment and body. Additionally, we consider the cultural and social effects of WBE. We find the two most uncertain factors for WBE's future to be the development of advanced minuscule probes that can amass neural data in vivo and the degree to which the culture surrounding WBE becomes cooperative or competitive. We identify four plausible scenarios from these uncertainties and suggest the most likely scenario to be one in which WBE is realized, and the technology is used for moderately cooperative ends.

  13. Trends and Correlation Estimation in Climate Sciences: Effects of Timescale Errors

    NASA Astrophysics Data System (ADS)

    Mudelsee, M.; Bermejo, M. A.; Bickert, T.; Chirila, D.; Fohlmeister, J.; Köhler, P.; Lohmann, G.; Olafsdottir, K.; Scholz, D.

    2012-12-01

    Trend describes time-dependence in the first moment of a stochastic process, and correlation measures the linear relation between two random variables. Accurately estimating the trend and correlation, including uncertainties, from climate time series data in the uni- and bivariate domain, respectively, allows first-order insights into the geophysical process that generated the data. Timescale errors, ubiquitous in paleoclimatology, where archives are sampled for proxy measurements and dated, pose a problem to the estimation. Statistical science and the various applied research fields, including geophysics, have almost completely ignored this problem due to its theoretical near-intractability. However, computational adaptations or replacements of traditional error formulas have become technically feasible. This contribution gives a short overview of such an adaptation package: bootstrap resampling combined with parametric timescale simulation. We study linear regression, parametric change-point models and nonparametric smoothing for trend estimation. We introduce pairwise-moving block bootstrap resampling for correlation estimation. Both methods share robustness against autocorrelation and non-Gaussian distributional shape. We briefly touch on computing-intensive calibration of bootstrap confidence intervals and consider options to parallelize the related computer code. The following examples serve not only to illustrate the methods but also tell their own climate stories: (1) the search for climate drivers of the Agulhas Current on recent timescales, (2) the comparison of three stalagmite-based proxy series of regional, western German climate over the later part of the Holocene, and (3) trends and transitions in benthic oxygen isotope time series from the Cenozoic. Financial support by Deutsche Forschungsgemeinschaft (FOR 668, FOR 1070, MU 1595/4-1) and the European Commission (MC ITN 238512, MC ITN 289447) is acknowledged.
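
    A minimal Python sketch of the pairwise-moving block bootstrap for correlation (my illustration, not the authors' package; block-length selection and the parametric timescale-simulation step are omitted): blocks are drawn jointly from both series so that autocorrelation and pairing are preserved in each resample.

        import numpy as np

        def block_bootstrap_corr(x, y, block_len=10, n_boot=2000, seed=0):
            """95% percentile interval for Pearson's r of two aligned series."""
            rng = np.random.default_rng(seed)
            n = len(x)
            starts = np.arange(n - block_len + 1)   # admissible block start indices
            n_blocks = int(np.ceil(n / block_len))
            reps = np.empty(n_boot)
            for b in range(n_boot):
                # Draw whole blocks jointly from x and y to keep their pairing.
                idx = np.concatenate([np.arange(s, s + block_len)
                                      for s in rng.choice(starts, n_blocks)])[:n]
                reps[b] = np.corrcoef(x[idx], y[idx])[0, 1]
            return np.percentile(reps, [2.5, 97.5])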

  14. Quantum field theory and coalgebraic logic in theoretical computer science.

    PubMed

    Basti, Gianfranco; Capolupo, Antonio; Vitiello, Giuseppe

    2017-11-01

    We suggest that in the framework of Category Theory it is possible to demonstrate the mathematical and logical dual equivalence between the category of the q-deformed Hopf Coalgebras and the category of the q-deformed Hopf Algebras in quantum field theory (QFT), interpreted as a thermal field theory. Each pair algebra-coalgebra characterizes a QFT system and its mirroring thermal bath, respectively, so as to model dissipative quantum systems in far-from-equilibrium conditions, with an evident significance also for the biological sciences. Our study is in fact inspired by applications to neuroscience, where the brain memory capacity, for instance, has been modeled by using the QFT unitarily inequivalent representations. The q-deformed Hopf Coalgebras and the q-deformed Hopf Algebras constitute two dual categories because they are characterized by the same functor T, related to the Bogoliubov transform, and by its contravariant application T^op, respectively. The q-deformation parameter is related to the Bogoliubov angle, and it is effectively a thermal parameter. Therefore, the different values of q identify univocally, and label, the vacua appearing in the foliation process of the quantum vacuum. This means that, in the framework of Universal Coalgebra, as a general theory of dynamic and computing systems ("labelled state-transition systems"), the so-labelled infinitely many quantum vacua can be interpreted as the Final Coalgebra of an "Infinite State Black-Box Machine". All this opens the way to the possibility of designing a new class of universal quantum computing architectures based on this coalgebraic QFT formulation, as its ability to naturally generate a Fibonacci progression demonstrates.
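
    For readers outside thermal field theory, the Bogoliubov transformation invoked above mixes a system mode a with its thermal-bath ("tilde") partner; in one standard presentation (a sketch for orientation, not the paper's own notation) it reads:

        \[a(\theta) = a\cosh\theta - \tilde{a}^{\dagger}\sinh\theta, \qquad
          \tilde{a}(\theta) = \tilde{a}\cosh\theta - a^{\dagger}\sinh\theta\]

    The thermal angle theta is fixed by the temperature, and the q-deformation parameter of the abstract serves as a label related to this angle, distinguishing the inequivalent vacua reached by varying it.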

  15. Utilization of computer technology by science teachers in public high schools and the impact of standardized testing

    NASA Astrophysics Data System (ADS)

    Priest, Richard Harding

    A significant percentage of high school science teachers are not using computers to teach their students or prepare them for standardized testing. A survey of high school science teachers was conducted to determine how they are having students use computers in the classroom, why science teachers are not using computers in the classroom, which variables were relevant to their not using computers, and what the effects of standardized testing are on the use of technology in the high school science classroom. A self-administered questionnaire was developed to measure these aspects of computer integration and demographic information. A follow-up telephone interview survey of a portion of the original sample was conducted in order to clarify questions, correct misunderstandings, and draw out more holistic descriptions from the subjects. The primary method used to analyze the quantitative data was frequency distributions. Multiple regression analysis was used to investigate the relationships between the barriers and facilitators and the dimensions of instructional use, frequency, and importance of the use of computers. All high school science teachers in a large urban/suburban school district were sent surveys. A response rate of 58% resulted from two mailings of the survey. It was found that the contributing factors to science teachers not using computers included too few up-to-date computers in their classrooms and other educational commitments and duties that do not leave them enough time to prepare lessons incorporating technology. While a high percentage of science teachers thought their school and district administrations were supportive of technology, they also believed that more in-service technology training and follow-up activities to support that training are needed, and that more software needs to be created. The majority of the science teachers do not use the computer to help students prepare for standardized tests because they believe they can prepare students more efficiently without a computer. Nearly half of the teachers, however, gave lack of time to prepare instructional materials and lack of a means to project a computer image to the whole class as reasons they do not use computers. A significant percentage thought science standardized testing was having a negative effect on computer use.
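
    For illustration only, the quantitative pipeline described (frequency distributions plus multiple regression of usage dimensions on barriers and facilitators) might look like the following Python sketch; the file and column names are hypothetical, not taken from the survey instrument.

        import pandas as pd
        import statsmodels.api as sm

        df = pd.read_csv("survey.csv")                 # hypothetical survey export
        print(df["computer_use_freq"].value_counts())  # frequency distribution

        # Regress frequency of classroom computer use on barrier/facilitator scores.
        X = sm.add_constant(df[["training_hours", "classroom_computers",
                                "admin_support_score", "prep_time_score"]])
        model = sm.OLS(df["computer_use_freq"], X).fit()
        print(model.summary())                         # coefficients and p-values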

  16. Educational NASA Computational and Scientific Studies (enCOMPASS)

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Engineering, and Mathematics (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches used and often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program to the point where every major university in the U.S. would use at least one of these case studies in one of their computational courses, and where every NASA scientist and engineer facing a computational challenge (without having resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back their solutions and ideas.

  17. Adsorption and self-assembly of bio-organic molecules at model surfaces: A route towards increased complexity

    NASA Astrophysics Data System (ADS)

    Costa, Dominique; Pradier, Claire-Marie; Tielens, Frederik; Savio, Letizia

    2015-12-01

    Understanding the bio-physical-chemical interactions at nanostructured biointerfaces and the assembly mechanisms of so-called hybrid nano-composites is nowadays a key issue for nanoscience in view of the many possible applications foreseen. The contribution of surface science in this field is noteworthy since, using a bottom-up approach, it allows the investigation of the fundamental processes at the basis of complex interfacial phenomena and thus helps to unravel the elementary mechanisms governing them. Nowadays it is well demonstrated that a wide variety of different molecular assemblies can form upon adsorption of small biomolecules at surfaces. The geometry of such self-organized structures can often be tuned by careful control of the experimental conditions during the deposition process. Indeed an impressive number of studies exists (both experimental and - to a lesser extent - theoretical) which demonstrates the ability of molecular self-assembly to create different structural motifs in a more or less predictable manner, by tuning the molecular building blocks as well as the metallic substrate. In this framework, amino acids and small peptides at surfaces are key model systems to study. The structure of amino acids is simple enough to serve as a model for the chemisorption of biofunctional molecules, but their adsorption at surfaces has applications in surface functionalization, enantiospecific catalysis, biosensing, shape control of nanoparticles, and emerging fields such as "green" corrosion inhibition. In this paper we review the most recent advances in this field. We shall start from the adsorption of amino acids at metal surfaces and will then move in the direction of more complex systems, in the light of the latest improvements of surface science techniques and of computational methods. On one side, we will focus on amino acid adsorption at oxide surfaces; on the other, on peptide adsorption at both metal and oxide substrates. Particular attention will be paid to the added value provided by the combination of several experimental surface science techniques and to the precious contribution of advanced complementary computational methods in resolving the details of systems of increased complexity. Finally, some hints on experiments performed in the presence of water and then characterized in UHV, and on the related theoretical work, will be presented. This is a further step towards a better approximation of real biological systems. However, since the methods employed are often not typical of surface science, this topic is not developed in detail.

  18. Creating Science Simulations through Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Basawapatna, Ashok Ram

    2012-01-01

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…

  19. 77 FR 65417 - Proposal Review Panel for Computing Communication Foundations; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-26

    ...: To assess the progress of the EIC Award, ``Collaborative Research: Computational Behavioral Science... NATIONAL SCIENCE FOUNDATION Proposal Review Panel for Computing Communication Foundations; Notice... National Science Foundation announces the following meeting: Name: Site Visit, Proposal Panel Review for...

  20. Theoretical Definition of Instructor Role in Computer-Managed Instruction.

    ERIC Educational Resources Information Center

    McCombs, Barbara L.; Dobrovolny, Jacqueline L.

    This report describes the results of a theoretical analysis of the ideal role functions of the Computer Managed Instruction (CMI) instructor. Concepts relevant to instructor behavior are synthesized from both cognitive and operant learning theory perspectives, and the roles allocated to instructors by seven large-scale operational CMI systems are…
