Cognitive Correlates of Performance in Algorithms in a Computer Science Course for High School
ERIC Educational Resources Information Center
Avancena, Aimee Theresa; Nishihara, Akinori
2014-01-01
Computer science for high school faces many challenging issues. One of these is whether the students possess the appropriate cognitive ability for learning the fundamentals of computer science. Online tests were created based on known cognitive factors and fundamental algorithms and were implemented among the second grade students in the…
NASA Technical Reports Server (NTRS)
Smith, Paul H.
1988-01-01
The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.
Compact Information Representations
2016-08-02
applied computer science, and applied math. Within the scope of this proposal, the focus is preliminarily on the fundamental, theoretical research which lies in... Science & Technology • Tung-Lung Wu, now Assistant Professor, Dept. of Math and Stat, Mississippi State Univ ... 2 Papers: In this section, we list the papers...
Creating Science Simulations through Computational Thinking Patterns
ERIC Educational Resources Information Center
Basawapatna, Ashok Ram
2012-01-01
Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…
Quantum Sensors at the Intersections of Fundamental Science, Quantum Information Science & Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chattopadhyay, Swapan; Falcone, Roger; Walsworth, Ronald
Over the last twenty years, there has been a boom in quantum science - i.e., the development and exploitation of quantum systems to enable qualitatively and quantitatively new capabilities, with high-impact applications and fundamental insights that can range across all areas of science and technology.
Toward using games to teach fundamental computer science concepts
NASA Astrophysics Data System (ADS)
Edgington, Jeffrey Michael
Video and computer games have become an important area of study in the field of education. Games have been designed to teach mathematics, physics, raise social awareness, teach history and geography, and train soldiers in the military. Recent work has created computer games for teaching computer programming and understanding basic algorithms. We present an investigation where computer games are used to teach two fundamental computer science concepts: boolean expressions and recursion. The games are intended to teach the concepts and not how to implement them in a programming language. For this investigation, two computer games were created. One is designed to teach basic boolean expressions and operators and the other to teach fundamental concepts of recursion. We describe the design and implementation of both games. We evaluate the effectiveness of these games using before and after surveys. The surveys were designed to ascertain basic understanding, attitudes and beliefs regarding the concepts. The boolean game was evaluated with local high school students and students in a college level introductory computer science course. The recursion game was evaluated with students in a college level introductory computer science course. We present the analysis of the collected survey information for both games. This analysis shows a significant positive change in student attitude towards recursion and modest gains in student learning outcomes for both topics.
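For readers outside computer science, the two target concepts can be illustrated in a few lines of code. The sketch below is not taken from either game; it is a minimal Python illustration of a boolean expression and a recursive function of the kind the games are meant to teach.

# A boolean expression combines comparisons with operators such as and, or, not.
def can_ride(height_cm, age):
    return height_cm >= 120 and (age >= 8 or height_cm >= 140)

# Recursion: a function defined in terms of a smaller instance of itself,
# with a base case that stops the self-reference.
def factorial(n):
    if n == 0:                       # base case
        return 1
    return n * factorial(n - 1)      # recursive case

print(can_ride(130, 7))   # False: the height clause passes, but the age/height alternative fails
print(factorial(5))       # 120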
Evolution of an Intelligent Deductive Logic Tutor Using Data-Driven Elements
ERIC Educational Resources Information Center
Mostafavi, Behrooz; Barnes, Tiffany
2017-01-01
Deductive logic is essential to a complete understanding of computer science concepts, and is thus fundamental to computer science education. Intelligent tutoring systems with individualized instruction have been shown to increase learning gains. We seek to improve the way deductive logic is taught in computer science by developing an intelligent,…
NASA Astrophysics Data System (ADS)
Mayne, Richard
2015-03-01
Slime mould computing is an inherently multi-disciplinary subfield of unconventional computing that draws upon aspects of not only theoretical computer science and electronics, but also the natural sciences. This chapter focuses on the biology of slime moulds and expounds the viewpoint that a deep, intuitive understanding of slime mould life processes is a fundamental requirement for understanding -- and, hence, harnessing -- the incredible behaviour patterns we may characterise as "computation"...
Argonne Chemical Sciences & Engineering - Awards Home
Learning technologies and the cyber-science classroom
NASA Astrophysics Data System (ADS)
Houlihan, Gerard
Access to computer and communication technology has long been regarded as `part-and-parcel' of a good education. No educator can afford to ignore the profound impact of learning technologies on the way we teach science, nor fail to acknowledge that information literacy and computing skills will be fundamental to the practice of science in the next millennium. Nevertheless, there is still confusion concerning what technologies educators should employ in teaching science. Furthermore, a lack of knowledge combined with the pressures to be `seen' utilizing technology has led some schools to waste scarce resources in a `grab-bag' attitude towards computers and technology. Such popularized `wish lists' can only drive schools to accumulate expensive equipment for no real learning purpose. In the future, educators will have to reconsider their curriculum and pedagogy with a focus on the learning environment before determining which computing resources are appropriate to acquire. This will be fundamental to the capabilities of science classrooms to engage with cutting-edge issues in science. This session will demonstrate the power of a broad range of learning technologies to enhance science education. The aim is to explore classroom possibilities as well as to provide a basic introduction to technical aspects of various software and hardware applications, including robotics, dataloggers and simulation software.
ERIC Educational Resources Information Center
Fofonoff, N. P.; Millard, R. C., Jr.
Algorithms for computation of fundamental properties of seawater, based on the Practical Salinity Scale 1978 (PSS-78) and the International Equation of State of Seawater 1980 (EOS-80), are compiled in the present report for implementing and standardizing computer programs for oceanographic data processing. Sample FORTRAN subprograms and tables are given…
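As an illustration of the kind of routine the report standardizes, the Python sketch below evaluates the PSS-78 practical salinity polynomial from a conductivity ratio at atmospheric pressure. The coefficients are the standard published PSS-78 values, but this is an unofficial transcription for illustration only; the report's FORTRAN subprograms remain the reference, and the pressure correction to the conductivity ratio is omitted here.

# Standard PSS-78 coefficients (transcribed; verify against the UNESCO report before use)
A = (0.0080, -0.1692, 25.3851, 14.0941, -7.0261, 2.7081)
B = (0.0005, -0.0056, -0.0066, -0.0375, 0.0636, -0.0144)
K = 0.0162

def practical_salinity(rt, t68):
    """Practical salinity from conductivity ratio rt at temperature t68 (IPTS-68, deg C);
    pressure correction omitted, so this applies to surface samples only."""
    s = sum(a * rt ** (i / 2.0) for i, a in enumerate(A))
    ds = (t68 - 15.0) / (1.0 + K * (t68 - 15.0)) * \
         sum(b * rt ** (i / 2.0) for i, b in enumerate(B))
    return s + ds

# Check value from the PSS-78 definition: rt = 1 at t = 15 C should give S = 35.
print(practical_salinity(1.0, 15.0))   # 35.0000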
ERIC Educational Resources Information Center
Avancena, Aimee Theresa; Nishihara, Akinori; Vergara, John Paul
2012-01-01
This paper presents the online cognitive and algorithm tests, which were developed in order to determine if certain cognitive factors and fundamental algorithms correlate with the performance of students in their introductory computer science course. The tests were implemented among Management Information Systems majors from the Philippines and…
ERIC Educational Resources Information Center
Psycharis, Sarantos
2016-01-01
The computational experiment approach considers models as the fundamental instructional units of Inquiry Based Science and Mathematics Education (IBSE) and STEM Education, where the model takes the place of the "classical" experimental set-up and simulation replaces the experiment. Argumentation in IBSE and STEM education is related to the…
ERIC Educational Resources Information Center
Ernst, Jeremy V.; Clark, Aaron C.
2012-01-01
In 2009, the North Carolina Virtual Public Schools worked with researchers at the William and Ida Friday Institute to produce and evaluate the use of game creation by secondary students as a means for learning content related to career awareness in Science, Technology, Engineering and Mathematics (STEM) disciplines, with particular emphasis in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikolic, R J
This month's issue has the following articles: (1) Dawn of a New Era of Scientific Discovery - Commentary by Edward I. Moses; (2) At the Frontiers of Fundamental Science Research - Collaborators from national laboratories, universities, and international organizations are using the National Ignition Facility to probe key fundamental science questions; (3) Livermore Responds to Crisis in Post-Earthquake Japan - More than 70 Laboratory scientists provided round-the-clock expertise in radionuclide analysis and atmospheric dispersion modeling as part of the nation's support to Japan following the March 2011 earthquake and nuclear accident; (4) A Comprehensive Resource for Modeling, Simulation, and Experiments - A new Web-based resource called MIDAS is a central repository for material properties, experimental data, and computer models; and (5) Finding Data Needles in Gigabit Haystacks - Livermore computer scientists have developed a novel computer architecture based on 'persistent' memory to ease data-intensive computations.
Computational Thinking: A Digital Age Skill for Everyone
ERIC Educational Resources Information Center
Barr, David; Harrison, John; Conery, Leslie
2011-01-01
In a seminal article published in 2006, Jeannette Wing described computational thinking (CT) as a way of "solving problems, designing systems, and understanding human behavior by drawing on the concepts fundamental to computer science." Wing's article gave rise to an often controversial discussion and debate among computer scientists,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cramer, Christopher J.
Charge transfer and charge transport in photoactivated systems are fundamental processes that underlie solar energy capture, solar energy conversion, and photoactivated catalysis, both organometallic and enzymatic. We developed methods, algorithms, and software tools needed for reliable treatment of the underlying physics for charge transfer and charge transport, an undertaking with broad applicability to the goals of the fundamental-interaction component of the Department of Energy Office of Basic Energy Sciences and the exascale initiative of the Office of Advanced Scientific Computing Research.
Philip A. Loring; F. Stuart Chapin; S. Craig Gerlach
2008-01-01
Computational thinking (CT) is a way to solve problems and understand complex systems that draws on concepts fundamental to computer science and is well suited to the challenges that face researchers of complex, linked social-ecological systems. This paper explores CT's usefulness to sustainability science through the application of the services-oriented...
Students' Misconceptions about Medium-Scale Integrated Circuits
ERIC Educational Resources Information Center
Herman, G. L.; Loui, M. C.; Zilles, C.
2011-01-01
To improve instruction in computer engineering and computer science, instructors must better understand how their students learn. Unfortunately, little is known about how students learn the fundamental concepts in computing. To investigate student conceptions and misconceptions about digital logic concepts, the authors conducted a qualitative…
Toward Using Games to Teach Fundamental Computer Science Concepts
ERIC Educational Resources Information Center
Edgington, Jeffrey Michael
2010-01-01
Video and computer games have become an important area of study in the field of education. Games have been designed to teach mathematics, physics, raise social awareness, teach history and geography, and train soldiers in the military. Recent work has created computer games for teaching computer programming and understanding basic algorithms. …
Tadmor, Brigitta; Tidor, Bruce
2005-09-01
Progress in the life sciences, including genome sequencing and high-throughput experimentation, offers an opportunity for understanding biology and medicine from a systems perspective. This 'new view', which complements the more traditional component-based approach, involves the integration of biological research with approaches from engineering disciplines and computer science. The result is more than a new set of technologies. Rather, it promises a fundamental reconceptualization of the life sciences based on the development of quantitative and predictive models to describe crucial processes. To achieve this change, learning communities are being formed at the interface of the life sciences, engineering and computer science. Through these communities, research and education will be integrated across disciplines and the challenges associated with multidisciplinary team-based science will be addressed.
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
Design Science in Human-Computer Interaction: A Model and Three Examples
ERIC Educational Resources Information Center
Prestopnik, Nathan R.
2013-01-01
Humanity has entered an era where computing technology is virtually ubiquitous. From websites and mobile devices to computers embedded in appliances on our kitchen counters and automobiles parked in our driveways, information and communication technologies (ICTs) and IT artifacts are fundamentally changing the ways we interact with our world.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Wooley; Herbert S. Lin
This study is the first comprehensive NRC study that suggests a high-level intellectual structure for Federal agencies for supporting work at the biology/computing interface. The report seeks to establish the intellectual legitimacy of a fundamentally cross-disciplinary collaboration between biologists and computer scientists. That is, while some universities are increasingly favorable to research at the intersection, life science researchers at other universities are strongly impeded in their efforts to collaborate. This report addresses these impediments and describes proven strategies for overcoming them. An important feature of the report is the use of well-documented examples that describe clearly to individuals not trained in computer science the value and usage of computing across the biological sciences, from genes and proteins to networks and pathways, from organelles to cells, and from individual organisms to populations and ecosystems. It is hoped that these examples will be useful to students in the life sciences to motivate (continued) study in computer science that will enable them to be more facile users of computing in their future biological studies.
2015 Army Science Planning and Strategy Meeting Series: Outcomes and Conclusions
2017-12-21
modeling and nanoscale characterization tools to enable efficient design of hybridized manufacturing; real-time, multiscale computational capability...to enable predictive analytics for expeditionary on-demand manufacturing • Discovery of design principles to enable programming advanced genetic...goals, significant research is needed to mature the fundamental materials science, processing and manufacturing sciences, design methodologies, data
The Concept of Nondeterminism: Its Development and Implications for Teaching
ERIC Educational Resources Information Center
Armoni, Michal; Ben-Ari, Mordechai
2009-01-01
Nondeterminism is a fundamental concept in computer science that appears in various contexts such as automata theory, algorithms and concurrent computation. We present a taxonomy of the different ways that nondeterminism can be defined and used; the categories of the taxonomy are domain, nature, implementation, consistency, execution and…
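Nondeterminism in the automata-theory sense can be made concrete with a few lines of code. The sketch below is our own illustration, not material from the paper: it checks whether a nondeterministic finite automaton accepts a string by tracking the set of all states reachable at each step, i.e. by simulating every possible "choice" at once.

# NFA over {'0', '1'} that accepts strings ending in '01'.
# delta maps (state, symbol) -> set of possible next states.
delta = {
    ('q0', '0'): {'q0', 'q1'},
    ('q0', '1'): {'q0'},
    ('q1', '1'): {'q2'},
}
start, accepting = 'q0', {'q2'}

def nfa_accepts(word):
    states = {start}                       # all states the NFA could currently be in
    for symbol in word:
        states = set().union(*(delta.get((s, symbol), set()) for s in states))
    return bool(states & accepting)        # accept if any branch reached an accepting state

print(nfa_accepts('11001'))   # True
print(nfa_accepts('0110'))    # False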
Information technology challenges of biodiversity and ecosystems informatics
Schnase, J.L.; Cushing, J.; Frame, M.; Frondorf, A.; Landis, E.; Maier, D.; Silberschatz, A.
2003-01-01
Computer scientists, biologists, and natural resource managers recently met to examine the prospects for advancing computer science and information technology research by focusing on the complex and often-unique challenges found in the biodiversity and ecosystem domain. The workshop and its final report reveal that the biodiversity and ecosystem sciences are fundamentally information sciences and often address problems having distinctive attributes of scale and socio-technical complexity. The paper provides an overview of the emerging field of biodiversity and ecosystem informatics and demonstrates how the demands of biodiversity and ecosystem research can advance our understanding and use of information technologies.
Visualization and Interactivity in the Teaching of Chemistry to Science and Non-Science Students
ERIC Educational Resources Information Center
Venkataraman, Bhawani
2009-01-01
A series of interactive, instructional units have been developed that integrate computational molecular modelling and visualization to teach fundamental chemistry concepts and the relationship between the molecular and macro-scales. The units span the scale from atoms, small molecules to macromolecular systems, and introduce many of the concepts…
Computational Exposure Science: An Emerging Discipline to ...
Background: Computational exposure science represents a frontier of environmental science that is emerging and quickly evolving. Objectives: In this commentary, we define this burgeoning discipline, describe a framework for implementation, and review some key ongoing research elements that are advancing the science with respect to exposure to chemicals in consumer products. Discussion: The fundamental elements of computational exposure science include the development of reliable, computationally efficient predictive exposure models; the identification, acquisition, and application of data to support and evaluate these models; and generation of improved methods for extrapolating across chemicals. We describe our efforts in each of these areas and provide examples that demonstrate both progress and potential. Conclusions: Computational exposure science, linked with comparable efforts in toxicology, is ushering in a new era of risk assessment that greatly expands our ability to evaluate chemical safety and sustainability and to protect public health. The National Exposure Research Laboratory’s (NERL’s) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA’s mission to protect human health and the environment. HEASD’s research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA’s strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source
More Than the Rules of Precedence
ERIC Educational Resources Information Center
Liang, Yawei
2005-01-01
In a fundamental computer-programming course, such as CSE101, questions about how to evaluate an arithmetic expression are frequently used to check if our students know the rules of precedence. The author uses two of our final examination questions to show that more knowledge of computer science is needed to answer them correctly. Furthermore,…
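The point that precedence alone is not the whole story can be made with a few expressions: associativity, unary operators, and integer division also decide the result. The hypothetical Python examples below are our illustration, not the author's examination questions.

# Precedence: ** binds tighter than unary minus; * and / bind tighter than + and -.
print(2 + 3 * 4)      # 14, not 20: multiplication happens first
print(-2 ** 2)        # -4: exponentiation binds tighter than unary minus
print(2 ** 3 ** 2)    # 512: ** is right-associative, so 2 ** (3 ** 2)
print(10 - 4 - 3)     # 3: - is left-associative, so (10 - 4) - 3
print(7 // 2 * 2)     # 6: integer division truncates before the multiplication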
Integrating Computational Thinking into Technology and Engineering Education
ERIC Educational Resources Information Center
Hacker, Michael
2018-01-01
Computational Thinking (CT) is being promoted as "a fundamental skill used by everyone in the world by the middle of the 21st Century" (Wing, 2006). CT has been effectively integrated into history, ELA, mathematics, art, and science courses (Settle, et al., 2012). However, there has been no analogous effort to integrate CT into…
A Project-Based Learning Setting to Human-Computer Interaction for Teenagers
ERIC Educational Resources Information Center
Geyer, Cornelia; Geisler, Stefan
2012-01-01
Knowledge of the fundamentals of human-computer interaction and usability engineering is becoming more and more important in technical domains. However, this interdisciplinary field of work and the corresponding degree programs are not broadly known. Therefore, at the Hochschule Ruhr West, University of Applied Sciences, a program was developed to give…
Biomedical wellness challenges and opportunities
NASA Astrophysics Data System (ADS)
Tangney, John F.
2012-06-01
The mission of ONR's Human and Bioengineered Systems Division is to direct, plan, foster, and encourage Science and Technology in cognitive science, computational neuroscience, bioscience and bio-mimetic technology, social/organizational science, training, human factors, and decision making as related to future Naval needs. This paper highlights current programs that contribute to future biomedical wellness needs in context of humanitarian assistance and disaster relief. ONR supports fundamental research and related technology demonstrations in several related areas, including biometrics and human activity recognition; cognitive sciences; computational neurosciences and bio-robotics; human factors, organizational design and decision research; social, cultural and behavioral modeling; and training, education and human performance. In context of a possible future with automated casualty evacuation, elements of current science and technology programs are illustrated.
Fundamentals and Recent Developments in Approximate Bayesian Computation
Lintusaari, Jarno; Gutmann, Michael U.; Dutta, Ritabrata; Kaski, Samuel; Corander, Jukka
2017-01-01
Bayesian inference plays an important role in phylogenetics, evolutionary biology, and in many other branches of science. It provides a principled framework for dealing with uncertainty and quantifying how it changes in the light of new evidence. For many complex models and inference problems, however, only approximate quantitative answers are obtainable. Approximate Bayesian computation (ABC) refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible. We explain here the fundamentals of ABC, review the classical algorithms, and highlight recent developments. [ABC; approximate Bayesian computation; Bayesian inference; likelihood-free inference; phylogenetics; simulator-based models; stochastic simulation models; tree-based models.] PMID:28175922
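The "minimal set of assumptions" the abstract refers to is visible in the basic rejection-ABC algorithm: draw a parameter from the prior, simulate data from the model, and keep the parameter only if the simulated data are close enough to the observations. The sketch below is a generic illustration of that classical algorithm on a toy Gaussian model, not code from the review.

import random
import statistics

observed = [4.9, 5.2, 4.7, 5.1, 5.0]           # toy data, assumed known
obs_mean = statistics.mean(observed)

def simulate(mu, n=len(observed)):
    """Simulator: the only access to the model is the ability to sample from it."""
    return [random.gauss(mu, 1.0) for _ in range(n)]

def rejection_abc(n_samples=100_000, epsilon=0.1):
    accepted = []
    for _ in range(n_samples):
        mu = random.uniform(0.0, 10.0)          # draw a candidate from the prior
        sim = simulate(mu)                      # simulate data under that candidate
        if abs(statistics.mean(sim) - obs_mean) < epsilon:   # summary-statistic distance
            accepted.append(mu)                 # keep mu as an approximate posterior sample
    return accepted

posterior = rejection_abc()
print(len(posterior), statistics.mean(posterior))   # posterior mean should land near 5.0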
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potok, Thomas; Schuman, Catherine; Patton, Robert
The White House and Department of Energy have been instrumental in driving the development of a neuromorphic computing program to help the United States continue its lead in basic research into (1) Beyond Exascale—high performance computing beyond Moore’s Law and von Neumann architectures, (2) Scientific Discovery—new paradigms for understanding increasingly large and complex scientific data, and (3) Emerging Architectures—assessing the potential of neuromorphic and quantum architectures. Neuromorphic computing spans a broad range of scientific disciplines from materials science to devices, to computer science, to neuroscience, all of which are required to solve the neuromorphic computing grand challenge. In our workshop we focus on the computer science aspects, specifically from a neuromorphic device through an application. Neuromorphic devices present a very different paradigm to the computer science community from traditional von Neumann architectures, which raises six major questions about building a neuromorphic application from the device level. We used these fundamental questions to organize the workshop program and to direct the workshop panels and discussions. From the white papers, presentations, panels, and discussions, there emerged several recommendations on how to proceed.
Unconscious Bias in the Classroom: Evidence and Opportunities, 2017
ERIC Educational Resources Information Center
Dee, T.; Gershenson, S.
2017-01-01
The underrepresentation of women and racial and ethnic minorities in computer science (CS) and other fields of science, technology, engineering, and math (STEM) is a serious impediment to technological innovation as well as an affront to fundamental notions of fairness and equity. These gaps emerge in the early grades and tend to persist, if not…
ERIC Educational Resources Information Center
Fairer-Wessels, Felicite A.
Within the South African tertiary education context, information management is taught from a variety of perspectives, including computer science, business management, informatics, and library and information science. Each discipline has a particular multidisciplinary focus dealing with its fundamentals. To investigate information management…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arkin, Adam; Bader, David C.; Coffey, Richard
Understanding the fundamentals of genomic systems or the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER’s computing needs and their importance for solving some of the toughest problems in BER’s portfolio. BER’s impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes onto high-performance computers. See http://exascaleage.org/ber/ for more information.
Information processing, computation, and cognition.
Piccinini, Gualtiero; Scarantino, Andrea
2011-01-01
Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.
Social Significance of Fundamental Science Common to all Mankind
NASA Astrophysics Data System (ADS)
Zel'Dovich, Ya. B.
Science faces the challenge of playing a great role in solving the problem of meeting material and spiritual human demands. The argument that science has become a productive force is well known. When characterizing the economy of one country or region or another, it is common practice to speak of science-intensive industries, i.e., those where production and competitiveness are directly related to the level of science. Science-intensive industries include, for example, the production of microelectronic circuits and their application in computer and information science, or the production of pharmaceutical preparations using genetic engineering. This list could be continued indefinitely…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-03-01
Abstracts of papers published during the previous calendar year, arranged in accordance with the project titles used in the USDOE Schedule 189 Budget Proposals, are presented. The collection of abstracts supplements the listing of papers published in the Schedule 189. The following subject areas are represented: high-energy physics; nuclear physics; basic energy sciences (nuclear science, materials sciences, solid state physics, materials chemistry); molecular, mathematical, and earth sciences (fundamental interactions, processes and techniques, mathematical and computer sciences); environmental research and development; physical and technological studies (characterization, measurement and monitoring); and nuclear research and applications.
NASA Astrophysics Data System (ADS)
Clay, Alexis; Delord, Elric; Couture, Nadine; Domenger, Gaël
We describe the joint research that we conduct in gesture-based emotion recognition and virtual augmentation of a stage, bridging the fields of computer science and dance. After establishing a common ground for dialogue, we could conduct a research process that equally benefits both fields. For us as computer scientists, dance is a perfect application case: dancers' artistic creativity orients our research choices. For us as dancers, computer science provides new tools for creativity and, more importantly, a new point of view that forces us to reconsider dance from its fundamentals. In this paper we hence describe our scientific work and its implications for dance. We provide an overview of our system to augment a ballet stage, taking a dancer's emotion into account. To illustrate our work in both fields, we describe three events that mixed dance, emotion recognition and augmented reality.
NASA Astrophysics Data System (ADS)
Cirac, J. Ignacio; Kimble, H. Jeff
2017-01-01
Quantum optics is a well-established field that spans from fundamental physics to quantum information science. In the coming decade, areas including computation, communication and metrology are all likely to experience scientific and technological advances supported by this far-reaching research field.
Computational sciences in the upstream oil and gas industry
Halsey, Thomas C.
2016-01-01
The predominant technical challenge of the upstream oil and gas industry has always been the fundamental uncertainty of the subsurface from which it produces hydrocarbon fluids. The subsurface can be detected remotely by, for example, seismic waves, or it can be penetrated and studied in the extremely limited vicinity of wells. Inevitably, a great deal of uncertainty remains. Computational sciences have been a key avenue to reduce and manage this uncertainty. In this review, we discuss at a relatively non-technical level the current state of three applications of computational sciences in the industry. The first of these is seismic imaging, which is currently being revolutionized by the emergence of full wavefield inversion, enabled by algorithmic advances and petascale computing. The second is reservoir simulation, also being advanced through the use of modern highly parallel computing architectures. Finally, we comment on the role of data analytics in the upstream industry. This article is part of the themed issue ‘Energy and the subsurface’. PMID:27597785
Biomaterial science meets computational biology.
Hutmacher, Dietmar W; Little, J Paige; Pettet, Graeme J; Loessner, Daniela
2015-05-01
There is a pressing need for a predictive tool capable of revealing a holistic understanding of fundamental elements in the normal and pathological cell physiology of organoids in order to decipher the mechanoresponse of cells. Therefore, the integration of a systems bioengineering approach into a validated mathematical model is necessary to develop a new simulation tool. This tool can only be innovative by combining biomaterials science with computational biology. Systems-level and multi-scale experimental data are incorporated into a single framework, thus representing both single cells and collective cell behaviour. Such a computational platform needs to be validated in order to discover key mechano-biological factors associated with cell-cell and cell-niche interactions.
Integrating Intelligent Systems Domain Knowledge Into the Earth Science Curricula
NASA Astrophysics Data System (ADS)
Güereque, M.; Pennington, D. D.; Pierce, S. A.
2017-12-01
High-volume heterogeneous datasets have become ubiquitous, moving to center stage over the last ten years and transcending the boundaries of computationally intensive disciplines into the mainstream to become a fundamental part of every science discipline. Despite the fact that large datasets are now pervasive across industries and academic disciplines, the skills needed to work with them are generally absent from earth science programs. This has left the bulk of the student population without access to curricula that systematically teach appropriate intelligent-systems skills, creating a void for skill sets that should be universal given their need and marketability. While some guidance regarding appropriate computational thinking and pedagogy is appearing, there are few examples that have been specifically designed and tested within the earth science domain. Furthermore, best practices from learning science have not yet been widely tested for developing intelligent-systems-thinking skills. This research developed and tested evidence-based computational skill modules that target this deficit, with the intention of informing the earth science community as it continues to incorporate intelligent systems techniques and reasoning into its research and classrooms.
Regime, phase and paradigm shifts: making community ecology the basic science for fisheries
Mangel, Marc; Levin, Phillip S.
2005-01-01
Modern fishery science, which began in 1957 with Beverton and Holt, is ca. 50 years old. At its inception, fishery science was limited by a nineteenth century mechanistic worldview and by computational technology; thus, the relatively simple equations of population ecology became the fundamental ecological science underlying fisheries. The time has come for this to change and for community ecology to become the fundamental ecological science underlying fisheries. This point will be illustrated with two examples. First, when viewed from a community perspective, excess production must be considered in the context of biomass left for predators. We argue that this is a better measure of the effects of fisheries than spawning biomass per recruit. Second, we shall analyse a simple, but still multi-species, model for fishery management that considers the alternatives of harvest regulations, inshore marine protected areas and offshore marine protected areas. Population or community perspectives lead to very different predictions about the efficacy of reserves. PMID:15713590
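To make the community-level point concrete, here is a minimal two-species sketch in Python: a prey (target) stock supporting a predator, with fishing applied to the prey. It is our own toy Lotka-Volterra-style illustration under assumed parameter values, not the model analysed in the paper, but it shows how the biomass left for predators responds to harvest rate in a way a single-species surplus-production view would miss.

def simulate(harvest_rate, years=200, dt=0.01):
    """Toy prey-predator dynamics with a constant harvest rate on the prey."""
    prey, pred = 1.0, 0.5                       # assumed initial biomasses
    r, K, a, e, m = 1.0, 2.0, 1.0, 0.5, 0.2     # assumed growth, capacity, attack, efficiency, mortality
    for _ in range(int(years / dt)):
        dprey = r * prey * (1 - prey / K) - a * prey * pred - harvest_rate * prey
        dpred = e * a * prey * pred - m * pred
        prey += dprey * dt
        pred += dpred * dt
    return prey, pred

for h in (0.0, 0.2, 0.4, 0.6):
    prey, pred = simulate(h)
    print(f"harvest={h:.1f}  prey biomass={prey:.2f}  predator biomass={pred:.2f}")
# The prey stock settles near the same level across harvest rates, while the
# predator biomass falls steadily: the fishery's footprint shows up in the community.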
Sandia National Laboratories: National Security Missions: Nuclear Weapons
…, in which fundamental science, computer models, and unique experimental facilities come together so
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spangler, Lee; Cunningham, Alfred; Lageson, David
2011-03-31
ZERT has made major contributions to five main areas of sequestration science: improvement of computational tools; measurement and monitoring techniques to verify storage and track migration of CO₂; development of a comprehensive performance and risk assessment framework; fundamental geophysical, geochemical and hydrological investigations of CO₂ storage; and investigation of innovative, bio-based mitigation strategies.
Algorithmics - Is There Hope for a Unified Theory?
NASA Astrophysics Data System (ADS)
Hromkovič, Juraj
Computer science was born with the formal definition of the notion of an algorithm. This definition provides clear limits of automatization, separating problems into algorithmically solvable problems and algorithmically unsolvable ones. The second big bang of computer science was the development of the concept of computational complexity. People recognized that problems that do not admit efficient algorithms are not solvable in practice. The search for a reasonable, clear and robust definition of the class of practically solvable algorithmic tasks started with the notion of the class P and of NP-completeness. In spite of the fact that this robust concept is still fundamental for judging the hardness of computational problems, a variety of approaches was developed for solving instances of NP-hard problems in many applications. Our short, 40-year attempt to fix the fuzzy border between the practically solvable problems and the practically unsolvable ones is partly reminiscent of the never-ending search for the definition of "life" in biology or for the definitions of matter and energy in physics. Can the search for the formal notion of "practical solvability" also become a never-ending story, or is there hope for getting a well-accepted, robust definition of it? Hopefully, it is not surprising that we are not able to answer this question in this invited talk. But dealing with this question is of crucial importance, because only through enormous effort do scientists get a better and better feeling for what the fundamental notions of science like life and energy mean. In the flow of numerous technical results, we must not forget the fact that most of the essential revolutionary contributions to science were done by defining new concepts and notions.
Density functional theory in materials science.
Neugebauer, Jörg; Hickel, Tilmann
2013-09-01
Materials science is a highly interdisciplinary field. It is devoted to the understanding of the relationship between (a) fundamental physical and chemical properties governing processes at the atomistic scale with (b) typically macroscopic properties required of materials in engineering applications. For many materials, this relationship is not only determined by chemical composition, but strongly governed by microstructure. The latter is a consequence of carefully selected process conditions (e.g., mechanical forming and annealing in metallurgy or epitaxial growth in semiconductor technology). A key task of computational materials science is to unravel the often hidden composition-structure-property relationships using computational techniques. The present paper does not aim to give a complete review of all aspects of materials science. Rather, we will present the key concepts underlying the computation of selected material properties and discuss the major classes of materials to which they are applied. Specifically, our focus will be on methods used to describe single or polycrystalline bulk materials of semiconductor, metal or ceramic form.
Long live the Data Scientist, but can he/she persist?
NASA Astrophysics Data System (ADS)
Wyborn, L. A.
2011-12-01
In recent years the fourth paradigm of data-intensive science has slowly taken hold as the increased capacity of instruments and an increasing number of instruments (in particular sensor networks) have changed how fundamental research is undertaken. Most modern scientific research is about digital capture of data directly from instruments, processing it by computers, storing the results on computers and only publishing a small fraction of the data in hard copy publications. At the same time, the rapid increase in capacity of supercomputers, particularly at petascale, means that far larger data sets can be analysed, and at greater resolution, than previously possible. The new cloud computing paradigm, which allows distributed data, software and compute resources to be linked by seamless workflows, is creating new opportunities for processing high volumes of data for an increasingly large number of researchers. However, to take full advantage of these compute resources, data sets for analysis have to be aggregated from multiple sources to create high performance data sets. These new technology developments require that scientists become more skilled in data management and/or have a higher degree of computer literacy. In almost every science discipline there is now an X-informatics branch and a computational X branch (e.g., Geoinformatics and Computational Geoscience): both require a new breed of researcher with skills in the science fundamentals and knowledge of some ICT aspects (computer programming, database design and development, data curation, software engineering). People who can operate in both science and ICT are increasingly known as 'data scientists'. Data scientists are a critical element of many large scale earth and space science informatics projects, particularly those that are tackling current grand challenges at an international level on issues such as climate change, hazard prediction and sustainable development of our natural resources. These projects by their very nature require the integration of multiple digital data sets from multiple sources. Often the preparation of the data for computational analysis can take months and requires painstaking attention to detail to ensure that anomalies identified are real and are not just artefacts of the data preparation and/or the computational analysis. Although data scientists are increasingly vital to successful data-intensive earth and space science projects, unless they are recognised for their capabilities in both the science and the computational domains they are likely to migrate to either a science role or an ICT role as their career advances. Most reward and recognition systems do not recognise those with skills in both; hence, getting trained data scientists to persist beyond one or two projects can be a challenge. Those data scientists who persist in the profession are characteristically committed and enthusiastic people who have the support of their organisations to take on this role. They also tend to be people who share developments and are critical to the success of the open source software movement. However, the fact remains that survival of the data scientist as a species is being threatened unless something is done to recognise their invaluable contributions to the new fourth paradigm of science.
Real science at the petascale.
Saksena, Radhika S; Boghosian, Bruce; Fazendeiro, Luis; Kenway, Owain A; Manos, Steven; Mazzeo, Marco D; Sadiq, S Kashif; Suter, James L; Wright, David; Coveney, Peter V
2009-06-28
We describe computational science research that uses petascale resources to achieve scientific results at unprecedented scales and resolution. The applications span a wide range of domains, from investigation of fundamental problems in turbulence through computational materials science research to biomedical applications at the forefront of HIV/AIDS research and cerebrovascular haemodynamics. This work was mainly performed on the US TeraGrid 'petascale' resource, Ranger, at Texas Advanced Computing Center, in the first half of 2008 when it was the largest computing system in the world available for open scientific research. We have sought to use this petascale supercomputer optimally across application domains and scales, exploiting the excellent parallel scaling performance found on up to at least 32,768 cores for certain of our codes in the so-called 'capability computing' category as well as high-throughput intermediate-scale jobs for ensemble simulations in the 32-512 core range. Furthermore, this activity provides evidence that conventional parallel programming with MPI should be successful at the petascale in the short to medium term. We also report on the parallel performance of some of our codes on up to 65,536 cores on the IBM Blue Gene/P system at the Argonne Leadership Computing Facility, which has recently been named the fastest supercomputer in the world for open science.
Materials science. Materials that couple sensing, actuation, computation, and communication.
McEvoy, M A; Correll, N
2015-03-20
Tightly integrating sensing, actuation, and computation into composites could enable a new generation of truly smart material systems that can change their appearance and shape autonomously. Applications for such materials include airfoils that change their aerodynamic profile, vehicles with camouflage abilities, bridges that detect and repair damage, or robotic skins and prosthetics with a realistic sense of touch. Although integrating sensors and actuators into composites is becoming increasingly common, the opportunities afforded by embedded computation have only been marginally explored. Here, the key challenge is the gap between the continuous physics of materials and the discrete mathematics of computation. Bridging this gap requires a fundamental understanding of the constituents of such robotic materials and the distributed algorithms and controls that make these structures smart. Copyright © 2015, American Association for the Advancement of Science.
Computational Physics in a Nutshell
NASA Astrophysics Data System (ADS)
Schillaci, Michael
2001-11-01
Too often, students of science are expected to ``pick up'' what they need to know about the Art of Science. A description of the two-semester Computational Physics course being taught by the author offers a remedy to this situation. The course teaches students the three pillars of modern scientific research: Problem Solving, Programming, and Presentation. Using FORTRAN, LaTeX2e, Maple V, HTML, and Java, students learn the fundamentals of algorithm development, how to implement classes and packages written by others, how to produce publication-quality graphics and documents, and how to publish them on the World Wide Web. The course content is outlined and project examples are offered.
Fundamental Fortran for Social Scientists.
ERIC Educational Resources Information Center
Veldman, Donald J.
An introduction to Fortran programming specifically for social science statistical and routine data processing is provided. The first two sections of the manual describe the components of computer hardware and software. Topics include input, output, and mass storage devices; central memory; central processing unit; internal storage of data; and…
Amplify scientific discovery with artificial intelligence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gil, Yolanda; Greaves, Mark T.; Hendler, James
Computing innovations have fundamentally changed many aspects of scientific inquiry. For example, advances in robotics, high-end computing, networking, and databases now underlie much of what we do in science such as gene sequencing, general number crunching, sharing information between scientists, and analyzing large amounts of data. As computing has evolved at a rapid pace, so too has its impact in science, with the most recent computing innovations repeatedly being brought to bear to facilitate new forms of inquiry. Recently, advances in Artificial Intelligence (AI) have deeply penetrated many consumer sectors, including for example Apple’s Siri™ speech recognition system, real-time automated language translation services, and a new generation of self-driving cars and self-navigating drones. However, AI has yet to achieve comparable levels of penetration in scientific inquiry, despite its tremendous potential in aiding computers to help scientists tackle tasks that require scientific reasoning. We contend that advances in AI will transform the practice of science as we are increasingly able to effectively and jointly harness human and machine intelligence in the pursuit of major scientific challenges.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Almgren, Ann; DeMar, Phil; Vetter, Jeffrey
The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy’s (DOE’s) Office of Advanced Scientific Computing Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.
NASA Astrophysics Data System (ADS)
Chang, Li-Na; Luo, Shun-Long; Sun, Yuan
2017-11-01
The principle of superposition is universal and lies at the heart of quantum theory. Although ever since the inception of quantum mechanics a century ago, superposition has occupied a central and pivotal place, rigorous and systematic studies of the quantification issue have attracted significant interests only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by Science Challenge Project under Grant No. TZ2016002, Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, Grant under No. 2008DP173182
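To give a concrete, simplified flavour of how superposition or coherence can be quantified, the sketch below computes the standard l1-norm of coherence, the sum of absolute off-diagonal density-matrix elements in a chosen basis. This is one of the common coherence measures of the kind the paper connects to, used here only for illustration; it is not claimed to be the paper's own figure of merit.

import numpy as np

def l1_coherence(rho):
    """Sum of |rho_ij| over i != j: zero for basis states, maximal for uniform superpositions."""
    return np.sum(np.abs(rho)) - np.sum(np.abs(np.diag(rho)))

ket0 = np.array([1.0, 0.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)        # equal superposition (|0> + |1>)/sqrt(2)

for name, psi in (("|0>", ket0), ("|+>", plus)):
    rho = np.outer(psi, psi.conj())             # pure-state density matrix
    print(name, l1_coherence(rho))              # 0.0 for |0>, 1.0 for |+>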
An Animated Introduction to Relational Databases for Many Majors
ERIC Educational Resources Information Center
Dietrich, Suzanne W.; Goelman, Don; Borror, Connie M.; Crook, Sharon M.
2015-01-01
Database technology affects many disciplines beyond computer science and business. This paper describes two animations developed with images and color that visually and dynamically introduce fundamental relational database concepts and querying to students of many majors. The goal is for educators in diverse academic disciplines to incorporate the…
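For instructors in non-computing disciplines, the kind of fundamental relational concept the animations target can be shown in a few lines. The sketch below uses Python's built-in sqlite3 module (our own example, not material from the animations) to create two related tables and join them with a query.

import sqlite3

conn = sqlite3.connect(":memory:")              # throwaway in-memory database
cur = conn.cursor()
cur.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT, major TEXT)")
cur.execute("CREATE TABLE enrollment (student_id INTEGER, course TEXT)")
cur.executemany("INSERT INTO student VALUES (?, ?, ?)",
                [(1, "Ana", "Biology"), (2, "Ben", "History")])
cur.executemany("INSERT INTO enrollment VALUES (?, ?)",
                [(1, "CHEM 101"), (1, "CS 110"), (2, "CS 110")])

# A join matches rows across tables through the shared key (the foreign key student_id).
for row in cur.execute("""SELECT s.name, e.course
                          FROM student s JOIN enrollment e ON s.id = e.student_id
                          WHERE e.course = 'CS 110'"""):
    print(row)                                   # ('Ana', 'CS 110'), ('Ben', 'CS 110')
conn.close()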
Constructive Models of Discrete and Continuous Physical Phenomena
2014-02-08
BOURKE, T., CAILLAUD, B., AND POUZET, M. The fundamentals of hybrid systems modelers. Journal of Computer and System Sciences 78, 3 (2012), 877–910 ... 8. BENVENISTE, A., BOURKE, T., CAILLAUD, B., AND POUZET, M. Index theory for hybrid DAE systems (abstract and slides). In Synchronous Programming
This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...
While knowledge of exposure is fundamental to assessing and mitigating risks, exposure information has been costly and difficult to generate. Driven by major scientific advances in analytical methods, biomonitoring, computational tools, and a newly articulated vision for a great...
Introduction to Autonomous Mobile Robotics Using "Lego Mindstorms" NXT
ERIC Educational Resources Information Center
Akin, H. Levent; Meriçli, Çetin; Meriçli, Tekin
2013-01-01
Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the…
2002-01-01
behaviors are influenced by social interactions, and to how modern IT systems should be designed to support these group technical activities. The...engineering disciplines to behavior, decision, psychology, organization, and the social sciences. “Conflict management activity in collaborative...Researchers instead began to search for an entirely new paradigm, starting from a theory in social science, to construct a conceptual framework to describe
Richard Feynman and computation
NASA Astrophysics Data System (ADS)
Hey, Tony
1999-04-01
The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman re-visited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.
Mechanical Computing Redux: Limitations at the Nanoscale
NASA Astrophysics Data System (ADS)
Liu, Tsu-Jae King
2014-03-01
Technology solutions for overcoming the energy efficiency limits of nanoscale complementary metal oxide semiconductor (CMOS) technology ultimately will be needed in order to address the growing issue of integrated-circuit chip power density. Off-state leakage current sets a fundamental lower limit in energy per operation for any voltage-level-based digital logic implemented with transistors (CMOS and beyond), which leads to practical limits for device density (i.e. cost) and operating frequency (i.e. system performance). Mechanical switches have zero off-state leakage and hence can overcome this fundamental limit. Contact adhesive force sets a lower limit for the switching energy of a mechanical switch, however, and also directly impacts its performance. This paper will review recent progress toward the development of nano-electro-mechanical relay technology and discuss remaining challenges for realizing the promise of mechanical computing for ultra-low-power computing. Supported by the Center for Energy Efficient Electronics Science (NSF Award 0939514).
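To make the leakage argument concrete, here is a back-of-envelope sketch in Python of the two energy terms at stake for a voltage-level switch: the dynamic switching energy of roughly one half C V squared, and the leakage energy accrued per operation. All numerical values are illustrative assumptions, not figures from the paper.

# Illustrative estimate (assumed values, not from the paper): dynamic switching
# energy E_dyn = 0.5 * C * V^2 versus leakage energy per operation
# E_leak = I_off * V * t_delay for a CMOS-like voltage-level switch.
C_load = 1e-15      # load capacitance in farads (assumed 1 fF)
V_dd = 0.5          # supply voltage in volts (assumed)
I_off = 1e-9        # off-state leakage current in amperes (assumed 1 nA)
t_delay = 1e-9      # time spent per operation in seconds (assumed 1 ns)

E_dyn = 0.5 * C_load * V_dd**2      # energy spent charging the output node
E_leak = I_off * V_dd * t_delay     # energy lost to leakage during the operation

print(f"dynamic energy per op : {E_dyn:.2e} J")
print(f"leakage energy per op : {E_leak:.2e} J")
# Lowering V_dd shrinks E_dyn but increases delay and the relative weight of
# leakage; a zero-leakage mechanical relay sidesteps that trade-off.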
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mattsson, Ann E.
Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia’s capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data is hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudo-potential computational codes. Using the tools discussed in SAND2012-7389 we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for Molybdenum.
Information Technology for the Twenty-First Century: A Bold Investment in America's Future
NASA Astrophysics Data System (ADS)
1999-06-01
With this Information Technology for the Twenty-First Century (IT2) initiative, the Federal Government is making an important re-commitment to fundamental research in information technology. The IT2 initiative proposes $366 million in increased investments in computing, information, and communications research and development (R&D) to help expand the knowledge base in fundamental information science, advance the Nation's capabilities in cutting-edge research, and train the next generation of researchers who will sustain the Information Revolution well into the 21st Century.
Dragonfly: strengthening programming skills by building a game engine from scratch
NASA Astrophysics Data System (ADS)
Claypool, Mark
2013-06-01
Computer game development has been shown to be an effective hook for motivating students to learn both introductory and advanced computer science topics. While games can be made from scratch, to simplify the programming required, game development often uses game engines that handle complicated or frequently used components of the game. These game engines present the opportunity to strengthen programming skills and expose students to a range of fundamental computer science topics. While educational efforts have been effective in using game engines to improve computer science education, there have been no published papers describing and evaluating students building a game engine from scratch as part of their course work. This paper presents the Dragonfly approach, in which students build a fully functional game engine from scratch and make a game using their engine as part of a junior-level course. Details on the programming projects are presented, as well as an evaluation of the results from two offerings that used Dragonfly. Student performance on the projects as well as student assessments demonstrates the efficacy of having students build a game engine from scratch in strengthening their programming skills.
ERIC Educational Resources Information Center
Hsu, Ting-Chia
2018-01-01
To stimulate classroom interactions, this study employed two different smartphone application modes, providing an additional instant interaction channel in a flipped classroom teaching fundamental computer science concepts. One instant interaction mode provided the students (N = 36) with anonymous feedback in chronological time sequence, while the…
Adding Interactivity to a Non-Interactive Class
ERIC Educational Resources Information Center
Rogers, Gary; Krichen, Jack
2004-01-01
The IT 3050 course at Capella University is an introduction to fundamental computer networking. This course is one of the required courses in the Bachelor of Science in Information Technology program. In order to provide a more enriched learning environment for learners, Capella has significantly modified this class (and others) by infusing it…
A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software
NASA Astrophysics Data System (ADS)
Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.
2017-10-01
Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.
Theoretical computer science and the natural sciences
NASA Astrophysics Data System (ADS)
Marchal, Bruno
2005-12-01
I present some fundamental theorems in computer science and illustrate their relevance in Biology and Physics. I do not assume prerequisites in mathematics or computer science beyond the set N of natural numbers, functions from N to N, the use of some notational conveniences to describe functions, and at some point, a minimal amount of linear algebra and logic. I start with Cantor's transcendental proof by diagonalization of the non-enumerability of the collection of functions from natural numbers to the natural numbers. I explain why this proof is not entirely convincing and show how, by restricting the notion of function in terms of discrete well-defined processes, we are led to the non-algorithmic enumerability of the computable functions, but also, through Church's thesis, to the algorithmic enumerability of partial computable functions. Such a notion of function constitutes, with respect to our purpose, a crucial generalization of that concept. This will make it easy to justify deep and astonishing (counter-intuitive) incompleteness results about computers and similar machines. The modified Cantor diagonalization will provide a theory of concrete self-reference, and I illustrate it by pointing toward an elementary theory of self-reproduction (in the Amoeba's way) and cellular self-regeneration (in the flatworm Planaria's way). To make it easier, I introduce a very simple and powerful formal system known as the Schoenfinkel-Curry combinators. I will use the combinators to illustrate in a more concrete way the notion introduced above. The combinators, thanks to their low-level fine-grained design, will also make it possible to give a rough but hopefully illuminating description of the main lessons gained by the careful observation of nature, and to describe some new relations which should exist between computer science, the science of life and the science of inert matter, once some philosophical, if not theological, hypotheses are made in the cognitive sciences. In the last section, I come back to self-reference and I give an exposition of its modal logics. This is used to show that theoretical computer science makes those “philosophical hypotheses” in theoretical cognitive science experimentally and mathematically testable.
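As a minimal sketch of the diagonal argument mentioned above (using a toy enumeration assumed purely for illustration): given any enumeration f_0, f_1, ... of total functions from N to N, the function g(n) = f_n(n) + 1 differs from every f_n at argument n, so no such enumeration can be exhaustive. In Python:

# Minimal sketch of the diagonal argument: for any enumeration f_0, f_1, ... of
# total functions N -> N, the function g(n) = f_n(n) + 1 escapes the enumeration.
# The concrete enumeration below is a toy assumption used only to run the check.
def enumerate_function(i):
    """Toy enumeration: f_i(n) = i * n, standing in for 'the i-th function'."""
    return lambda n: i * n

def diagonal(n):
    """g(n) = f_n(n) + 1; by construction g disagrees with every f_i at i."""
    return enumerate_function(n)(n) + 1

for i in range(5):
    f_i = enumerate_function(i)
    assert diagonal(i) != f_i(i)          # g differs from f_i at argument i
print([diagonal(n) for n in range(5)])    # [1, 2, 5, 10, 17]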
Pacific Northwest National Laboratory Annual Site Environmental Report for Calendar Year 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duncan, Joanne P.; Sackschewsky, Michael R.; Tilden, Harold T.
2014-09-30
Pacific Northwest National Laboratory (PNNL), one of the U.S. Department of Energy (DOE) Office of Science’s 10 national laboratories, provides innovative science and technology development in the areas of energy and the environment, fundamental and computational science, and national security. DOE’s Pacific Northwest Site Office (PNSO) is responsible for oversight of PNNL at its Campus in Richland, Washington, as well as its facilities in Sequim, Seattle, and North Bonneville, Washington, and Corvallis and Portland, Oregon.
Visual design for the user interface, Part 1: Design fundamentals.
Lynch, P J
1994-01-01
Digital audiovisual media and computer-based documents will be the dominant forms of professional communication in both clinical medicine and the biomedical sciences. The design of highly interactive multimedia systems will shortly become a major activity for biocommunications professionals. The problems of human-computer interface design are intimately linked with graphic design for multimedia presentations and on-line document systems. This article outlines the history of graphic interface design and the theories that have influenced the development of today's major graphic user interfaces.
Data handling and visualization for NASA's science programs
NASA Technical Reports Server (NTRS)
Bredekamp, Joseph H. (Editor)
1995-01-01
Advanced information systems capabilities are essential to conducting NASA's scientific research mission. Access to these capabilities is no longer a luxury for a select few within the science community, but rather an absolute necessity for carrying out scientific investigations. The dependence on high performance computing and networking, as well as ready and expedient access to science data, metadata, and analysis tools is the fundamental underpinning for the entire research endeavor. At the same time, advances in the whole range of information technologies continues on an almost explosive growth path, reaching beyond the research community to affect the population as a whole. Capitalizing on and exploiting these advances are critical to the continued success of space science investigations. NASA must remain abreast of developments in the field and strike an appropriate balance between being a smart buyer and a direct investor in the technology which serves its unique requirements. Another key theme deals with the need for the space and computer science communities to collaborate as partners to more fully realize the potential of information technology in the space science research environment.
NASA Astrophysics Data System (ADS)
Anderson, O. Roger
The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.
Cellular automaton supercomputing
NASA Technical Reports Server (NTRS)
Wolfram, Stephen
1987-01-01
Many of the models now used in science and engineering are over a century old. And most of them can be implemented on modern digital computers only with considerable difficulty. Some new basic models are discussed which are much more directly suitable for digital computer simulation. The fundamental principle is that the models considered herein are as suitable as possible for implementation on digital computers. It is then a matter of scientific analysis to determine whether such models can reproduce the behavior seen in physical and other systems. Such analysis was carried out in several cases, and the results are very encouraging.
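As a hedged illustration of the kind of model the abstract has in mind (simple discrete rules that run directly on a digital computer rather than approximating century-old continuum equations), here is a one-dimensional binary cellular automaton in Python. Rule 30 is used only as a representative example, not as the specific model discussed.

# Minimal 1-D binary cellular automaton (Wolfram Rule 30), illustrating a model
# that is directly suitable for digital computer simulation.
RULE = 30
WIDTH, STEPS = 31, 15

def step(cells, rule=RULE):
    """Apply the rule to every cell from its left/self/right neighbourhood."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

cells = [0] * WIDTH
cells[WIDTH // 2] = 1                       # single seed cell in the middle
for _ in range(STEPS):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)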
Data stewardship - a fundamental part of the scientific method (Invited)
NASA Astrophysics Data System (ADS)
Foster, C.; Ross, J.; Wyborn, L. A.
2013-12-01
This paper emphasises the importance of data stewardship as a fundamental part of the scientific method, and the need to effect cultural change to ensure engagement by earth scientists. It is differentiated from the science of data stewardship per se. Earth System science generates vast quantities of data, and in the past, data analysis has been constrained by compute power, such that sub-sampling of data often provided the only way to reach an outcome. This is analogous to Kahneman's System 1 heuristic, with its simplistic and often erroneous outcomes. The development of HPC has liberated earth sciences such that the complexity and heterogeneity of natural systems can be utilised in modelling at any scale, global, or regional, or local; for example, movement of crustal fluids. Paradoxically, now that compute power is available, it is the stewardship of the data that is presenting the main challenges. There is a wide spectrum of issues: from effectively handling and accessing acquired data volumes [e.g. satellite feeds per day/hour]; through agreed taxonomy to effect machine to machine analyses; to idiosyncratic approaches by individual scientists. Except for the latter, most agree that data stewardship is essential. Indeed it is an essential part of the science workflow. As science struggles to engage and inform on issues of community importance, such as shale gas and fraccing, all parties must have equal access to data used for decision making; without that, there will be no social licence to operate or indeed access to additional science funding (Heidorn, 2008). The stewardship of scientific data is an essential part of the science process; but often it is regarded, wrongly, as entirely in the domain of data custodians or stewards. Geoscience Australia has developed a set of six principles that apply to all science activities within the agency: relevance to Government; collaborative science; quality science; transparent science; communicated science; and sustained science capability. Every principle includes data stewardship: this is to effect cultural change at both collective and individual levels to ensure that our science outcomes and technical advice are effective for the Government and community.
On Teaching Abstraction in Computer Science to Novices
ERIC Educational Resources Information Center
Armoni, Michal
2013-01-01
Abstraction is a key concept in CS, one of the most fundamental ideas underlying CS and its practice. However, teaching this soft concept to novices is a very difficult task, as discussed by many CSE experts. This paper discusses this issue, and suggests a general framework for teaching abstraction in CS to novices, a framework that would fit into…
Computational complexity of ecological and evolutionary spatial dynamics
Ibsen-Jensen, Rasmus; Chatterjee, Krishnendu; Nowak, Martin A.
2015-01-01
There are deep, yet largely unexplored, connections between computer science and biology. Both disciplines examine how information proliferates in time and space. Central results in computer science describe the complexity of algorithms that solve certain classes of problems. An algorithm is deemed efficient if it can solve a problem in polynomial time, which means the running time of the algorithm is a polynomial function of the length of the input. There are classes of harder problems for which the fastest possible algorithm requires exponential time. Another criterion is the space requirement of the algorithm. There is a crucial distinction between algorithms that can find a solution, verify a solution, or list several distinct solutions in given time and space. The complexity hierarchy that is generated in this way is the foundation of theoretical computer science. Precise complexity results can be notoriously difficult. The famous question whether polynomial time equals nondeterministic polynomial time (i.e., P = NP) is one of the hardest open problems in computer science and all of mathematics. Here, we consider simple processes of ecological and evolutionary spatial dynamics. The basic question is: What is the probability that a new invader (or a new mutant) will take over a resident population? We derive precise complexity results for a variety of scenarios. We therefore show that some fundamental questions in this area cannot be answered by simple equations (assuming that P is not equal to NP). PMID:26644569
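For orientation, the non-spatial baseline of the invasion question posed in the abstract has a simple closed form: in a well-mixed Moran process of size N, a single mutant with relative fitness r fixes with probability (1 - 1/r)/(1 - 1/r^N). The Python sketch below checks this by simulation; it is the standard well-mixed baseline only, not the spatially structured scenarios whose complexity the paper analyses.

import random

def moran_fixation_prob(N, r, trials=20000, seed=1):
    """Estimate by simulation the probability that one mutant of relative
    fitness r takes over a well-mixed resident population of size N."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(trials):
        mutants = 1
        while 0 < mutants < N:
            # pick a reproducer proportional to fitness, and one to die uniformly
            p_mut_repro = mutants * r / (mutants * r + (N - mutants))
            birth_is_mutant = rng.random() < p_mut_repro
            death_is_mutant = rng.random() < mutants / N
            mutants += birth_is_mutant - death_is_mutant
        fixed += mutants == N
    return fixed / trials

N, r = 10, 1.5
closed_form = (1 - 1 / r) / (1 - 1 / r ** N)
print(f"simulated  : {moran_fixation_prob(N, r):.3f}")
print(f"closed form: {closed_form:.3f}")   # about 0.34 for N=10, r=1.5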
Visual analytics as a translational cognitive science.
Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard
2011-07-01
Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.
Networking Technologies Enable Advances in Earth Science
NASA Technical Reports Server (NTRS)
Johnson, Marjory; Freeman, Kenneth; Gilstrap, Raymond; Beck, Richard
2004-01-01
This paper describes an experiment to prototype a new way of conducting science by applying networking and distributed computing technologies to an Earth Science application. A combination of satellite, wireless, and terrestrial networking provided geologists at a remote field site with interactive access to supercomputer facilities at two NASA centers, thus enabling them to validate and calibrate remotely sensed geological data in near-real time. This represents a fundamental shift in the way that Earth scientists analyze remotely sensed data. In this paper we describe the experiment and the network infrastructure that enabled it, analyze the data flow during the experiment, and discuss the scientific impact of the results.
NASA Advanced Supercomputing Facility Expansion
NASA Technical Reports Server (NTRS)
Thigpen, William W.
2017-01-01
The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.
GSDC: A Unique Data Center in Korea for HEP research
NASA Astrophysics Data System (ADS)
Ahn, Sang-Un
2017-04-01
Global Science experimental Data hub Center (GSDC) at Korea Institute of Science and Technology Information (KISTI) is a unique data center in South Korea established for promoting the fundamental research fields by supporting them with the expertise on Information and Communication Technology (ICT) and the infrastructure for High Performance Computing (HPC), High Throughput Computing (HTC) and Networking. GSDC has supported various research fields in South Korea dealing with the large scale of data, e.g. RENO experiment for neutrino research, LIGO experiment for gravitational wave detection, Genome sequencing project for bio-medical, and HEP experiments such as CDF at FNAL, Belle at KEK, and STAR at BNL. In particular, GSDC has run a Tier-1 center for ALICE experiment using the LHC at CERN since 2013. In this talk, we present the overview on computing infrastructure that GSDC runs for the research fields and we discuss on the data center infrastructure management system deployed at GSDC.
NASA Astrophysics Data System (ADS)
Neves, Rui Gomes; Teodoro, Vítor Duarte
2012-09-01
A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.
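The activities described above were implemented in the Modellus environment. As a hedged, generic illustration of the explorative-modelling style involved (iterating a simple physical model step by step rather than writing a full program), the short Python sketch below integrates free fall with linear drag using Euler's method; the scenario and parameter values are assumptions chosen for illustration.

# Illustrative explorative-modelling activity (not a Modellus file): Euler
# integration of a falling object with linear drag, dv/dt = g - (b/m) * v.
g, m, b = 9.8, 1.0, 0.5          # gravity, mass, drag coefficient (assumed values)
dt, t_end = 0.01, 5.0            # time step and total simulated time

t, v, y = 0.0, 0.0, 0.0
while t < t_end:
    a = g - (b / m) * v          # net acceleration at this instant
    v += a * dt                  # update velocity
    y += v * dt                  # update fallen distance
    t += dt

print(f"after {t_end} s: v = {v:.2f} m/s (terminal ~ {g * m / b:.2f} m/s), y = {y:.1f} m")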
NASA's Technology Transfer Program for the Early Detection of Breast Cancer
NASA Technical Reports Server (NTRS)
Schmidt, Gregory; Frey, Mary Anne; Vernikos, Joan; Winfield, Daniel; Dalton, Bonnie P. (Technical Monitor)
1996-01-01
The National Aeronautics and Space Administration (NASA) has led the development of advanced imaging sensors and image processing technologies for space science and Earth science missions. NASA considers the transfer and commercialization of such technologies a fundamental mission of the agency. Over the last two years, efforts have been focused on the application of aerospace imaging and computing to the field of diagnostic imaging, specifically to breast cancer imaging. These technology transfer efforts offer significant promise in helping in the national public health priority of the early detection of breast cancer.
1984-06-01
computer. The testing ... is both expensive and time consuming, making it more difficult to obtain ... The failure criterion is basically the effective ... behavior of a structure in terms of its normal modes. The fundamental ... behavior expressed in some simple characteristic, and a deflected shape of each ... do critical review on a science, because a science is something that is fact ... rules of living in the Garden of Eden; they could eat from any
NASA Astrophysics Data System (ADS)
Gusev, A.; Trudkova, N.
2017-09-01
The Center "GeoNa" will enable scientists and teachers at Russian universities to engage with advanced achievements in science and information technologies, and to establish scientific connections with foreign colleagues in the spheres of high technology, educational projects, and Intellectual-Cognitive Tourism. The Project "Kazan - Moon - 2020+" is directed at solving fundamental problems of celestial mechanics, selenodesy, and geophysics of the Moon(s), connected with complex theoretical research and computer modelling.
The Handicap Principle for Trust in Computer Security, the Semantic Web and Social Networking
NASA Astrophysics Data System (ADS)
Ma, Zhanshan (Sam); Krings, Axel W.; Hung, Chih-Cheng
Communication is a fundamental function of life, and it exists in almost all living things: from single-cell bacteria to human beings. Communication, together with competition and cooperation, are three fundamental processes in nature. Computer scientists are familiar with the study of competition or 'struggle for life' through Darwin's evolutionary theory, or even evolutionary computing. They may be equally familiar with the study of cooperation or altruism through the Prisoner's Dilemma (PD) game. However, they are likely to be less familiar with the theory of animal communication. The objective of this article is three-fold: (i) To suggest that the study of animal communication, especially the honesty (reliability) of animal communication, in which some significant advances in behavioral biology have been achieved in the last three decades, should be on the verge of spawning important cross-disciplinary research similar to that generated by the study of cooperation with the PD game. One of the far-reaching advances in the field is marked by the publication of "The Handicap Principle: a Missing Piece of Darwin's Puzzle" by Zahavi (1997). The 'Handicap' principle [34][35], which states that communication signals must be costly in some proper way to be reliable (honest), is best elucidated with evolutionary games, e.g., the Sir Philip Sidney (SPS) game [23]. Accordingly, we suggest that the Handicap principle may serve as a fundamental paradigm for trust research in computer science. (ii) To suggest to computer scientists that their expertise in modeling computer networks may help behavioral biologists in their study of the reliability of animal communication networks. This is largely due to the historical reason that, until the last decade, animal communication was studied with the dyadic paradigm (sender-receiver) rather than with the network paradigm. (iii) To pose several open questions, the answers to which may bear some refreshing insights to trust research in computer science, especially secure and resilient computing, the semantic web, and social networking. One important thread unifying the three aspects is the evolutionary game theory modeling or its extensions with survival analysis and agreement algorithms [19][20], which offer powerful game models for describing time-, space-, and covariate-dependent frailty (uncertainty and vulnerability) and deception (honesty).
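The equilibrium logic behind the Handicap principle can be sketched numerically. The Python toy below is not the Sir Philip Sidney game itself; it only illustrates, with assumed payoff numbers, the separating-equilibrium condition that a signal stays honest when its cost exceeds what a low-need sender would gain by faking it.

# Toy check of the costly-signaling (Handicap) logic, not the SPS game itself:
# a needy and a non-needy sender may signal at a cost, and the receiver donates
# only to signalers. Honesty is an equilibrium when the cost is high enough that
# non-needy senders do not profit from faking the signal. Numbers are assumed.
def net_gain_from_signaling(benefit_of_donation, signal_cost):
    return benefit_of_donation - signal_cost

benefit_if_needy = 1.0        # donation is very valuable to a needy sender (assumed)
benefit_if_not_needy = 0.2    # and worth little to a non-needy sender (assumed)

for signal_cost in (0.1, 0.5):
    needy_signals = net_gain_from_signaling(benefit_if_needy, signal_cost) > 0
    fakers_deterred = net_gain_from_signaling(benefit_if_not_needy, signal_cost) <= 0
    honest = needy_signals and fakers_deterred
    print(f"cost={signal_cost}: honest signaling equilibrium -> {honest}")
# cost=0.1 -> False (cheap talk invites faking); cost=0.5 -> True (the handicap works)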
Integrating Grid Services into the Cray XT4 Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
NERSC; Cholia, Shreyas; Lin, Hwa-Chun Wendy
2009-05-01
The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped down compute-node operating system without dynamic library support, a shared-root environment and idiosyncratic application launching. In our work, we describe how we resolved these challenges on a running, general-purpose production system to provide on-demand compute, storage, accounting and monitoring services through generic grid interfaces that mask the underlying system-specific details for the end user.
ERIC Educational Resources Information Center
Kiesmuller, Ulrich
2009-01-01
At schools special learning and programming environments are often used in the field of algorithms. Particularly with regard to computer science lessons in secondary education, they are supposed to help novices to learn the basics of programming. In several parts of Germany (e.g., Bavaria) these fundamentals are taught as early as in the seventh…
Andrew J. Dennhardt; Adam E. Duerr; David Brandes; Todd E. Katzner
2015-01-01
Estimating population size is fundamental to conservation and management. Population size is typically estimated using survey data, computer models, or both. Some of the most extensive and often least expensive survey data are those collected by citizen-scientists. A challenge to citizen-scientists is that the vagility of many organisms can complicate data collection....
NASA Technical Reports Server (NTRS)
Biswas, Rupak
2018-01-01
Quantum computing promises an unprecedented ability to solve intractable problems by harnessing quantum mechanical effects such as tunneling, superposition, and entanglement. The Quantum Artificial Intelligence Laboratory (QuAIL) at NASA Ames Research Center is the space agency's primary facility for conducting research and development in quantum information sciences. QuAIL conducts fundamental research in quantum physics but also explores how best to exploit and apply this disruptive technology to enable NASA missions in aeronautics, Earth and space sciences, and space exploration. At the same time, machine learning has become a major focus in computer science and captured the imagination of the public as a panacea to myriad big data problems. In this talk, we will discuss how classical machine learning can take advantage of quantum computing to significantly improve its effectiveness. Although we illustrate this concept on a quantum annealer, other quantum platforms could be used as well. If explored fully and implemented efficiently, quantum machine learning could greatly accelerate a wide range of tasks leading to new technologies and discoveries that will significantly change the way we solve real-world problems.
Federated data storage and management infrastructure
NASA Astrophysics Data System (ADS)
Zarochentsev, A.; Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Hristov, P.
2016-10-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. Computing models for the High Luminosity LHC era anticipate a growth of storage needs of at least an order of magnitude; it will require new approaches in data storage organization and data handling. In our project we address the fundamental problem of designing an architecture to integrate distributed heterogeneous disk resources for LHC experiments and other data-intensive science applications and to provide access to data from heterogeneous computing facilities. We have prototyped a federated storage for Russian T1 and T2 centers located in Moscow, St.-Petersburg and Gatchina, as well as a Russian / CERN federation. We have conducted extensive tests of underlying network infrastructure and storage endpoints with synthetic performance measurement tools as well as with HENP-specific workloads, including the ones running on supercomputing platform, cloud computing and Grid for ALICE and ATLAS experiments. We will present our current accomplishments with running LHC data analysis remotely and locally to demonstrate our ability to efficiently use federated data storage experiment wide within National Academic facilities for High Energy and Nuclear Physics as well as for other data-intensive science applications, such as bio-informatics.
OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing
NASA Astrophysics Data System (ADS)
Strayer, Michael
2005-01-01
Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a baseline employing Common Component Architectures and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20-year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have petaflop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science.
Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal—that of scientific discovery. Science does not stand still and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.
Books and monographs on finite element technology
NASA Technical Reports Server (NTRS)
Noor, A. K.
1985-01-01
The present paper provides a listing of all of the English books and some of the foreign books on finite element technology, along with a list of the conference proceedings devoted solely to finite elements. The references are divided into categories. Attention is given to fundamentals, mathematical foundations, structural and solid mechanics applications, fluid mechanics applications, other applied science and engineering applications, computer implementation and software systems, computational and modeling aspects, special topics, boundary element methods, proceedings of symposia and conferences on finite element technology, bibliographies, handbooks, and historical accounts.
Crutchfield, James P; Ditto, William L; Sinha, Sudeshna
2010-09-01
How dynamical systems store and process information is a fundamental question that touches a remarkably wide set of contemporary issues: from the breakdown of Moore's scaling laws--that predicted the inexorable improvement in digital circuitry--to basic philosophical problems of pattern in the natural world. It is a question that also returns one to the earliest days of the foundations of dynamical systems theory, probability theory, mathematical logic, communication theory, and theoretical computer science. We introduce the broad and rather eclectic set of articles in this Focus Issue that highlights a range of current challenges in computing and dynamical systems.
Open-Phylo: a customizable crowd-computing platform for multiple sequence alignment
2013-01-01
Citizen science games such as Galaxy Zoo, Foldit, and Phylo aim to harness the intelligence and processing power generated by crowds of online gamers to solve scientific problems. However, the selection of the data to be analyzed through these games is under the exclusive control of the game designers, and so are the results produced by gamers. Here, we introduce Open-Phylo, a freely accessible crowd-computing platform that enables any scientist to enter our system and use crowds of gamers to assist computer programs in solving one of the most fundamental problems in genomics: the multiple sequence alignment problem. PMID:24148814
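For readers unfamiliar with the underlying problem, the sketch below shows the pairwise special case that Open-Phylo's multiple sequence alignment problem generalises: Needleman-Wunsch global alignment by dynamic programming. The scoring values are illustrative assumptions and are not Phylo's scoring scheme; the multiple-sequence version is the computationally hard case that the crowd of gamers helps with.

# Pairwise special case of the alignment problem (Needleman-Wunsch dynamic
# programming); scores below are illustrative assumptions only.
MATCH, MISMATCH, GAP = 1, -1, -2

def global_alignment_score(a, b):
    """Optimal global alignment score of strings a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        dp[i][0] = i * GAP                       # align prefix of a against gaps
    for j in range(1, cols):
        dp[0][j] = j * GAP                       # align prefix of b against gaps
    for i in range(1, rows):
        for j in range(1, cols):
            sub = MATCH if a[i - 1] == b[j - 1] else MISMATCH
            dp[i][j] = max(dp[i - 1][j - 1] + sub,   # match or substitute
                           dp[i - 1][j] + GAP,       # gap in b
                           dp[i][j - 1] + GAP)       # gap in a
    return dp[-1][-1]

print(global_alignment_score("ACGT", "ACGT"))  # 4: four matches
print(global_alignment_score("ACGT", "AGT"))   # 1: three matches and one gap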
PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joubert, Wayne; Kothe, Douglas B; Nam, Hai Ah
2009-12-01
In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to insure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be advanced on multiple fronts, including peak flops, node memory capacity, interconnect latency, interconnect bandwidth, and memory bandwidth. (2) Effective parallel programming interfaces must be developed to exploit the power of emerging hardware. (3) Science application teams must now begin to adapt and reformulate application codes to the new hardware and software, typified by hierarchical and disparate layers of compute, memory and concurrency. (4) Algorithm research must be realigned to exploit this hierarchy. (5) When possible, mathematical libraries must be used to encapsulate the required operations in an efficient and useful way. (6) Software tools must be developed to make the new hardware more usable. (7) Science application software must be improved to cope with the increasing complexity of computing systems. (8) Data management efforts must be readied for the larger quantities of data generated by larger, more accurate science models. Requirements elicitation, analysis, validation, and management comprise a difficult and inexact process, particularly in periods of technological change. Nonetheless, the OLCF requirements modeling process is becoming increasingly quantitative and actionable, as the process becomes more developed and mature, and the process this year has identified clear and concrete steps to be taken.
This report discloses (1) the fundamental science case driving the need for the next generation of computer hardware, (2) application usage trends that illustrate the science need, (3) application performance characteristics that drive the need for increased hardware capabilities, (4) resource and process requirements that make the development and deployment of science applications on next-generation hardware successful, and (5) summary recommendations for the required next steps within the computer and computational science communities.
Kassner, Michael E.; Nemat-Nasser, Sia; Suo, Zhigang; ...
2004-09-15
The Division of Materials Sciences and Engineering of the US Department of Energy (DOE) sponsored a workshop to identify cutting-edge research needs and opportunities, enabled by the application of theoretical and applied mechanics. The workshop also included input from biochemical, surface science, and computational disciplines, on approaching scientific issues at the nanoscale, and the linkage of atomistic-scale with nano-, meso-, and continuum-scale mechanics. This paper is a summary of the outcome of the workshop, consisting of three main sections, each put together by a team of workshop participants. Section 1 addresses research opportunities that can be realized by the application of mechanics fundamentals to the general area of self-assembly, directed self-assembly, and fluidics. Section 2 examines the role of mechanics in biological, bioinspired, and biohybrid material systems, closely relating to and complementing the material covered in Section 1. In this manner, it was made clear that mechanics plays a fundamental role in understanding the biological functions at all scales, in seeking to utilize biology and biological techniques to develop new materials and devices, and in the general area of bionanotechnology. While direct observational investigations are an essential ingredient of new discoveries and will continue to open new exciting research doors, it is the basic need for controlled experimentation and fundamentally based modeling and computational simulations that will be truly empowered by a systematic use of the fundamentals of mechanics. Section 3 brings into focus new challenging issues in inelastic deformation and fracturing of materials that have emerged as a result of the development of nanodevices, biopolymers, and hybrid bio–abio systems. As a result, each section begins with some introductory overview comments, and then provides illustrative examples that were presented at the workshop and which are believed to highlight the enabling research areas and, particularly, the impact that mechanics can make in enhancing the fundamental understanding that can lead to new technologies.
NASA Astrophysics Data System (ADS)
Gil, Y.; Zanzerkia, E. E.; Munoz-Avila, H.
2015-12-01
The National Science Foundation (NSF) Directorate for Geosciences (GEO) and Directorate for Computer and Information Science and Engineering (CISE) acknowledge the significant scientific challenges required to understand the fundamental processes of the Earth system, within the atmospheric and geospace, Earth, ocean and polar sciences, and across those boundaries. A broad view of the opportunities and directions for GEO is described in the report "Dynamic Earth: GEO imperative and Frontiers 2015-2020." Many of the aspects of geosciences research, highlighted both in this document and other community grand challenges, pose novel problems for researchers in intelligent systems. Geosciences research will require solutions for data-intensive science, advanced computational capabilities, and transformative concepts for visualizing, using, analyzing and understanding geoscience phenomena and data. Opportunities for the scientific community to engage in addressing these challenges are available and being developed through NSF's portfolio of investments and activities. The NSF-wide initiative, Cyberinfrastructure Framework for 21st Century Science and Engineering (CIF21), looks to accelerate research and education through new capabilities in data, computation, software and other aspects of cyberinfrastructure. EarthCube, a joint program between GEO and the Advanced Cyberinfrastructure Division, aims to create a well-connected and facile environment to share data and knowledge in an open, transparent, and inclusive manner, thus accelerating our ability to understand and predict the Earth system. EarthCube's mission opens an opportunity for collaborative research on novel information systems enhancing and supporting geosciences research efforts. NSF encourages true, collaborative partnerships between scientists in computer sciences and the geosciences to meet these challenges.
Editorial. Festschrift on the occasion of Kurt Kremer's 60th birthday
NASA Astrophysics Data System (ADS)
Site, Luigi Delle; Deserno, Markus; Dünweg, Burkhard; Holm, Christian; Peter, Christine; Pleiner, Harald
2016-10-01
This special topics issue offers a broad perspective on recent theoretical and computational soft matter science, providing state of the art advances in many of its sub-fields. As is befitting for a discipline as diverse as soft matter, the papers collected here span a considerable range of subjects and questions, but they also illustrate numerous connections into both fundamental science and technological/industrial applications, which have accompanied the field since its earliest days. This issue is dedicated to Kurt Kremer, on the occasion of his 60th birthday, honouring his role in establishing this exciting field and consolidating its standing in the frame of current science and technology.
Benefits of Exchange Between Computer Scientists and Perceptual Scientists: A Panel Discussion
NASA Technical Reports Server (NTRS)
Kaiser, Mary K.; Null, Cynthia H. (Technical Monitor)
1995-01-01
We have established several major goals for this panel: 1) Introduce the computer graphics community to some specific leaders in the use of perceptual psychology relating to computer graphics; 2) Enumerate the major results that are known, and provide a set of resources for finding others; 3) Identify research areas where knowledge of perceptual psychology can help computer system designers improve their systems; and 4) Provide advice to researchers on how they can establish collaborations in their own research programs. We believe this will be a very important panel. In addition to generating lively discussion, we hope to point out some of the fundamental issues that occur at the boundary between computer science and perception, and possibly help researchers avoid some of the common pitfalls.
Laboratory Directed Research and Development Program FY 2006
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen
2007-03-08
The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness.
Information visualisation for science and policy: engaging users and avoiding bias.
McInerny, Greg J; Chen, Min; Freeman, Robin; Gavaghan, David; Meyer, Miriah; Rowland, Francis; Spiegelhalter, David J; Stefaner, Moritz; Tessarolo, Geizi; Hortal, Joaquin
2014-03-01
Visualisations and graphics are fundamental to studying complex subject matter. However, beyond acknowledging this value, scientists and science-policy programmes rarely consider how visualisations can enable discovery, create engaging and robust reporting, or support online resources. Producing accessible and unbiased visualisations from complicated, uncertain data requires expertise and knowledge from science, policy, computing, and design. However, visualisation is rarely found in our scientific training, organisations, or collaborations. As new policy programmes develop [e.g., the Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES)], we need information visualisation to permeate increasingly both the work of scientists and science policy. The alternative is increased potential for missed discoveries, miscommunications, and, at worst, creating a bias towards the research that is easiest to display. Copyright © 2014 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heroux, Michael; Lethin, Richard
Programming models and environments play the essential role in high performance computing of enabling the conception, design, implementation, and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale, and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping, and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.
Maximal aggregation of polynomial dynamical systems
Cardelli, Luca; Tschaikowski, Max
2017-01-01
Ordinary differential equations (ODEs) with polynomial derivatives are a fundamental tool for understanding the dynamics of systems across many branches of science, but our ability to gain mechanistic insight and effectively conduct numerical evaluations is critically hindered when dealing with large models. Here we propose an aggregation technique that rests on two notions of equivalence relating ODE variables whenever they have the same solution (backward criterion) or if a self-consistent system can be written for describing the evolution of sums of variables in the same equivalence class (forward criterion). A key feature of our proposal is to encode a polynomial ODE system into a finitary structure akin to a formal chemical reaction network. This enables the development of a discrete algorithm to efficiently compute the largest equivalence, building on approaches rooted in computer science to minimize basic models of computation through iterative partition refinements. The physical interpretability of the aggregation is shown on polynomial ODE systems for biochemical reaction networks, gene regulatory networks, and evolutionary game theory. PMID:28878023
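The paper's algorithm operates on polynomial systems encoded as reaction networks; as a rough, hedged illustration of the underlying partition-refinement idea, the sketch below applies an analogous forward (lumpability) criterion to a linear system dx/dt = A x, splitting blocks of variables until the block sums obey self-consistent equations. The function name and the toy matrix are illustrative assumptions, not taken from the paper.

def refine_partition(A, blocks):
    # Refine a partition of variable indices of the linear system dx/dt = A x
    # until all variables left in the same block drive the sum of every block
    # with the same total coefficient (a linear analogue of the forward criterion).
    while True:
        refined, changed = [], False
        for block in blocks:
            signatures = {}
            for i in block:
                # signature of variable i: its summed coefficient into each current block
                key = tuple(round(sum(A[j][i] for j in b), 10) for b in blocks)
                signatures.setdefault(key, []).append(i)
            parts = list(signatures.values())
            changed = changed or len(parts) > 1
            refined.extend(parts)
        if not changed:
            return blocks
        blocks = refined

# toy system: dx0/dt = -2*x0 + x1, dx1/dt = x0 - 2*x1
A = [[-2.0, 1.0],
     [1.0, -2.0]]
print(refine_partition(A, [[0, 1]]))   # -> [[0, 1]]

On the toy matrix both variables stay in one block because their sum s = x0 + x1 satisfies the closed equation ds/dt = -s.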
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
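As a concrete instance of the kind of HMM inference the chapter reviews, here is a minimal sketch of the forward algorithm, which sums over hidden-state paths to score an observation sequence. The two-state fair/biased coin model and all parameter values are invented for illustration and are not taken from the chapter.

def hmm_forward(obs, states, start_p, trans_p, emit_p):
    # Forward algorithm: total probability of the observation sequence under the HMM.
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for symbol in obs[1:]:
        alpha = {s: emit_p[s][symbol] * sum(alpha[r] * trans_p[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

# hypothetical two-state coin model observed as heads (H) / tails (T)
states = ("fair", "biased")
start_p = {"fair": 0.5, "biased": 0.5}
trans_p = {"fair": {"fair": 0.9, "biased": 0.1},
           "biased": {"fair": 0.1, "biased": 0.9}}
emit_p = {"fair": {"H": 0.5, "T": 0.5},
          "biased": {"H": 0.9, "T": 0.1}}
print(hmm_forward("HHTH", states, start_p, trans_p, emit_p))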
Designer drugs: the evolving science of drug discovery.
Wanke, L A; DuBose, R F
1998-07-01
Drug discovery and design are fundamental to drug development. Until recently, most drugs were discovered through random screening or developed through molecular modification. New technologies are revolutionizing this phase of drug development. Rational drug design, using powerful computers and computational chemistry and employing X-ray crystallography, nuclear magnetic resonance spectroscopy, and three-dimensional quantitative structure activity relationship analysis, is creating highly specific, biologically active molecules by virtual reality modeling. Sophisticated screening technologies are eliminating all but the most active lead compounds. These new technologies promise more efficacious, safe, and cost-effective medications, while minimizing drug development time and maximizing profits.
Spurious Solutions Of Nonlinear Differential Equations
NASA Technical Reports Server (NTRS)
Yee, H. C.; Sweby, P. K.; Griffiths, D. F.
1992-01-01
Report utilizes nonlinear-dynamics approach to investigate possible sources of errors and slow convergence and non-convergence of steady-state numerical solutions when using time-dependent approach for problems containing nonlinear source terms. Emphasizes implications for development of algorithms in CFD and computational sciences in general. Main fundamental conclusion of study is that qualitative features of nonlinear differential equations cannot be adequately represented by finite-difference method and vice versa.
Fundamental movement skills and habitual physical activity in young children.
Fisher, Abigail; Reilly, John J; Kelly, Louise A; Montgomery, Colette; Williamson, Avril; Paton, James Y; Grant, Stan
2005-04-01
To test for relationships between objectively measured habitual physical activity and fundamental movement skills in a relatively large and representative sample of preschool children. Physical activity was measured over 6 d using the Computer Science and Applications (CSA) accelerometer in 394 boys and girls (mean age 4.2, SD 0.5 yr). Children were scored on 15 fundamental movement skills, based on the Movement Assessment Battery, by a single observer. Total physical activity (r=0.10, P<0.05) and percent time spent in moderate to vigorous physical activity (MVPA) (r=0.18, P<0.001) were significantly correlated with total movement skills score. Time spent in light-intensity physical activity was not significantly correlated with motor skills score (r=0.02, P>0.05). In this sample and setting, fundamental movement skills were significantly associated with habitual physical activity, but the association between the two variables was weak. The present study questions whether the widely assumed relationships between motor skills and habitual physical activity actually exist in young children.
Role of High-End Computing in Meeting NASA's Science and Engineering Challenges
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Tu, Eugene L.; Van Dalsem, William R.
2006-01-01
Two years ago, NASA was on the verge of dramatically increasing its HEC capability and capacity. With the 10,240-processor supercomputer, Columbia, now in production for 18 months, HEC has an even greater impact within the Agency, extending to partner institutions. Advanced science and engineering simulations in space exploration, shuttle operations, Earth sciences, and fundamental aeronautics research are occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. This talk describes how the integrated production environment fostered at the NASA Advanced Supercomputing (NAS) facility at Ames Research Center is accelerating scientific discovery, achieving parametric analyses of multiple scenarios, and enhancing safety for NASA missions. We focus on Columbia's impact on two key engineering and science disciplines: Aerospace and Climate. We also discuss future mission challenges and plans for NASA's next-generation HEC environment.
Damsel: A Data Model Storage Library for Exascale Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhary, Alok; Liao, Wei-keng
Computational science applications have been described as having one of seven motifs (the “seven dwarfs”), each having a particular pattern of computation and communication. From a storage and I/O perspective, these applications can also be grouped into a number of data model motifs describing the way data is organized and accessed during simulation, analysis, and visualization. Major storage data models developed in the 1990s, such as the Network Common Data Format (netCDF) and Hierarchical Data Format (HDF) projects, created support for more complex data models. Development of both netCDF and HDF5 was influenced by multi-dimensional dataset storage requirements, but their access models and formats were designed with sequential storage in mind (e.g., a POSIX I/O model). Although these and other high-level I/O libraries have had a beneficial impact on large parallel applications, they do not always attain a high percentage of peak I/O performance due to fundamental design limitations, and they do not address the full range of current and future computational science data models. The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. The project consists of three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community. The product of this project, the Damsel library, is openly available for download from http://cucis.ece.northwestern.edu/projects/DAMSEL. Several case studies and an application programming interface reference are also available to help new users learn to use the library.
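The Damsel interface itself is not described in the abstract, so the sketch below instead illustrates the kind of self-describing, multi-dimensional storage model (HDF5, accessed through the h5py bindings) that the project builds upon and aims to extend; the file name, dataset name, and attributes are hypothetical.

import numpy as np
import h5py

# A toy 3-D field (time x lat x lon) stored as a chunked, compressed,
# self-describing dataset -- the sort of multi-dimensional data model
# that netCDF and HDF5 were designed around.
field = np.random.rand(10, 180, 360).astype("f4")
with h5py.File("simulation_output.h5", "w") as f:
    dset = f.create_dataset("temperature", data=field,
                            chunks=(1, 180, 360), compression="gzip")
    dset.attrs["units"] = "K"
    dset.attrs["description"] = "hypothetical example field"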
First-Principles Design of Novel Catalytic and Chemoresponsive Materials
NASA Astrophysics Data System (ADS)
Roling, Luke T.
An emerging trend in materials design is the use of computational chemistry tools to accelerate materials discovery and implementation. In particular, the parallel nature of computational models enables high-throughput screening approaches that would be laborious and time-consuming with experiments alone, and can be useful for identifying promising candidate materials for experimental synthesis and evaluation. Additionally, atomic-scale modeling allows researchers to obtain a detailed understanding of phenomena invisible to many current experimental techniques. In this thesis, we highlight mechanistic studies and successes in catalyst design for heterogeneous electrochemical reactions, discussing both anode and cathode chemistries. In particular, we evaluate the properties of a new class of Pd-Pt core-shell and hollow nanocatalysts toward the oxygen reduction reaction. We do not limit our study to electrochemical reactivity, but also consider these catalysts in a broader context by performing in-depth studies of their stability at elevated temperatures as well as investigating the mechanisms by which they are able to form. We also present fundamental surface science studies, investigating graphene formation and H2 dissociation, which are processes of both fundamental and practical interest in many catalytic applications. Finally, we extend our materials design paradigm outside the field of catalysis to develop and apply a model for the detection of small chemical analytes by chemoresponsive liquid crystals, and offer several predictions for improving the detection of small chemicals. A close connection between computation, synthesis, and experimental evaluation is essential to the work described herein, as computations are used to gain fundamental insight into experimental observations, and experiments and synthesis are in turn used to validate predictions of material activities from computational models.
ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean; Potok, Thomas E.; Jones, Todd
At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long term (10 to 20+ year) cybersecurity fundamental basic research and development challenges, strategies and roadmap facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.
Scientific Visualization and Computational Science: Natural Partners
NASA Technical Reports Server (NTRS)
Uselton, Samuel P.; Lasinski, T. A. (Technical Monitor)
1995-01-01
Scientific visualization is developing rapidly, stimulated by computational science, which is gaining acceptance as a third alternative to theory and experiment. Computational science is based on numerical simulations of mathematical models derived from theory. But each individual simulation is like a hypothetical experiment; initial conditions are specified, and the result is a record of the observed conditions. Experiments can be simulated for situations that cannot really be created or controlled. Results impossible to measure can be computed. Even for observable values, computed samples are typically much denser. Numerical simulations also extend scientific exploration where the mathematics is analytically intractable. Numerical simulations are used to study phenomena from subatomic to intergalactic scales and from abstract mathematical structures to pragmatic engineering of everyday objects. But computational science methods would be almost useless without visualization. The obvious reason is that the huge amounts of data produced require the high bandwidth of the human visual system, and interactivity adds to the power. Visualization systems also provide a single context for all the activities involved, from debugging the simulations, to exploring the data, to communicating the results. Most of the presentations today have their roots in image processing, where the fundamental task is: Given an image, extract information about the scene. Visualization has developed from computer graphics, and the inverse task: Given a scene description, make an image. Visualization extends the graphics paradigm by expanding the possible input. The goal is still to produce images; the difficulty is that the input is not a scene description displayable by standard graphics methods. Visualization techniques must either transform the data into a scene description or extend graphics techniques to display this odd input. Computational science is a fertile field for visualization research because the results vary so widely and include things that have no known appearance. The amount of data creates additional challenges for both hardware and software systems. Evaluations of visualization should ultimately reflect the insight gained into the scientific phenomena. So making good visualizations requires consideration of characteristics of the user and the purpose of the visualization. Knowledge about human perception and graphic design is also relevant. It is this breadth of knowledge that stimulates proposals for multidisciplinary visualization teams and intelligent visualization assistant software. Visualization is an immature field, but computational science is stimulating research on a broad front.
Extending Landauer's bound from bit erasure to arbitrary computation
NASA Astrophysics Data System (ADS)
Wolpert, David
The minimal thermodynamic work required to erase a bit, known as Landauer's bound, has been extensively investigated both theoretically and experimentally. However, when viewed as a computation that maps inputs to outputs, bit erasure has a very special property: the output does not depend on the input. Existing analyses of thermodynamics of bit erasure implicitly exploit this property, and thus cannot be directly extended to analyze the computation of arbitrary input-output maps. Here we show how to extend these earlier analyses of bit erasure to analyze the thermodynamics of arbitrary computations. Doing this establishes a formal connection between the thermodynamics of computers and much of theoretical computer science. We use this extension to analyze the thermodynamics of the canonical "general purpose computer" considered in computer science theory: a universal Turing machine (UTM). We consider a UTM which maps input programs to output strings, where inputs are drawn from an ensemble of random binary sequences, and prove: i) The minimal work needed by a UTM to run some particular input program X and produce output Y is the Kolmogorov complexity of Y minus the log of the "algorithmic probability" of Y. This minimal amount of thermodynamic work has a finite upper bound, which is independent of the output Y, depending only on the details of the UTM. ii) The expected work needed by a UTM to compute some given output Y is infinite. As a corollary, the overall expected work to run a UTM is infinite. iii) The expected work needed by an arbitrary Turing machine T (not necessarily universal) to compute some given output Y can either be infinite or finite, depending on Y and the details of T. To derive these results we must combine ideas from nonequilibrium statistical physics with fundamental results from computer science, such as Levin's coding theorem and other theorems about universal computation. I would like to acknowledge the Santa Fe Institute, Grant No. TWCF0079/AB47 from the Templeton World Charity Foundation, Grant No. FQXi-RHl3-1349 from the FQXi foundation, and Grant No. CHE-1648973 from the U.S. National Science Foundation.
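For reference, the standard forms of the quantities the abstract invokes can be written compactly; the notation below follows the textbook convention (prefix UTM U, program length \ell(x)) and is not taken from the talk itself:

  W_{\text{erase}} \;\ge\; k_B T \ln 2 \qquad \text{(Landauer's bound, per erased bit)}
  P_U(Y) \;=\; \sum_{x \,:\, U(x) = Y} 2^{-\ell(x)} \qquad \text{(algorithmic probability of output } Y\text{)}
  K_U(Y) \;=\; -\log_2 P_U(Y) + O(1) \qquad \text{(Levin's coding theorem)}

Result (i) above relates the minimal work for running a particular input program to exactly these quantities.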
Negotiating the Traffic: Can Cognitive Science Help Make Autonomous Vehicles a Reality?
Chater, Nick; Misyak, Jennifer; Watson, Derrick; Griffiths, Nathan; Mouzakitis, Alex
2018-02-01
To drive safely among human drivers, cyclists and pedestrians, autonomous vehicles will need to mimic, or ideally improve upon, humanlike driving. Yet, driving presents us with difficult problems of joint action: 'negotiating' with other users over shared road space. We argue that autonomous driving provides a test case for computational theories of social interaction, with fundamental implications for the development of autonomous vehicles. Copyright © 2017 Elsevier Ltd. All rights reserved.
Turmezei, Tom D; Poole, Ken E S
2011-01-01
Bone is a fundamental component of the disordered joint homeostasis seen in osteoarthritis, a disease that has been primarily characterized by the breakdown of articular cartilage accompanied by local bone changes and a limited degree of joint inflammation. In this review we consider the role of computed tomography imaging and computational analysis in osteoarthritis research, focusing on subchondral bone and osteophytes in the hip. We relate what is already known in this area to what could be explored through this approach in the future in relation to both clinical research trials and the underlying cellular and molecular science of osteoarthritis. We also consider how this area of research could impact on our understanding of the genetics of osteoarthritis.
Distributed information system (water fact sheet)
Harbaugh, A.W.
1986-01-01
During 1982-85, the Water Resources Division (WRD) of the U.S. Geological Survey (USGS) installed over 70 large minicomputers in offices across the country to support its mission in the science of hydrology. These computers are connected by a communications network that allows information to be shared among computers in each office. The computers and network together are known as the Distributed Information System (DIS). The computers are accessed through the use of more than 1500 terminals and minicomputers. The WRD has three fundamentally different needs for computing: data management; hydrologic analysis; and administration. Data management accounts for 50% of the computational workload of WRD because hydrologic data are collected in all 50 states, Puerto Rico, and the Pacific trust territories. Hydrologic analysis consists of 40% of the computational workload of WRD. Cost accounting, payroll, personnel records, and planning for WRD programs occupies an estimated 10% of the computer workload. The DIS communications network is shown on a map. (Lantz-PTT)
Hafner, Jürgen
2010-09-29
During the last 20 years computer simulations based on a quantum-mechanical description of the interactions between electrons and atomic nuclei have developed an increasingly important impact on materials science, not only in promoting a deeper understanding of the fundamental physical phenomena, but also enabling the computer-assisted design of materials for future technologies. The backbone of atomic-scale computational materials science is density-functional theory (DFT) which allows us to cast the intractable complexity of electron-electron interactions into the form of an effective single-particle equation determined by the exchange-correlation functional. Progress in DFT-based calculations of the properties of materials and of simulations of processes in materials depends on: (1) the development of improved exchange-correlation functionals and advanced post-DFT methods and their implementation in highly efficient computer codes, (2) the development of methods allowing us to bridge the gaps in the temperature, pressure, time and length scales between the ab initio calculations and real-world experiments and (3) the extension of the functionality of these codes, permitting us to treat additional properties and new processes. In this paper we discuss the current status of techniques for performing quantum-based simulations on materials and present some illustrative examples of applications to complex quasiperiodic alloys, cluster-support interactions in microporous acid catalysts and magnetic nanostructures.
The emergence of mind and brain: an evolutionary, computational, and philosophical approach.
Mainzer, Klaus
2008-01-01
Modern philosophy of mind cannot be understood without recent developments in computer science, artificial intelligence (AI), robotics, neuroscience, biology, linguistics, and psychology. Classical philosophy of formal languages as well as symbolic AI assume that all kinds of knowledge must explicitly be represented by formal or programming languages. This assumption is limited by recent insights into the biology of evolution and developmental psychology of the human organism. Most of our knowledge is implicit and unconscious. It is not formally represented, but embodied knowledge, which is learnt by doing and understood by bodily interacting with changing environments. That is true not only for low-level skills, but even for high-level domains of categorization, language, and abstract thinking. The embodied mind is considered an emergent capacity of the brain as a self-organizing complex system. Actually, self-organization has been a successful strategy of evolution to handle the increasing complexity of the world. Genetic programs are not sufficient and cannot prepare the organism for all kinds of complex situations in the future. Self-organization and emergence are fundamental concepts in the theory of complex dynamical systems. They are also applied in organic computing as a recent research field of computer science. Therefore, cognitive science, AI, and robotics try to model the embodied mind in an artificial evolution. The paper analyzes these approaches in the interdisciplinary framework of complex dynamical systems and discusses their philosophical impact.
NASA Astrophysics Data System (ADS)
Mezzacappa, Anthony
2005-01-01
On 26-30 June 2005 at the Grand Hyatt on Union Square in San Francisco several hundred computational scientists from around the world came together for what can certainly be described as a celebration of computational science. Scientists from the SciDAC Program and scientists from other agencies and nations were joined by applied mathematicians and computer scientists to highlight the many successes in the past year where computation has led to scientific discovery in a variety of fields: lattice quantum chromodynamics, accelerator modeling, chemistry, biology, materials science, Earth and climate science, astrophysics, and combustion and fusion energy science. Also highlighted were the advances in numerical methods and computer science, and the multidisciplinary collaboration cutting across science, mathematics, and computer science that enabled these discoveries. The SciDAC Program was conceived and funded by the US Department of Energy Office of Science. It is the Office of Science's premier computational science program founded on what is arguably the perfect formula: the priority and focus is science and scientific discovery, with the understanding that the full arsenal of `enabling technologies' in applied mathematics and computer science must be brought to bear if we are to have any hope of attacking and ultimately solving today's computational Grand Challenge problems. The SciDAC Program has been in existence for four years, and many of the computational scientists funded by this program will tell you that the program has given them the hope of addressing their scientific problems in full realism for the very first time. Many of these scientists will also tell you that SciDAC has also fundamentally changed the way they do computational science. We begin this volume with one of DOE's great traditions, and core missions: energy research. As we will see, computation has been seminal to the critical advances that have been made in this arena. Of course, to understand our world, whether it is to understand its very nature or to understand it so as to control it for practical application, will require explorations on all of its scales. Computational science has been no less an important tool in this arena than it has been in the arena of energy research. From explorations of quantum chromodynamics, the fundamental theory that describes how quarks make up the protons and neutrons of which we are composed, to explorations of the complex biomolecules that are the building blocks of life, to explorations of some of the most violent phenomena in our universe and of the Universe itself, computation has provided not only significant insight, but often the only means by which we have been able to explore these complex, multicomponent systems and by which we have been able to achieve scientific discovery and understanding. While our ultimate target remains scientific discovery, it certainly can be said that at a fundamental level the world is mathematical. Equations ultimately govern the evolution of the systems of interest to us, be they physical, chemical, or biological systems. The development and choice of discretizations of these underlying equations is often a critical deciding factor in whether or not one is able to model such systems stably, faithfully, and practically, and in turn, the algorithms to solve the resultant discrete equations are the complementary, critical ingredient in the recipe to model the natural world. 
The use of parallel computing platforms, especially at the TeraScale, and the trend toward even larger numbers of processors, continue to present significant challenges in the development and implementation of these algorithms. Computational scientists often speak of their `workflows'. A workflow, as the name suggests, is the sum total of all complex and interlocking tasks, from simulation set up, execution, and I/O, to visualization and scientific discovery, through which the advancement in our understanding of the natural world is realized. For the computational scientist, enabling such workflows presents myriad, significant challenges, and it is computer scientists that are called upon at such times to address these challenges. Simulations are currently generating data at the staggering rate of tens of TeraBytes per simulation, over the course of days. In the next few years, these data generation rates are expected to climb exponentially to hundreds of TeraBytes per simulation, performed over the course of months. The output, management, movement, analysis, and visualization of these data will be our key to unlocking the scientific discoveries buried within the data. And there is no hope of generating such data to begin with, or of scientific discovery, without stable computing platforms and a sufficiently high and sustained performance of scientific applications codes on them. Thus, scientific discovery in the realm of computational science at the TeraScale and beyond will occur at the intersection of science, applied mathematics, and computer science. The SciDAC Program was constructed to mirror this reality, and the pages that follow are a testament to the efficacy of such an approach. We would like to acknowledge the individuals on whose talents and efforts the success of SciDAC 2005 was based. Special thanks go to Betsy Riley for her work on the SciDAC 2005 Web site and meeting agenda, for lining up our corporate sponsors, for coordinating all media communications, and for her efforts in processing the proceedings contributions, to Sherry Hempfling for coordinating the overall SciDAC 2005 meeting planning, for handling a significant share of its associated communications, and for coordinating with the ORNL Conference Center and Grand Hyatt, to Angela Harris for producing many of the documents and records on which our meeting planning was based and for her efforts in coordinating with ORNL Graphics Services, to Angie Beach of the ORNL Conference Center for her efforts in procurement and setting up and executing the contracts with the hotel, and to John Bui and John Smith for their superb wireless networking and A/V set up and support. We are grateful for the relentless efforts of all of these individuals, their remarkable talents, and for the joy of working with them during this past year. They were the cornerstones of SciDAC 2005. Thanks also go to Kymba A'Hearn and Patty Boyd for on-site registration, Brittany Hagen for administrative support, Bruce Johnston for netcast support, Tim Jones for help with the proceedings and Web site, Sherry Lamb for housing and registration, Cindy Lathum for Web site design, Carolyn Peters for on-site registration, and Dami Rich for graphic design. And we would like to express our appreciation to the Oak Ridge National Laboratory, especially Jeff Nichols, the Argonne National Laboratory, the Lawrence Berkeley National Laboratory, and to our corporate sponsors, Cray, IBM, Intel, and SGI, for their support.
We would like to extend special thanks also to our plenary speakers, technical speakers, poster presenters, and panelists for all of their efforts on behalf of SciDAC 2005 and for their remarkable achievements and contributions. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas and Margaret Smith of Institute of Physics Publishing, who worked tirelessly in order to provide us with this finished volume within two months, which is nothing short of miraculous. Finally, we wish to express our heartfelt thanks to Michael Strayer, SciDAC Director, whose vision it was to focus SciDAC 2005 on scientific discovery, around which all of the excitement we experienced revolved, and to our DOE SciDAC program managers, especially Fred Johnson, for their support, input, and help throughout.
[Standards in Medical Informatics: Fundamentals and Applications].
Suárez-Obando, Fernando; Camacho Sánchez, Jhon
2013-09-01
The use of computers in medical practice has enabled novel forms of communication to be developed in health care. The optimization of communication processes is achieved through the use of standards to harmonize the exchange of information and provide a common language for all those involved. This article describes the concept of a standard applied to medical informatics and its importance in the development of various applications, such as computational representation of medical knowledge, disease classification and coding systems, medical literature searches and integration of biological and clinical sciences. Copyright © 2013 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.
Basic energy sciences: Summary of accomplishments
NASA Astrophysics Data System (ADS)
1990-05-01
For more than four decades, the Department of Energy, including its predecessor agencies, has supported a program of basic research in nuclear- and energy related sciences, known as Basic Energy Sciences. The purpose of the program is to explore fundamental phenomena, create scientific knowledge, and provide unique user facilities necessary for conducting basic research. Its technical interests span the range of scientific disciplines: physical and biological sciences, geological sciences, engineering, mathematics, and computer sciences. Its products and facilities are essential to technology development in many of the more applied areas of the Department's energy, science, and national defense missions. The accomplishments of Basic Energy Sciences research are numerous and significant. Not only have they contributed to Departmental missions, but have aided significantly the development of technologies which now serve modern society daily in business, industry, science, and medicine. In a series of stories, this report highlights 22 accomplishments, selected because of their particularly noteworthy contributions to modern society. A full accounting of all the accomplishments would be voluminous. Detailed documentation of the research results can be found in many thousands of articles published in peer-reviewed technical literature.
Basic Energy Sciences: Summary of Accomplishments
DOE R&D Accomplishments Database
1990-05-01
For more than four decades, the Department of Energy, including its predecessor agencies, has supported a program of basic research in nuclear- and energy-related sciences, known as Basic Energy Sciences. The purpose of the program is to explore fundamental phenomena, create scientific knowledge, and provide unique "user" facilities necessary for conducting basic research. Its technical interests span the range of scientific disciplines: physical and biological sciences, geological sciences, engineering, mathematics, and computer sciences. Its products and facilities are essential to technology development in many of the more applied areas of the Department's energy, science, and national defense missions. The accomplishments of Basic Energy Sciences research are numerous and significant. Not only have they contributed to Departmental missions, but have aided significantly the development of technologies which now serve modern society daily in business, industry, science, and medicine. In a series of stories, this report highlights 22 accomplishments, selected because of their particularly noteworthy contributions to modern society. A full accounting of all the accomplishments would be voluminous. Detailed documentation of the research results can be found in many thousands of articles published in peer-reviewed technical literature.
Laboratory directed research and development program FY 1999
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Todd; Levy, Karin
2000-03-08
The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness. This is the annual report on the Laboratory Directed Research and Development (LDRD) program for FY99.
Computational physics of the mind
NASA Astrophysics Data System (ADS)
Duch, Włodzisław
1996-08-01
In the XIX century and earlier, physicists such as Newton, Mayer, Hooke, Helmholtz and Mach were actively engaged in research on psychophysics, trying to relate psychological sensations to the intensities of physical stimuli. Computational physics allows us to simulate complex neural processes, giving us a chance to answer not only the original psychophysical questions but also to create models of the mind. In this paper several approaches relevant to modeling of the mind are outlined. Since direct modeling of brain functions is rather limited by the complexity of such models, a number of approximations are introduced. The path from the brain, or computational neurosciences, to the mind, or cognitive sciences, is sketched, with emphasis on higher cognitive functions such as memory and consciousness. No fundamental problems in understanding the mind seem to arise. From a computational point of view, realistic models require massively parallel architectures.
NASA Astrophysics Data System (ADS)
Klopfer, Eric; Scheintaub, Hal; Huang, Wendy; Wendel, Daniel
Computational approaches to science are radically altering the nature of scientific investigation. Yet these computer programs and simulations are sparsely used in science education, and when they are used, they are typically “canned” simulations which are black boxes to students. StarLogo The Next Generation (TNG) was developed to make programming of simulations more accessible for students and teachers. StarLogo TNG builds on the StarLogo tradition of agent-based modeling for students and teachers, with the added features of a graphical programming environment and a three-dimensional (3D) world. The graphical programming environment reduces the learning curve of programming, especially syntax. The 3D graphics make for a more immersive and engaging experience for students, including making it easy to design and program their own video games. Another change to StarLogo TNG is a fundamental restructuring of the virtual machine to make it more transparent. As a result of these changes, classroom use of TNG is expanding to new areas. The chapter concludes with a description of field tests conducted in middle and high school science classes.
USDA-ARS?s Scientific Manuscript database
This study guide provides comments and references for professional soil scientists who are studying for the soil science fundamentals exam needed as the first step for certification. The performance objectives were determined by the Soil Science Society of America's Council of Soil Science Examiners...
Reconceptualizing the Nature of Science for Science Education: Why Does it Matter?
ERIC Educational Resources Information Center
Dagher, Zoubeida R.; Erduran, Sibel
2016-01-01
Two fundamental questions about science are relevant for science educators: (a) What is the nature of science? and (b) what aspects of nature of science should be taught and learned? They are fundamental because they pertain to how science gets to be framed as a school subject and determines what aspects of it are worthy of inclusion in school…
NASA Astrophysics Data System (ADS)
Assadi, Amir H.
2001-11-01
Perceptual geometry is an emerging field of interdisciplinary research whose objectives focus on study of geometry from the perspective of visual perception, and in turn, apply such geometric findings to the ecological study of vision. Perceptual geometry attempts to answer fundamental questions in perception of form and representation of space through synthesis of cognitive and biological theories of visual perception with geometric theories of the physical world. Perception of form and space are among fundamental problems in vision science. In recent cognitive and computational models of human perception, natural scenes are used systematically as preferred visual stimuli. Among key problems in perception of form and space, we have examined perception of geometry of natural surfaces and curves, e.g. as in the observer's environment. Besides a systematic mathematical foundation for a remarkably general framework, the advantages of the Gestalt theory of natural surfaces include a concrete computational approach to simulate or recreate images whose geometric invariants and quantities might be perceived and estimated by an observer. The latter is at the very foundation of understanding the nature of perception of space and form, and the (computer graphics) problem of rendering scenes to visually invoke virtual presence.
Grid Computing for Earth Science
NASA Astrophysics Data System (ADS)
Renard, Philippe; Badoux, Vincent; Petitdidier, Monique; Cossu, Roberto
2009-04-01
The fundamental challenges facing humankind at the beginning of the 21st century require an effective response to the massive changes that are putting increasing pressure on the environment and society. The worldwide Earth science community, with its mosaic of disciplines and players (academia, industry, national surveys, international organizations, and so forth), provides a scientific basis for addressing issues such as the development of new energy resources; a secure water supply; safe storage of nuclear waste; the analysis, modeling, and mitigation of climate changes; and the assessment of natural and industrial risks. In addition, the Earth science community provides short- and medium-term prediction of weather and natural hazards in real time, and model simulations of a host of phenomena relating to the Earth and its space environment. These capabilities require that the Earth science community utilize, both in real and remote time, massive amounts of data, which are usually distributed among many different organizations and data centers.
Semivariogram modeling by weighted least squares
Jian, X.; Olea, R.A.; Yu, Y.-S.
1996-01-01
Permissible semivariogram models are fundamental for geostatistical estimation and simulation of attributes having a continuous spatiotemporal variation. The usual practice is to fit those models manually to experimental semivariograms. Fitting by weighted least squares produces results comparable to manual fitting, in less time and systematically, and provides an Akaike information criterion for the proper comparison of alternative models. We illustrate the application of a computer program with examples showing the fitting of simple and nested models. Copyright © 1996 Elsevier Science Ltd.
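A minimal sketch of such a weighted least-squares fit, assuming a spherical semivariogram model and weighting each lag by its number of point pairs via scipy.optimize.curve_fit; the experimental semivariogram values below are fabricated for illustration, and this weighting scheme is one common choice rather than necessarily the one used in the paper.

import numpy as np
from scipy.optimize import curve_fit

def spherical(h, nugget, sill, rng):
    # Spherical model: rises from the nugget and levels off at the sill
    # once the lag distance h reaches the range rng.
    h = np.asarray(h, dtype=float)
    rising = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    return np.where(h < rng, rising, sill)

# Hypothetical experimental semivariogram: lag distances, semivariances,
# and the number of point pairs behind each lag estimate.
lags   = np.array([ 5., 10., 15., 20., 25., 30., 35., 40.])
gammas = np.array([0.22, 0.41, 0.57, 0.70, 0.78, 0.81, 0.83, 0.82])
npairs = np.array([ 40,  85, 120, 150, 160, 140, 110,  60])

# Weighted least squares: a smaller sigma means a larger weight, so lags
# estimated from many point pairs pull the fit more strongly.
params, _ = curve_fit(spherical, lags, gammas,
                      p0=[0.1, 0.8, 30.0],
                      sigma=1.0 / np.sqrt(npairs))
print("nugget=%.3f  sill=%.3f  range=%.1f" % tuple(params))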
NASA Technical Reports Server (NTRS)
Davis, Bruce E.; Elliot, Gregory
1989-01-01
Jackson State University recently established the Center for Spatial Data Research and Applications, a Geographical Information System (GIS) and remote sensing laboratory. Taking advantage of new technologies and new directions in the spatial (geographic) sciences, JSU is building a Center of Excellence in Spatial Data Management. New opportunities for research, applications, and employment are emerging. GIS requires fundamental shifts in, and places new demands on, traditional computer science and geographic training. The Center is not merely another computer lab but one setting the pace in a new applied frontier. GIS and its associated technologies are discussed. The Center's facilities are described. An ARC/INFO GIS runs on a VAX mainframe, with numerous workstations. Image processing packages include ELAS, LIPS, VICAR, and ERDAS. A host of hardware and software peripherals is used in support. Numerous projects are underway, such as the construction of a Gulf of Mexico environmental database, development of AI in image processing, a land use dynamics study of metropolitan Jackson, and others. A new academic interdisciplinary program in Spatial Data Management is under development, combining courses in Geography and Computer Science. The broad range of JSU's GIS and remote sensing activities is addressed. The impacts on changing paradigms in the university and in the professional world conclude the discussion.
NASA Astrophysics Data System (ADS)
Mansbach, Rachael; Ferguson, Andrew
Self-assembling π-conjugated peptides are attractive candidates for the fabrication of bioelectronic materials possessing optoelectronic properties due to electron delocalization over the conjugated peptide groups. We present a computational and theoretical study of an experimentally-realized optoelectronic peptide that displays triggerable assembly at low pH, resolving the microscopic effects of flow and pH on the non-equilibrium morphology and kinetics of assembly. Using a combination of molecular dynamics simulations and hydrodynamic modeling, we quantify the time and length scales at which the convective flows employed in directed assembly compete with microscopic diffusion to influence assembly. We also show that there is a critical pH below which aggregation proceeds irreversibly, and quantify the relationship between pH, charge density, and aggregate size. Our work provides new fundamental understanding of the effects of pH and flow on non-equilibrium π-conjugated peptide assembly, and lays the groundwork for the rational manipulation of environmental conditions and peptide chemistry to control assembly and the attendant emergent optoelectronic properties. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, under Award # DE-SC0011847, and by the Computational Science and Engineering Fellowship from the University of Illinois at Urbana-Champaign.
Mannava, Sandeep; Plate, Johannes F; Tuohy, Christopher J; Seyler, Thorsten M; Whitlock, Patrick W; Curl, Walton W; Smith, Thomas L; Saul, Katherine R
2013-07-01
The purpose of this article is to review basic science studies using various animal models for rotator cuff research and to describe structural, biomechanical, and functional changes to muscle following rotator cuff tears. The use of computational simulations to translate the findings from animal models to human scale is further detailed. A comprehensive review was performed of the basic science literature describing the use of animal models and simulation analysis to examine muscle function following rotator cuff injury and repair in the ageing population. The findings from various studies of rotator cuff pathology emphasize the importance of preventing permanent muscular changes with detrimental results. In vivo muscle function, electromyography, and passive muscle-tendon unit properties were studied before and after supraspinatus tenotomy in a rodent rotator cuff injury model (acute vs chronic). Then, a series of simulation experiments were conducted using a validated computational human musculoskeletal shoulder model to assess both passive and active tension of rotator cuff repairs based on surgical positioning. Outcomes of rotator cuff repair may be improved by earlier surgical intervention, with lower surgical repair tensions and fewer electromyographic neuromuscular changes. An integrated approach of animal experiments, computer simulation analyses, and clinical studies may allow us to gain a fundamental understanding of the underlying pathology and interpret the results for clinical translation.
Markowitz, Dina G; DuPré, Michael J
2007-01-01
The University of Rochester's Graduate Experience in Science Education (GESE) course familiarizes biomedical science graduate students interested in pursuing academic career tracks with a fundamental understanding of some of the theory, principles, and concepts of science education. This one-semester elective course provides graduate students with practical teaching and communication skills to help them better relate science content to, and increase their confidence in, their own teaching abilities. The 2-h weekly sessions include an introduction to cognitive hierarchies, learning styles, and multiple intelligences; modeling and coaching some practical aspects of science education pedagogy; lesson-planning skills; an introduction to instructional methods such as case studies and problem-based learning; and use of computer-based instructional technologies. It is hoped that the early development of knowledge and skills about teaching and learning will encourage graduate students to continue their growth as educators throughout their careers. This article summarizes the GESE course and presents evidence on the effectiveness of this course in providing graduate students with information about teaching and learning that they will use throughout their careers.
Fundamental care and knowledge interests: Implications for nursing science.
Granero-Molina, José; Fernández-Sola, Cayetano; Mateo-Aguilar, Ester; Aranda-Torres, Cayetano; Román-López, Pablo; Hernández-Padilla, José Manuel
2018-06-01
To characterise the intratheoretical interests of knowledge in nursing science as an epistemological framework for fundamental care. For Jürgen Habermas, theory does not separate knowledge interests from life. All knowledge, understanding and human research is always interested. Habermas formulated the knowledge interests in empirical-analytical, historical-hermeneutic and critical social sciences; but said nothing about health sciences and nursing science. Discursive paper. The article is organised into five sections that develop our argument about the implications of the Habermasian intratheoretical interests in nursing science and fundamental care: the persistence of a technical interest, the predominance of a practical interest, the importance of an emancipatory interest, "being there" to understand individuals' experience and an "existential crisis" that uncovers the individual's subjectivity. The nursing discipline can take on practical and emancipatory interests (together with a technical interest) as its fundamental knowledge interests. Nurses' privileged position in the delivery of fundamental care gives them the opportunity to gain a deep understanding of the patient's experience and illness process through physical contact and empathic communication. In clinical, academic and research environments, nurses should highlight the importance of fundamental care, showcasing the value of practical and emancipatory knowledge. This process could help to improve nursing science's leadership, social visibility and idiosyncrasy. © 2017 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Kortenkamp, S.; Baldridge, A. M.; Bleamaster, L. F.; Buxner, S.; Canizo, T.; Crown, D. A.; Lebofsky, L. A.
2012-12-01
The Planetary Science Institute (PSI), in partnership with the Tucson Regional Science Center, offers a series of professional development workshops targeting K-8 science teachers in southern Arizona. Using NASA data sets, research results, and a team of PSI scientists and educators, our workshops provide teachers with in-depth content knowledge of fundamental concepts in astronomy, geology, and planetary science. Current workshops are: The Earth-Moon System, Exploring the Terrestrial Planets, Impact Cratering, The Asteroid-Meteorite Connection, Volcanoes of the Solar System, Deserts of the Solar System, and Astrobiology and the Search for Extrasolar Planets. Several workshops incorporate customized computer visualizations developed at PSI. These visualizations are designed to help teachers overcome the common misconceptions students have in fundamental areas of space science. For example, the simple geometric relationship between the sun, the moon, and Earth is a concept that is rife with misconceptions. How can the arrangement of these objects account for the constantly changing phases of the moon as well as the occasional eclipses of the sun and moon? Students at all levels often struggle to understand the explanation for phases and eclipses even after repeated instruction over many years. Traditional classroom techniques have proven to be insufficient at rooting out entrenched misconceptions. One problem stems from the difficulty of developing an accurate mental picture of the Earth-Moon system in space when a student's perspective has always been firmly planted on the ground. To address this problem our visualizations take the viewers on a journey beyond Earth, giving them a so-called "god's eye" view of how the Earth-Moon system would look from a distance. To make this journey as realistic as possible we use ray-tracing software, incorporate NASA mission images, and accurately portray rotational and orbital motion. During a workshop our visualizations are used in conjunction with more traditional classroom techniques. This combination instills a greater confidence in teachers' understanding of the concepts and therefore increases their ability to teach their students. To date we have produced over 100 unique visualizations to demonstrate many different fundamental concepts in the Earth and space sciences. Participants in each workshop are provided with digital copies of the visualizations in a variety of file formats. They also receive Keynote and PowerPoint templates pre-embedded with the visualizations to facilitate straightforward use on Macs or PCs in their classrooms. A measure of the success of PSI's workshops is that nearly 50% of our teachers have attended multiple workshops, and teachers often cite the visualizations as one of the top benefits of their experience. Details of our workshops as well as downloadable examples of some visualizations can be found at: www.psi.edu/epo. This work is supported by NASA EPOESS award NNX10AE56G: Workshops in Science Education and Resources (WISER): Planetary Perspectives.
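As a small worked example of the Sun-Moon-Earth geometry these visualizations target (this is not the PSI visualization code, just the standard relation between phase angle and the lit fraction of the lunar disc):

import math

def illuminated_fraction(phase_angle_deg):
    # Fraction of the lunar disc that appears lit, given the Sun-Moon-Earth
    # phase angle: 0 degrees is a full moon, 180 degrees is a new moon.
    return 0.5 * (1.0 + math.cos(math.radians(phase_angle_deg)))

for angle in (0, 45, 90, 135, 180):
    print("phase angle %3d deg -> %.2f of the disc lit" % (angle, illuminated_fraction(angle)))

A phase angle of 0 degrees gives a full moon, 90 degrees a quarter moon, and 180 degrees a new moon.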
Neuromorphic Computing – From Materials Research to Systems Architecture Roundtable
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuller, Ivan K.; Stevens, Rick; Pino, Robinson
2015-10-29
Computation in its many forms is the engine that fuels our modern civilization. Modern computation—based on the von Neumann architecture—has allowed, until now, the development of continuous improvements, as predicted by Moore's law. However, computation using current architectures and materials will inevitably—within the next 10 years—reach a limit because of fundamental scientific reasons. DOE convened a roundtable of experts in neuromorphic computing systems, materials science, and computer science in Washington on October 29-30, 2015 to address the following basic questions: Can brain-like ("neuromorphic") computing devices based on new material concepts and systems be developed to dramatically outperform conventional CMOS-based technology? If so, what are the basic research challenges for materials science and computing? The overarching answer that emerged was: The development of novel functional materials and devices incorporated into unique architectures will allow a revolutionary technological leap toward the implementation of a fully "neuromorphic" computer. To address this challenge, the following issues were considered: (1) the main differences between neuromorphic and conventional computing as related to signaling models, timing/clock, non-volatile memory, architecture, fault tolerance, integrated memory and compute, noise tolerance, analog vs. digital, and in situ learning; (2) new neuromorphic architectures needed to produce lower energy consumption, potential novel nanostructured materials, and enhanced computation; (3) device and materials properties needed to implement functions such as hysteresis, stability, and fault tolerance; and (4) comparisons of different implementations (spin torque, memristors, resistive switching, phase change, and optical schemes) for enhanced breakthroughs in performance, cost, fault tolerance, and/or manufacturability.
PREFACE: International Conference on Applied Sciences 2015 (ICAS2015)
NASA Astrophysics Data System (ADS)
Lemle, Ludovic Dan; Jiang, Yiwen
2016-02-01
The International Conference on Applied Sciences ICAS2015 took place in Wuhan, China on June 3-5, 2015 at the Military Economics Academy of Wuhan. The conference is regularly organized, alternately in Romania and in P.R. China, by Politehnica University of Timişoara, Romania, and Military Economics Academy of Wuhan, P.R. China, with the joint aims to serve as a platform for exchange of information between various areas of applied sciences, and to promote the communication between the scientists of different nations, countries and continents. The topics of the conference cover a comprehensive spectrum of issues, from Economical Sciences and Defense (Management Sciences, Business Management, Financial Management, Logistics, Human Resources, Crisis Management, Risk Management, Quality Control, Analysis and Prediction, Government Expenditure, Computational Methods in Economics, Military Sciences, National Security, and others) to Fundamental Sciences and Engineering (Interdisciplinary applications of physics, Numerical approximation and analysis, Computational Methods in Engineering, Metallic Materials, Composite Materials, Metal Alloys, Metallurgy, Heat Transfer, Mechanical Engineering, Mechatronics, Reliability, Electrical Engineering, Circuits and Systems, Signal Processing, Software Engineering, Data Bases, Modeling and Simulation, and others). The conference gathered qualified researchers whose expertise can be used to develop new engineering knowledge that has applicability potential in Engineering, Economics, Defense, etc. The number of participants was 120 from 11 countries (China, Romania, Taiwan, Korea, Denmark, France, Italy, Spain, USA, Jamaica, and Bosnia and Herzegovina). During the three days of the conference four invited and 67 oral talks were delivered. Based on the work presented at the conference, 38 selected papers have been included in this volume of IOP Conference Series: Materials Science and Engineering. These papers present new research in the various fields of Materials Engineering, Mechanical Engineering, Computers Engineering, and Electrical Engineering. It is our great pleasure to present this volume of IOP Conference Series: Materials Science and Engineering to the scientific community to promote further research in these areas. We sincerely hope that the papers published in this volume will contribute to the advancement of knowledge in the respective fields.
Computing exponentially faster: implementing a non-deterministic universal Turing machine using DNA
Currin, Andrew; Korovin, Konstantin; Ababi, Maria; Roper, Katherine; Kell, Douglas B.; Day, Philip J.
2017-01-01
The theory of computer science is based around universal Turing machines (UTMs): abstract machines able to execute all possible algorithms. Modern digital computers are physical embodiments of classical UTMs. For the most important class of problems in computer science, non-deterministic polynomial complete problems, non-deterministic UTMs (NUTMs) are theoretically exponentially faster than both classical UTMs and quantum mechanical UTMs (QUTMs). However, no attempt has previously been made to build an NUTM, and their construction has been regarded as impossible. Here, we demonstrate the first physical design of an NUTM. This design is based on Thue string rewriting systems, and thereby avoids the limitations of most previous DNA computing schemes: all the computation is local (simple edits to strings) so there is no need for communication, and there is no need to order operations. The design exploits DNA's ability to replicate to execute an exponential number of computational paths in P time. Each Thue rewriting step is embodied in a DNA edit implemented using a novel combination of polymerase chain reactions and site-directed mutagenesis. We demonstrate that the design works using both computational modelling and in vitro molecular biology experimentation: the design is thermodynamically favourable, microprogramming can be used to encode arbitrary Thue rules, all classes of Thue rule can be implemented, and rules can be implemented non-deterministically. In an NUTM, the resource limitation is space, which contrasts with classical UTMs and QUTMs where it is time. This fundamental difference enables an NUTM to trade space for time, which is significant for both theoretical computer science and physics. It is also of practical importance, for to quote Richard Feynman 'there's plenty of room at the bottom'. This means that a desktop DNA NUTM could potentially utilize more processors than all the electronic computers in the world combined, and thereby outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy. PMID:28250099
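To visualize why the non-deterministic model multiplies computational paths, here is a hedged classical sketch (Python; the rule set and start string are hypothetical toy choices, unrelated to the DNA chemistry described above) that enumerates every string reachable from a start string under a small Thue rule set, one rewrite at a time:

    # Illustrative only: breadth-first enumeration of every rewrite path allowed
    # by a toy Thue rule set on a classical machine.
    from collections import deque

    def thue_successors(s, rules):
        """Yield every string reachable from s by one rule application."""
        for lhs, rhs in rules:
            start = s.find(lhs)
            while start != -1:
                yield s[:start] + rhs + s[start + len(lhs):]
                start = s.find(lhs, start + 1)

    def reachable(start, rules, max_steps):
        """Collect all strings reachable within max_steps rewrites."""
        seen = {start}
        frontier = deque([(start, 0)])
        while frontier:
            s, depth = frontier.popleft()
            if depth == max_steps:
                continue
            for t in thue_successors(s, rules):
                if t not in seen:
                    seen.add(t)
                    frontier.append((t, depth + 1))
        return seen

    rules = [("ab", "ba"), ("ba", "ab")]   # hypothetical Thue rules
    print(len(reachable("aabb", rules, 4)))

A classical machine must visit these branches one after another; the point of the NUTM design is that replicating DNA explores them in parallel.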
Optimized Materials From First Principles Simulations: Are We There Yet?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galli, G; Gygi, F
2005-07-26
In the past thirty years, the use of scientific computing has become pervasive in all disciplines: collection and interpretation of most experimental data is carried out using computers, and physical models in computable form, with various degrees of complexity and sophistication, are utilized in all fields of science. However, full prediction of physical and chemical phenomena based on the basic laws of Nature, using computer simulations, is a revolution still in the making, and it involves some formidable theoretical and computational challenges. We illustrate the progress and successes obtained in recent years in predicting fundamental properties of materials in condensed phases and at the nanoscale, using ab-initio, quantum simulations. We also discuss open issues related to the validation of the approximate, first principles theories used in large scale simulations, and the resulting complex interplay between computation and experiment. Finally, we describe some applications, with focus on nanostructures and liquids, both at ambient and under extreme conditions.
Human white matter and knowledge representation.
Pestilli, Franco
2018-04-01
Understanding how knowledge is represented in the human brain is a fundamental challenge in neuroscience. To date, most of the work on this topic has focused on knowledge representation in cortical areas and debated whether knowledge is represented in a distributed or localized fashion. Fang and colleagues provide evidence that brain connections and the white matter supporting such connections might play a significant role. The work opens new avenues of investigation, breaking through disciplinary boundaries across network neuroscience, computational neuroscience, cognitive science, and classical lesion studies.
NASA aeronautics R&T - A resource for aircraft design
NASA Technical Reports Server (NTRS)
Olstad, W. B.
1981-01-01
This paper discusses the NASA aeronautics research and technology program from the viewpoint of the aircraft designer. The program spans the range from fundamental research to the joint validation with industry of technology for application into product development. Examples of recent developments in structures, materials, aerodynamics, controls, propulsion systems, and safety technology are presented as new additions to the designer's handbook. Finally, the major thrusts of NASA's current and planned programs which are keyed to revolutionary advances in materials science, electronics, and computer technology are addressed.
NASA Astrophysics Data System (ADS)
Chang, S. S. L.
State of the art technology in circuits, fields, and electronics is discussed. The principles and applications of these technologies to industry, digital processing, microwave semiconductors, and computer-aided design are explained. Important concepts and methodologies in mathematics and physics are reviewed, and basic engineering sciences and associated design methods are dealt with, including: circuit theory and the design of magnetic circuits and active filter synthesis; digital signal processing, including FIR and IIR digital filter design; transmission lines, electromagnetic wave propagation and surface acoustic wave devices. Also considered are: electronics technologies, including power electronics, microwave semiconductors, GaAs devices, and magnetic bubble memories; digital circuits and logic design.
NASA Technical Reports Server (NTRS)
Brooks, Rodney Allen; Stein, Lynn Andrea
1994-01-01
We describe a project to capitalize on newly available levels of computational resources in order to understand human cognition. We will build an integrated physical system including vision, sound input and output, and dextrous manipulation, all controlled by a continuously operating large scale parallel MIMD computer. The resulting system will learn to 'think' by building on its bodily experiences to accomplish progressively more abstract tasks. Past experience suggests that in attempting to build such an integrated system we will have to fundamentally change the way artificial intelligence, cognitive science, linguistics, and philosophy think about the organization of intelligence. We expect to be able to better reconcile the theories that will be developed with current work in neuroscience.
Thermodynamically consistent data-driven computational mechanics
NASA Astrophysics Data System (ADS)
González, David; Chinesta, Francisco; Cueto, Elías
2018-05-01
In the paradigm of data-intensive science, automated, unsupervised discovering of governing equations for a given physical phenomenon has attracted a lot of attention in several branches of applied sciences. In this work, we propose a method able to avoid the identification of the constitutive equations of complex systems and rather work in a purely numerical manner by employing experimental data. In sharp contrast to most existing techniques, this method does not rely on the assumption on any particular form for the model (other than some fundamental restrictions placed by classical physics such as the second law of thermodynamics, for instance) nor forces the algorithm to find among a predefined set of operators those whose predictions fit best to the available data. Instead, the method is able to identify both the Hamiltonian (conservative) and dissipative parts of the dynamics while satisfying fundamental laws such as energy conservation or positive production of entropy, for instance. The proposed method is tested against some examples of discrete as well as continuum mechanics, whose accurate results demonstrate the validity of the proposed approach.
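The following is not the authors' identification algorithm; it is only a minimal sketch (Python; all constants are toy values) of the structural constraint the paper enforces: the dynamics of a damped oscillator are split into a conservative part and a dissipative part, and the discrete energy is tracked to confirm that only the dissipative part drains it.

    import numpy as np

    k, c, dt = 4.0, 0.3, 1e-3            # stiffness, damping, time step (toy values)
    q, p = 1.0, 0.0                       # initial position and momentum
    energy = []
    for _ in range(20000):
        p -= dt * k * q                   # conservative (Hamiltonian) force
        p -= dt * c * p                   # dissipative contribution
        q += dt * p                       # symplectic-Euler position update
        energy.append(0.5 * p * p + 0.5 * k * q * q)
    print(energy[0], energy[-1])          # total energy decays as the dissipative term removes it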
NASA Astrophysics Data System (ADS)
Aharonov, Dorit
In the last few years, theoretical study of quantum systems serving as computational devices has achieved tremendous progress. We now have strong theoretical evidence that quantum computers, if built, might be used as a dramatically powerful computational tool, capable of performing tasks which seem intractable for classical computers. This review is about to tell the story of theoretical quantum computation. I left out the developing topic of experimental realizations of the model, and neglected other closely related topics which are quantum information and quantum communication. As a result of narrowing the scope of this paper, I hope it has gained the benefit of being an almost self contained introduction to the exciting field of quantum computation. The review begins with background on theoretical computer science, Turing machines and Boolean circuits. In light of these models, I define quantum computers, and discuss the issue of universal quantum gates. Quantum algorithms, including Shor's factorization algorithm and Grover's algorithm for searching databases, are explained. I will devote much attention to understanding what the origins of the quantum computational power are, and what the limits of this power are. Finally, I describe the recent theoretical results which show that quantum computers maintain their complexity power even in the presence of noise, inaccuracies and finite precision. This question cannot be separated from that of quantum complexity because any realistic model will inevitably be subjected to such inaccuracies. I tried to put all results in their context, asking what the implications to other issues in computer science and physics are. In the end of this review, I make these connections explicit by discussing the possible implications of quantum computation on fundamental physical questions such as the transition from quantum to classical physics.
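Where the review walks through Grover's search algorithm, a short classical simulation can make the amplitude-amplification mechanism concrete. The sketch below (Python with NumPy; the problem size and marked index are arbitrary choices, not taken from the review) applies the oracle and the inversion-about-the-mean diffusion step roughly (pi/4)*sqrt(N) times and prints the resulting success probability.

    # Classical simulation of Grover's search on N = 16 items with one marked index.
    import numpy as np

    N, marked = 16, 3
    psi = np.full(N, 1 / np.sqrt(N))          # uniform superposition
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        psi[marked] *= -1                     # oracle: flip the marked amplitude
        psi = 2 * psi.mean() - psi            # diffusion: inversion about the mean
    print(iterations, psi[marked] ** 2)       # 3 iterations, success probability ~0.96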
Efficient Memory Access with NumPy Global Arrays using Local Memory Access
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daily, Jeffrey A.; Berghofer, Dan C.
This paper discusses work completed on Global Arrays of data on distributed multi-computer systems and on improving their performance. The tasks were completed at Pacific Northwest National Laboratory in the Science Undergraduate Laboratory Internship program in the summer of 2013 for the Data Intensive Computing Group in the Fundamental and Computational Sciences Directorate. The work was done on the Global Arrays Toolkit developed by this group. This toolkit is an interface that lets programmers more easily create arrays of data spread across networks of computers. This is useful because scientific computation is often done on large amounts of data, sometimes so large that individual computers cannot hold all of it. The data are held in array form and can best be processed on supercomputers, which often consist of a network of individual computers doing their computation in parallel. One major challenge for this sort of programming is that operating on arrays spread over multiple computers is very complex, so an interface is needed that makes these arrays appear to reside on a single computer. This is what Global Arrays does. The work described here introduces more efficient operations on that data that require less copying, which saves a great deal of time because copying data across many different computers is time intensive. The approach is as follows: when the operands of a binary operation reside on the same computer, they are accessed without being copied; when they reside on separate computers, only one operand is copied. This saves time through reduced copying, even though more data-access operations are performed.
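A minimal single-node sketch of the access pattern described above, written with plain NumPy rather than the Global Arrays Toolkit itself (the function names and the mocked "remote" case are hypothetical, introduced only to illustrate the copy-versus-view distinction):

    import numpy as np

    local_block = np.arange(1_000_000, dtype=np.float64)

    def local_view(arr, lo, hi):
        """Zero-copy access: slicing returns a view of the same buffer."""
        return arr[lo:hi]

    def remote_get(arr, lo, hi):
        """Copying access, standing in for a fetch from another node."""
        return arr[lo:hi].copy()

    a = local_view(local_block, 0, 500_000)
    b = remote_get(local_block, 500_000, 1_000_000)
    a += b                                           # in-place add writes straight into local_block
    print(a.base is local_block, b.base is local_block)   # True False: only b was copied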
Future fundamental combustion research for aeropropulsion systems
NASA Technical Reports Server (NTRS)
Mularz, E. J.
1985-01-01
Physical fluid mechanics, heat transfer, and chemical kinetic processes which occur in the combustion chamber of aeropropulsion systems were investigated. With the component requirements becoming more severe for future engines, the current design methodology needs new tools to obtain the optimum configuration in a reasonable design and development cycle. Research efforts in the last few years have been encouraging, but to achieve these benefits, research into the fundamental aerothermodynamic processes of combustion is required. It is recommended that research continue in the areas of flame stabilization, combustor aerodynamics, heat transfer, multiphase flow and atomization, turbulent reacting flows, and chemical kinetics. Associated with each of these engineering sciences is the need for research into computational methods to accurately describe and predict these complex physical processes. Research needs in each of these areas are highlighted.
BioSIGHT: Interactive Visualization Modules for Science Education
NASA Technical Reports Server (NTRS)
Wong, Wee Ling
1998-01-01
Redefining science education to harness emerging integrated media technologies with innovative pedagogical goals represents a unique challenge. The Integrated Media Systems Center (IMSC) is the only engineering research center in the area of multimedia and creative technologies sponsored by the National Science Foundation. The research program at IMSC is focused on developing advanced technologies that address human-computer interfaces, database management, and high-speed network capabilities. The BioSIGHT project at IMSC is a demonstration technology project in the area of education that seeks to address how such emerging multimedia technologies can make an impact on science education. The scope of this project will help solidify NASA's commitment to the development of innovative educational resources that promote science literacy for our students and the general population as well. These issues must be addressed as NASA marches toward the goal of enabling human space exploration, which requires an understanding of life sciences in space. The IMSC BioSIGHT lab was established with the purpose of developing a novel methodology that will map a high school biology curriculum into a series of interactive visualization modules that can be easily incorporated into a space biology curriculum. Fundamental concepts in general biology must be mastered in order to allow a better understanding and application of space biology. Interactive visualization is a powerful component that can capture the students' imagination, facilitate their assimilation of complex ideas, and help them develop integrated views of biology. These modules will augment the role of the teacher and will establish the value of student-centered interactivity, both in an individual setting as well as in a collaborative learning environment. Students will be able to interact with the content material, explore new challenges, and perform virtual laboratory simulations. The BioSIGHT effort is truly cross-disciplinary in nature and requires expertise from many areas including Biology, Computer Science, Electrical Engineering, Education, and the Cognitive Sciences. The BioSIGHT team includes a scientific illustrator, educational software designer, computer programmers as well as IMSC graduate and undergraduate students.
Evaluating Cloud Computing in the Proposed NASA DESDynI Ground Data System
NASA Technical Reports Server (NTRS)
Tran, John J.; Cinquini, Luca; Mattmann, Chris A.; Zimdars, Paul A.; Cuddy, David T.; Leung, Kon S.; Kwoun, Oh-Ig; Crichton, Dan; Freeborn, Dana
2011-01-01
The proposed NASA Deformation, Ecosystem Structure and Dynamics of Ice (DESDynI) mission would be a first-of-breed endeavor that would fundamentally change the paradigm by which Earth Science data systems at NASA are built. DESDynI is evaluating a distributed architecture where expert science nodes around the country all engage in some form of mission processing and data archiving. This is compared to the traditional NASA Earth Science missions where the science processing is typically centralized. What's more, DESDynI is poised to profoundly increase the amount of data collection and processing well into the 5 terabyte/day and tens of thousands of jobs per day range, both of which pose a tremendous challenge to DESDynI's proposed distributed data system architecture. In this paper, we report on a set of architectural trade studies and benchmarks meant to inform the DESDynI mission and the broader community of the impacts of these unprecedented requirements. In particular, we evaluate the benefits of cloud computing and its integration with our existing NASA ground data system software called Apache Object Oriented Data Technology (OODT). The preliminary conclusions of our study suggest that the use of the cloud and OODT together synergistically form an effective, efficient and extensible combination that could meet the challenges of NASA science missions requiring DESDynI-like data collection and processing volumes at reduced costs.
Laboratory Directed Research and Development Program FY 2008 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
editor, Todd C Hansen
2009-02-23
The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness. Berkeley Lab's research and the Laboratory Directed Research and Development (LDRD) program support DOE's Strategic Themes that are codified in DOE's 2006 Strategic Plan (DOE/CF-0010), with a primary focus on Scientific Discovery and Innovation. For that strategic theme, the Fiscal Year (FY) 2008 LDRD projects support each one of the three goals through multiple strategies described in the plan. In addition, LDRD efforts support the four goals of Energy Security, the two goals of Environmental Responsibility, and Nuclear Security (unclassified fundamental research that supports stockpile safety and nonproliferation programs). The LDRD program supports Office of Science strategic plans, including the 20-year Scientific Facilities Plan and the Office of Science Strategic Plan. The research also supports the strategic directions periodically under consideration and review by the Office of Science Program Offices, such as LDRD projects germane to new research facility concepts and new fundamental science directions. The Berkeley Lab LDRD program also plays an important role in leveraging DOE capabilities for national needs. The fundamental scientific research and development conducted in the program advances the skills and technologies of importance to our Work For Others (WFO) sponsors. Among many directions, these include a broad range of health-related science and technology of interest to the National Institutes of Health, breast cancer and accelerator research supported by the Department of Defense, detector technologies that should be useful to the Department of Homeland Security, and particle detection that will be valuable to the Environmental Protection Agency. The Berkeley Lab Laboratory Directed Research and Development Program FY2008 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the supported projects and summarizes their accomplishments. It constitutes a part of the LDRD program planning and documentation process that includes an annual planning cycle, project selection, implementation, and review.
Your Higgs number - how fundamental physics is connected to technology and societal revolutions
NASA Astrophysics Data System (ADS)
Lidström, Suzy; Allen, Roland E.
2015-03-01
Fundamental physics, as exemplified by the recently discovered Higgs boson, often appears to be completely disconnected from practical applications and ordinary human life. But this is not really the case, because science, technology, and human affairs are profoundly integrated in ways that are not immediately obvious. We illustrate this by defining a ``Higgs number'' through overlapping activities. Following three different paths, which end respectively in applications of the World Wide Web, digital photography, and modern electronic devices, we find that most people have a Higgs number of no greater than 3. Specific examples chosen for illustration, with their assigned Higgs numbers, are: LHC experimentalists employing the Worldwide Computing Grid (0) - Timothy Berners-Lee (1) - Marissa Mayer, of Google and Yahoo, and Sheryl Sandberg, of Facebook (2) - users of all web-based enterprises (3). CMS and ATLAS experimentalists (0) - particle detector developers (1) - inventors of CCDs and active-pixel sensors (2) - users of digital cameras and camcorders (3). Philip Anderson (0) - John Bardeen (1) - Jack Kilby (2) - users of personal computers, mobile phones, and all other modern electronic devices (3).
Fundamental Science with Pulsed Power: Research Opportunities and User Meeting.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mattsson, Thomas Kjell Rene; Wootton, Alan James; Sinars, Daniel Brian
The fifth Fundamental Science with Pulsed Power: Research Opportunities and User Meeting was held in Albuquerque, NM, July 20-23, 2014. The purpose of the workshop was to bring together leading scientists in four research areas with active fundamental science research at Sandia's Z facility: Magnetized Liner Inertial Fusion (MagLIF), Planetary Science, Astrophysics, and Material Science. The workshop was focused on discussing opportunities for high-impact research using Sandia's Z machine, a future 100 GPa class facility, and possible topics for growing the academic (off-Z-campus) science relevant to the Z Fundamental Science Program (ZFSP) and related projects in astrophysics, planetary science, MagLIF-relevant magnetized HED science, and materials science. The user meeting was for Z collaborative users to: a) hear about the Z accelerator facility status and plans, b) present the status of their research, and c) be provided with a venue to meet and work as groups. Following presentations by Mark Herrmann and Joel Lash on the fundamental science program on Z and the status of the Z facility were plenary sessions for the four research areas. The third day of the workshop was devoted to breakout sessions in the four research areas. The plenary and breakout sessions for the four areas were organized by Dan Sinars (MagLIF), Dylan Spaulding (Planetary Science), Don Winget and Jim Bailey (Astrophysics), and Thomas Mattsson (Material Science). Concluding the workshop was an outbrief session where the leads presented a summary of the discussions in each working group to the full workshop. A summary of discussions and conclusions from each of the research areas follows, and the outbrief slides are included as appendices.
A novel computational approach towards the certification of large-scale boson sampling
NASA Astrophysics Data System (ADS)
Huh, Joonsuk
Recent proposals of boson sampling and the corresponding experiments exhibit the possible disproof of the extended Church-Turing Thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. Until now, however, only small-scale experiments with a few photons have been successfully performed. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A novel theoretical proposal for large-scale boson sampling using microwave photons is highly promising due to the deterministic photon sources and the scalability. Therefore, a certification protocol for large-scale boson sampling experiments should be presented to complete the exciting story. We propose, in this presentation, a computational protocol towards the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional with its Fourier component can show the fingerprint of large-scale boson sampling. This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and the Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.
Expertise transfer for expert system design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boose, J.H.
This book is about the Expertise Transfer System, a computer program which interviews experts and helps them build expert systems, i.e. computer programs that use knowledge from experts to make decisions and judgements under conditions of uncertainty. The techniques are useful to anyone who uses decision-making information based on the expertise of others. The methods can also be applied to personal decision-making. The interviewing methodology is borrowed from a branch of psychology called Personal Construct Theory. It is not necessary to use a computer to take advantage of the techniques from Personal Construct Theory; the fundamental procedures used by the Expertise Transfer System can be performed using paper and pencil. It is not necessary that the reader understand very much about computers to understand the ideas in this book. The few relevant concepts from computer science and expert systems that are needed are explained in a straightforward manner. Ideas from Personal Construct Psychology are also introduced as needed.
Applications of Derandomization Theory in Coding
NASA Astrophysics Data System (ADS)
Cheraghchi, Mahdi
2011-07-01
Randomized techniques play a fundamental role in theoretical computer science and discrete mathematics, in particular for the design of efficient algorithms and construction of combinatorial objects. The basic goal in derandomization theory is to eliminate or reduce the need for randomness in such randomized constructions. In this thesis, we explore some applications of the fundamental notions in derandomization theory to problems outside the core of theoretical computer science, and in particular, certain problems related to coding theory. First, we consider the wiretap channel problem, which involves a communication system in which an intruder can eavesdrop on a limited portion of the transmissions, and construct efficient and information-theoretically optimal communication protocols for this model. Then we consider the combinatorial group testing problem. In this classical problem, one aims to determine a set of defective items within a large population by asking a number of queries, where each query reveals whether a defective item is present within a specified group of items. We use randomness condensers to explicitly construct optimal, or nearly optimal, group testing schemes for a setting where the query outcomes can be highly unreliable, as well as the threshold model where a query returns positive if the number of defectives passes a certain threshold. Finally, we design ensembles of error-correcting codes that achieve the information-theoretic capacity of a large class of communication channels, and then use the obtained ensembles for construction of explicit capacity-achieving codes. [This is a shortened version of the actual abstract in the thesis.]
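To make the group testing setup concrete, here is a hedged toy sketch (plain Python; the population size, pooling probability, and decoder are arbitrary illustrative choices, not the explicit constructions from the thesis) of a non-adaptive random pooling design decoded by eliminating every item that appears in a negative pool:

    import random

    def run_group_testing(n=100, defectives=(7, 42, 73), num_pools=40, seed=1):
        rng = random.Random(seed)
        # each item joins each pool independently with probability 0.1
        pools = [{i for i in range(n) if rng.random() < 0.1} for _ in range(num_pools)]
        outcomes = [bool(pool & set(defectives)) for pool in pools]   # positive iff a defective is present
        candidates = set(range(n))
        for pool, positive in zip(pools, outcomes):
            if not positive:
                candidates -= pool          # members of negative pools cannot be defective
        return candidates

    print(sorted(run_group_testing()))      # contains the true defectives, possibly a few extras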
Post-Cold War Science and Technology at Los Alamos
NASA Astrophysics Data System (ADS)
Browne, John C.
2002-04-01
Los Alamos National Laboratory serves the nation through the development and application of leading-edge science and technology in support of national security. Our mission supports national security by: ensuring the safety, security, and reliability of the U.S. nuclear stockpile; reducing the threat of weapons of mass destruction in support of counter terrorism and homeland defense; and solving national energy, environment, infrastructure, and health security problems. We require crosscutting fundamental and advanced science and technology research to accomplish our mission. The Stockpile Stewardship Program develops and applies advanced experimental science, computational simulation, and technology to ensure the safety and reliability of U.S. nuclear weapons in the absence of nuclear testing. This effort in itself is a grand challenge. However, the terrorist attack of September 11, 2001, reminded us of the importance of robust and vibrant research and development capabilities to meet new and evolving threats to our national security. Today, through rapid prototyping, we are applying new, innovative science and technology for homeland defense, to address the threats of nuclear, chemical, and biological weapons globally. Synergistically with the capabilities that we require for our core mission, we contribute in many other areas of scientific endeavor. For example, our Laboratory has been part of the NASA effort on mapping water on the moon and NSF/DOE projects studying high-energy astrophysical phenomena, understanding fundamental scaling phenomena of life, exploring high-temperature superconductors, investigating quantum information systems, applying neutrons to condensed-matter and nuclear physics research, developing large-scale modeling and simulations to understand complex phenomena, and exploring nanoscience that bridges the atomic to macroscopic scales. In this presentation, I will highlight some of these post-cold war science and technology advances, including our national security contributions, and discuss some of the challenges for Los Alamos in the future.
NASA Astrophysics Data System (ADS)
Walton, A. L.
2015-12-01
In 2016, the National Science Foundation (NSF) will support a portfolio of activities and investments focused upon challenges in data access, interoperability, and sustainability. These topics are fundamental to science questions of increasing complexity that require multidisciplinary approaches and expertise. Progress has become tractable because of (and sometimes complicated by) unprecedented growth in data (both simulations and observations) and rapid advances in technology (such as instrumentation in all aspects of the discovery process, together with ubiquitous cyberinfrastructure to connect, compute, visualize, store, and discover). The goal is an evolution of capabilities for the research community based on these investments, scientific priorities, technology advances, and policies. Examples from multiple NSF directorates, including investments by the Advanced Cyberinfrastructure Division, are aimed at these challenges and can provide the geosciences research community with models and opportunities for participation. Implications for the future are highlighted, along with the importance of continued community engagement on key issues.
NASA Astrophysics Data System (ADS)
Robbins, Dennis; Ford, K. E. Saavik
2018-01-01
The NSF-supported "AstroCom NYC" program, a collaboration of the City University of New York and the American Museum of Natural History (AMNH), has developed and offers hands-on workshops for undergraduate faculty on teaching scientific thought and practices. These professional development workshops emphasize a curriculum and pedagogical strategies that use computers and other digital devices in a laboratory environment to teach students fundamental topics, including: proportional reasoning, control of variables thinking, experimental design, hypothesis testing, reasoning with data, and drawing conclusions from graphical displays. The topics addressed here are rarely taught in depth during the formal undergraduate years and are frequently learned only after several apprenticeship research experiences. The goal of these workshops is to provide working and future faculty with an interactive experience in science learning and teaching using modern technological tools.
NASA Astrophysics Data System (ADS)
Burov, Alexey
Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of Mega-science. The measure of this devotion and this support expresses the real value of the fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by the technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the ``mystery of the universe''. Why do these words sound so attractive? What is implied by and what is incompatible with them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of value of pure science on metaphysics? If not, how can this issue be addressed in the public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?
Quantum Opportunities and Challenges for Fundamental Sciences in Space
NASA Technical Reports Server (NTRS)
Yu, Nan
2012-01-01
Space platforms offer a unique environment for, and unique measurements of, the quantum world and fundamental physics. Quantum technology and measurements enhance measurement capabilities in space and result in greater science returns.
A Communication-Optimal Framework for Contracting Distributed Tensors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajbhandari, Samyam; NIkam, Akshay; Lai, Pai-Wei
Tensor contractions are extremely compute intensive generalized matrix multiplication operations encountered in many computational science fields, such as quantum chemistry and nuclear physics. Unlike distributed matrix multiplication, which has been extensively studied, limited work has been done in understanding distributed tensor contractions. In this paper, we characterize distributed tensor contraction algorithms on torus networks. We develop a framework with three fundamental communication operators to generate communication-efficient contraction algorithms for arbitrary tensor contractions. We show that for a given amount of memory per processor, our framework is communication optimal for all tensor contractions. We demonstrate performance and scalability of our framework on up to 262,144 cores of the BG/Q supercomputer using five tensor contraction examples.
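As a point of reference for what a tensor contraction is computationally, the following hedged single-node sketch (NumPy; the shapes and index labels are arbitrary illustrations, not the paper's distributed operators) contracts a 4-index tensor with a matrix over one shared index using Einstein summation:

    import numpy as np

    rng = np.random.default_rng(0)
    T = rng.standard_normal((8, 8, 8, 8))   # a small 4-index tensor, e.g. integral-like data
    C = rng.standard_normal((8, 8))

    # Contract the last index of T with the first index of C:
    # R[i, j, k, b] = sum_a T[i, j, k, a] * C[a, b]
    R = np.einsum('ijka,ab->ijkb', T, C)
    print(R.shape)                           # (8, 8, 8, 8)

Distributing such an operation is about partitioning the index ranges across processors and moving only the slices each processor needs, which is where the communication operators of the framework come in.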
Scully, John R
2015-01-01
Recent advances in characterization tools, computational capabilities, and theories have created opportunities for advancement in understanding of solid-fluid interfaces at the nanoscale in corroding metallic systems. The Faraday Discussion on Corrosion Chemistry in 2015 highlighted some of the current needs, gaps and opportunities in corrosion science. Themes were organized into several hierarchical categories that provide an organizational framework for corrosion. Opportunities to develop fundamental physical and chemical data which will enable further progress in thermodynamic and kinetic modelling of corrosion were discussed. These will enable new and better understanding of the unit processes that govern corrosion at the nanoscale. Additional topics discussed included scales, films and oxides, fluid-surface and molecular-surface interactions, selected topics in corrosion science and engineering as well as corrosion control. Corrosion science and engineering topics included complex alloy dissolution, local corrosion, and modelling of specific corrosion processes that are made up of collections of temporally and spatially varying unit processes such as oxidation, ion transport, and competitive adsorption. Corrosion control and mitigation topics covered some new insights on coatings and inhibitors. Further advances in operando or in situ experimental characterization strategies at the nanoscale combined with computational modelling will enhance progress in the field, especially if coupling across length and time scales can be achieved incorporating the various phenomena encountered in corrosion. Readers are encouraged not only to use this ad hoc organizational scheme to guide their immersion into the current opportunities in corrosion chemistry, but also to find value in the information presented in their own ways.
Modeling & Simulation Education for the Acquisition and T&E Workforce: FY07 Deliverable Package
2007-12-01
Describes the fundamentals of terrestrial science (geology, oceanography, meteorology, and near-Earth space science) needed to represent how systems interact with and are influenced by their environment.
Silk: Its Mysteries, How It Is Made, and How It Is Used.
Ebrahimi, Davoud; Tokareva, Olena; Rim, Nae Gyune; Wong, Joyce Y; Kaplan, David L; Buehler, Markus J
2015-10-12
This article reviews fundamental and applied aspects of silk, one of Nature's most intriguing materials in terms of its strength, toughness, and biological role, in its various forms, from protein molecules to webs and cocoons, in the context of mechanical and biological properties. A central question that will be explored is how the bridging of scales and the emergence of hierarchical structures are critical elements in achieving novel material properties, and how this knowledge can be explored in the design of synthetic materials. We review how the function of a material system at the macroscale can be derived from the interplay of fundamental molecular building blocks. Moreover, guidelines and approaches to current experimental and computational designs in the field of synthetic silk-like materials are provided to assist the materials science community in engineering customized, fine-tuned biomaterials for biomedical applications.
DataHub - Science data management in support of interactive exploratory analysis
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Rubin, Mark R.
1993-01-01
DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.
Federal role in science will grow, NSF Director predicts
NASA Astrophysics Data System (ADS)
Simarski, Lynn Teo
1992-01-01
Walter Massey, director of the National Science Foundation, recently called for a fundamental reassessment of the relationship between the federal government and research institutions. On January 15, Massey, now in his ninth month at NSF, described great changes in the government-university “partnership” since the “golden age” of the 1960s. Speaking in Washington, D.C. at a seminar of George Washington University's Center for International Science and Technology Policy, he predicted that his own term at the foundation would not be “business as usual.”Science and technology have shifted from being a peripheral concern of the government to a central policy issue, Massey said. The United States now sees science as too important to leave its agenda for scientists to set themselves. In response, the federal government is launching the initiatives of the Federal Coordinating Council for Science, Engineering, and Technology. Some of last year's FCCSET budget initiatives, spanning a number of federal agencies, dealt with math and science education, global change, and high-performance computing. Such programs “are research agenda put forth from the federal side—they are not things put forth from the [research] community,” Massey pointed out.
A synthetic design environment for ship design
NASA Technical Reports Server (NTRS)
Chipman, Richard R.
1995-01-01
Rapid advances in computer science and information system technology have made possible the creation of synthetic design environments (SDE) which use virtual prototypes to increase the efficiency and agility of the design process. This next generation of computer-based design tools will rely heavily on simulation and advanced visualization techniques to enable integrated product and process teams to concurrently conceptualize, design, and test a product and its fabrication processes. This paper summarizes a successful demonstration of the feasibility of using a simulation based design environment in the shipbuilding industry. As computer science and information science technologies have evolved, there have been many attempts to apply and integrate the new capabilities into systems for the improvement of the process of design. We see the benefits of those efforts in the abundance of highly reliable, technologically complex products and services in the modern marketplace. Furthermore, the computer-based technologies have been so cost effective that the improvements embodied in modern products have been accompanied by lowered costs. Today the state-of-the-art in computerized design has advanced so dramatically that the focus is no longer on merely improving design methodology; rather the goal is to revolutionize the entire process by which complex products are conceived, designed, fabricated, tested, deployed, operated, maintained, refurbished and eventually decommissioned. By concurrently addressing all life-cycle issues, the basic decision making process within an enterprise will be improved dramatically, leading to new levels of quality, innovation, efficiency, and customer responsiveness. By integrating functions and people with an enterprise, such systems will change the fundamental way American industries are organized, creating companies that are more competitive, creative, and productive.
U.S. Geological Survey Fundamental Science Practices
,
2011-01-01
The USGS has a long and proud tradition of objective, unbiased science in service to the Nation. A reputation for impartiality and excellence is one of our most important assets. To help preserve this vital asset, in 2004 the Executive Leadership Team (ELT) of the USGS was charged by the Director to develop a set of fundamental science practices, philosophical premises, and operational principles as the foundation for all USGS research and monitoring activities. In a concept document, 'Fundamental Science Practices of the U.S. Geological Survey', the ELT proposed 'a set of fundamental principles to underlie USGS science practices.' The document noted that protecting the reputation of USGS science for quality and objectivity requires the following key elements: - Clearly articulated, Bureau-wide fundamental science practices. - A shared understanding at all levels of the organization that the health and future of the USGS depend on following these practices. - The investment of budget, time, and people to ensure that the USGS reputation and high-quality standards are maintained. The USGS Fundamental Science Practices (FSP) encompass all elements of research investigations, including data collection, experimentation, analysis, writing results, peer review, management review, and Bureau approval and publication of information products. The focus of FSP is on how science is carried out and how products are produced and disseminated. FSP is not designed to address the question of what work the USGS should do; that is addressed in USGS science planning handbooks and other documents. Building from longstanding existing USGS policies and the ELT concept document, in May 2006, FSP policies were developed with input from all parts of the organization and were subsequently incorporated into the Bureau's Survey Manual. In developing an implementation plan for FSP policy, the intent was to recognize and incorporate the best of USGS current practices to obtain the optimum overall program for our science. In January 2009, the USGS moved to full implementation of FSP. The FSP Advisory Committee (FSPAC) was formed to serve as the Bureau's working and standing committee to ensure the objectivity and quality of the Bureau's science information products and to provide support for the full implementation of FSP.
Will the digital computer transform classical mathematics?
Rotman, Brian
2003-08-15
Mathematics and machines have influenced each other for millennia. The advent of the digital computer introduced a powerfully new element that promises to transform the relation between them. This paper outlines the thesis that the effect of the digital computer on mathematics, already widespread, is likely to be radical and far-reaching. To articulate this claim, an abstract model of doing mathematics is introduced based on a triad of actors of which one, the 'agent', corresponds to the function performed by the computer. The model is used to frame two sorts of transformation. The first is pragmatic and involves the alterations and progressive colonization of the content and methods of enquiry of various mathematical fields brought about by digital methods. The second is conceptual and concerns a fundamental antagonism between the infinity enshrined in classical mathematics and physics (continuity, real numbers, asymptotic definitions) and the inherently real and material limit of processes associated with digital computation. An example which lies in the intersection of classical mathematics and computer science, the P=NP problem, is analysed in the light of this latter issue.
Yang, Ting; Dong, Jianji; Lu, Liangjun; Zhou, Linjie; Zheng, Aoling; Zhang, Xinliang; Chen, Jianping
2014-07-04
Photonic integrated circuits for photonic computing open up the possibility for the realization of ultrahigh-speed and ultra wide-band signal processing with compact size and low power consumption. Differential equations model and govern fundamental physical phenomena and engineering systems in virtually any field of science and engineering, such as temperature diffusion processes, physical problems of motion subject to acceleration inputs and frictional forces, and the response of different resistor-capacitor circuits, etc. In this study, we experimentally demonstrate a feasible integrated scheme to solve first-order linear ordinary differential equations with a tunable constant coefficient based on a single silicon microring resonator. Besides, we analyze the impact of the chirp and pulse-width of input signals on the computing deviation. This device can be compatible with the electronic technology (typically complementary metal-oxide semiconductor technology), which may motivate the development of integrated photonic circuits for optical computing.
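For reference, the class of equation such a device computes is dy/dt + k*y = x(t) with a constant, tunable coefficient k. The short numerical sketch below (Python; the coefficient, input pulse, and step size are arbitrary illustrative values, not the experimental parameters) integrates the same equation, giving the smoothed, delayed response one would compare against the optical output.

    import numpy as np

    k, dt = 2.0, 1e-3
    t = np.arange(0.0, 5.0, dt)
    x = np.exp(-((t - 1.0) ** 2) / 0.02)     # a Gaussian input pulse
    y = np.zeros_like(t)
    for n in range(len(t) - 1):              # forward-Euler integration of dy/dt = x(t) - k*y
        y[n + 1] = y[n] + dt * (x[n] - k * y[n])
    print(float(y.max()))                     # peak of the smoothed, delayed response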
Statistical Methodologies to Integrate Experimental and Computational Research
NASA Technical Reports Server (NTRS)
Parker, P. A.; Johnson, R. T.; Montgomery, D. C.
2008-01-01
Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
NASA Astrophysics Data System (ADS)
Chiao, Raymond Y.; Cohen, Marvin L.; Leggett, Anthony J.; Phillips, William D.; Harper, Charles L., Jr.
2010-10-01
List of contributors; Foreword Charles H. Townes; Editors' preface; Preface Freeman J. Dyson; Laureates' preface: reflections from four physics Nobelists Roy J. Glauber, John L. Hall, Theodore W. Hänsch and Wolfgang Ketterle; Acknowledgments; Part I. Illumination: The History and Future of Physical Science and Technology: 1. A short history of light in the Western world John L. Heilbron; 2. Tools and innovation Peter L. Galison; 3. The future of science Freeman J. Dyson; 4. The end of everything: will AI replace humans? Will everything die when the universe freezes over? Michio Kaku; Part II. Fundamental Physics and Quantum Mechanics: 5. Fundamental constants Frank Wilczek; 6. New insights on time symmetry in quantum mechanics Yakir Aharonov and Jeffrey Tollaksen; 7. The major unknowns in particle physics and cosmology David J. Gross; 8. The major unknown in quantum mechanics: Is it the whole truth? Anthony J. Leggett; 9. Precision cosmology and the landscape Raphael Bousso; 10. Hairy black holes, phase transitions, and AdS/CFT Steven S. Gubser; Part III. Astrophysics and Astronomy: 11. The microwave background: a cosmic time machine Adrian T. Lee; 12. Dark matter and dark energy Marc Kamionkowski; 13. New directions and intersections for observational cosmology: the case of dark energy Saul Perlmutter; 14. Inward bound: high-resolution astronomy and the quest for black holes and extrasolar planets Reinhard Genzel; 15. Searching for signatures of life beyond the solar system: astrophysical interferometry and the 150 km Exo-Earth Imager Antoine Labeyrie; 16. New directions for gravitational wave physics via 'Millikan oil drops' Raymond Y. Chiao; 17. An 'ultrasonic' image of the embryonic universe: CMB polarization tests of the inflationary paradigm Brian G. Keating; Part IV. New Approaches in Technology and Science: 18. Visualizing complexity: development of 4D microscopy and diffraction for imaging in space and time Ahmed H. Zewail; 19. Is life based on laws of physics? Steven Chu; 20. Quantum information J. Ignacio Cirac; 21. Emergence in condensed matter physics Marvin L. Cohen; 22. Achieving the highest spectral resolution over the widest spectral bandwidth: precision measurement meets ultrafast science Jun Ye; 23. Wireless non-radiative energy transfer Marin Soljačić; Part V. Consciousness and Free Will: 24. The big picture: exploring questions on the boundaries of science - consciousness and free will George F. R. Ellis; 25. Quantum entanglement: from fundamental questions to quantum communication and quantum computation and back Anton Zeilinger; 26. Consciousness, body, and brain: the matter of the mind Gerald M. Edelman; 27. The relation between quantum mechanics and higher brain functions: lessons from quantum computation and neurobiology Christof Koch and Klaus Hepp; 28. Free will and the causal closure of physics Robert C. Bishop; 29. Natural laws and the closure of physics Nancy L. Cartwright; 30. Anti-Cartesianism and downward causation: reshaping the free-will debate Nancey Murphy; 31. Can we understand free will? Charles H. Townes; Part VI. Reflections on the Big Questions: Mind, Matter. Mathematics, and Ultimate Reality: 32. The big picture: exploring questions on the boundaries of science - mind, matter, mathematics George F. R. Ellis; 33. The mathematical universe Max Tegmark; 34. Where do the laws of physics come from? Paul C. W. Davies; 35. Science, energy, ethics, and civilization Vaclav Smil; 36. Life of science, life of faith William T. Newsome; 37. 
The science of light and the light of science: an appreciative theological reflection on the life and work of Charles Hard Townes Robert J. Russell; 38. Two quibbles about 'ultimate' Gerald Gabrielse; Index.
NASA Astrophysics Data System (ADS)
Schumacher, Florian; Friederich, Wolfgang
Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth's interior remains of high interest in Earth sciences. Here, we give a description from a user's and programmer's perspective of the highly modular, flexible and extendable software package ASKI (Analysis of Sensitivity and Kernel Inversion), recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are solved by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates, respectively, are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low cost by applying different kinds of model regularization or re-selecting/weighting the inverted dataset without the need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows the user to compose customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python; it is well documented and freely available under the terms of the GNU General Public License (http://www.rub.de/aski).
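As a rough illustration of the decoupled, file-based workflow described above, one inversion iteration could be orchestrated as below. The executable names, command-line flags, and file layout are hypothetical placeholders for this sketch and are not ASKI's actual interfaces.

```python
import subprocess
from pathlib import Path

def run(cmd):
    """Run one external program and fail loudly if it does not succeed."""
    print("running:", " ".join(cmd))
    subprocess.run(cmd, check=True)

def fwi_iteration(it, work=Path("iterations")):
    """One iteration of the three decoupled steps: forward solve, kernels, model update.
    All exchange happens through files, mirroring the file-in/file-out design."""
    d = work / f"{it:02d}"
    d.mkdir(parents=True, exist_ok=True)
    # 1. Solve the forward problem with an external wave-propagation code
    #    (hypothetical executable and flags).
    run(["forward_solver", "--model", str(d / "model_in"), "--out", str(d / "synthetics")])
    # 2. Compute waveform sensitivity kernels from synthetics and observed data
    #    (hypothetical executable and flags).
    run(["compute_kernels", "--synthetics", str(d / "synthetics"), "--out", str(d / "kernels")])
    # 3. Derive a model update; this step could be re-run with different
    #    regularization or data weights without repeating steps 1-2.
    run(["derive_update", "--kernels", str(d / "kernels"),
         "--regularization", "smoothing", "--out", str(d / "model_out")])

if __name__ == "__main__":
    for it in range(1, 4):
        fwi_iteration(it)
```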
NASA Astrophysics Data System (ADS)
2014-12-01
This special issue of Applied Surface Science is a compilation of papers inspired by the symposium on "Surface/Interfaces Characterization and Renewable Energy" held at the 2013 MRS Fall Meeting. Practical use of renewable energy is one of the greatest technical challenges today. The symposium explored a number of surface and interface-related questions relevant to this overarching theme. Topics from fuel cells to photovoltaics, from water splitting to fundamental and practical issues in charge generation and storage were discussed. The work presented included the use of novel experimental spectroscopic and microscopic analytical techniques, theoretical and computational understanding of interfacial phenomena, and characterization of the intricate behavior of charged species, as well as molecules and molecular fragments at surfaces and interfaces. It emphasized fundamental understanding of underlying processes, as well as practical device design and applications of surface and interfacial phenomena related to renewable energy. These subjects are complicated by the transport of photons, electrons, ions, heat, and almost any other form of energy. Given current concerns about climate change, energy independence, and national security, this work is important and of interest to the field of Applied Surface Science. The sixteen papers published in this special issue have all been refereed.
Do Racial and Gender Disparities Exist in Newer Glaucoma Treatments?
Earth System Science Education Interdisciplinary Partnerships
NASA Astrophysics Data System (ADS)
Ruzek, M.; Johnson, D. R.
2002-05-01
Earth system science in the classroom is the fertile crucible linking science with societal needs for local, national and global sustainability. The interdisciplinary dimension requires fruitful cooperation among departments, schools and colleges within universities and among the universities and the nation's laboratories and agencies. Teaching and learning requires content which brings together the basic and applied sciences with mathematics and technology in addressing societal challenges of the coming decades. Over the past decade remarkable advances have emerged in information technology, from high bandwidth Internet connectivity to raw computing and visualization power. These advances which have wrought revolutionary capabilities and resources are transforming teaching and learning in the classroom. With the launching of NASA's Earth Observing System (EOS) the amount and type of geophysical data to monitor the Earth and its climate are increasing dramatically. The challenge remains, however, for skilled scientists and educators to interpret this information based upon sound scientific perspectives and utilize it in the classroom. With an increasing emphasis on the application of data gathered, and the use of the new technologies for practical benefit in the lives of ordinary citizens, there comes the even more basic need for understanding the fundamental state, dynamics, and complex interdependencies of the Earth system in mapping valid and relevant paths to sustainability. Technology and data in combination with the need to understand Earth system processes and phenomena offer opportunities for new and productive partnerships between researchers and educators to advance the fundamental science of the Earth system and in turn through discovery excite students at all levels in the classroom. This presentation will discuss interdisciplinary partnership opportunities for educators and researchers at the undergraduate and graduate levels.
ERIC Educational Resources Information Center
Chin, Chi-Chin; Yang, Wei-Cheng; Tuan, Hsiao-Lin
2016-01-01
This study explored the effects of arguing to learn in a socioscientific context on the fundamental and derived components of reading, writing, and science understanding as integral parts of science literacy. We adopted a mixed-methods approach in which a one-group pretest-posttest design was combined with supplemental interviews and questionnaires. The pretest evaluated…
Workshop on Fundamental Science using Pulsed Power
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wootton, Alan
The project objective was to fund travel to a workshop organized by the Institute for High Energy Density Science (IHEDS) at the University of Texas at Austin. In so doing, the intent was to a) grow the national academic High Energy Density Science (HEDS) community, b) expand high-impact, discovery-driven fundamental HEDS, and c) facilitate user-oriented research.
ERIC Educational Resources Information Center
Canadian Association of University Teachers, 2017
2017-01-01
The Canadian Association of University Teachers (CAUT) welcomes the report of the Advisory Panel on Federal Support for Fundamental Science ("the Panel"). It is a thoughtful and comprehensive study that correctly diagnoses problems that have plagued basic science for over a decade. The Panel's recommendations, if implemented, will chart a…
Fundamental plant biology enabled by the space shuttle.
Paul, Anna-Lisa; Wheeler, Ray M; Levine, Howard G; Ferl, Robert J
2013-01-01
The relationship between fundamental plant biology and space biology was especially synergistic in the era of the Space Shuttle. While all terrestrial organisms are influenced by gravity, the impact of gravity as a tropic stimulus in plants has been a topic of formal study for more than a century. And while plants were parts of early space biology payloads, it was not until the advent of the Space Shuttle that the science of plant space biology enjoyed expansion that truly enabled controlled, fundamental experiments that removed gravity from the equation. The Space Shuttle presented a science platform that provided regular science flights with dedicated plant growth hardware and crew trained in inflight plant manipulations. Part of the impetus for plant biology experiments in space was the realization that plants could be important parts of bioregenerative life support on long missions, recycling water, air, and nutrients for the human crew. However, a large part of the impetus was that the Space Shuttle enabled fundamental plant science essentially in a microgravity environment. Experiments during the Space Shuttle era produced key science insights on biological adaptation to spaceflight and especially plant growth and tropisms. In this review, we present an overview of plant science in the Space Shuttle era with an emphasis on experiments dealing with fundamental plant growth in microgravity. This review discusses general conclusions from the study of plant spaceflight biology enabled by the Space Shuttle by providing historical context and reviews of select experiments that exemplify plant space biology science.
Federated data storage system prototype for LHC experiments and data intensive science
NASA Astrophysics Data System (ADS)
Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Ryabinkin, E.; Zarochentsev, A.
2017-10-01
The rapid increase of data volume from the experiments running at the Large Hadron Collider (LHC) has prompted the physics computing community to evaluate new data handling and processing solutions. Russian grid sites and university clusters scattered over a large area aim to unite their resources for future productive work while also providing an opportunity to support large physics collaborations. In our project we address the fundamental problem of designing a computing architecture to integrate distributed storage resources for LHC experiments and other data-intensive science applications and to provide access to data from heterogeneous computing facilities. Studies include development and implementation of a federated data storage prototype for Worldwide LHC Computing Grid (WLCG) centres of different levels and university clusters within one National Cloud. The prototype is based on computing resources located in Moscow, Dubna, Saint Petersburg, Gatchina and Geneva. This project intends to implement a federated distributed storage for all kinds of operations such as read/write/transfer and access via WAN from Grid centres, university clusters, supercomputers, academic and commercial clouds. The efficiency and performance of the system are demonstrated using synthetic and experiment-specific tests including real data processing and analysis workflows from the ATLAS and ALICE experiments, as well as compute-intensive bioinformatics applications (PALEOMIX) running on supercomputers. We present the topology and architecture of the designed system, report performance and statistics for different access patterns and show how federated data storage can be used efficiently by physicists and biologists. We also describe how sharing data on a widely distributed storage system can lead to a new computing model and reformations of computing style, for instance how bioinformatics programs running on supercomputers can read/write data from the federated storage.
Laboratory Directed Research and Development FY2010 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, K J
2011-03-22
A premier applied-science laboratory, Lawrence Livermore National Laboratory (LLNL) has at its core a primary national security mission - to ensure the safety, security, and reliability of the nation's nuclear weapons stockpile without nuclear testing, and to prevent and counter the spread and use of weapons of mass destruction: nuclear, chemical, and biological. The Laboratory uses the scientific and engineering expertise and facilities developed for its primary mission to pursue advanced technologies to meet other important national security needs - homeland defense, military operations, and missile defense, for example - that evolve in response to emerging threats. For broader national needs, LLNL executes programs in energy security, climate change and long-term energy needs, environmental assessment and management, bioscience and technology to improve human health, and for breakthroughs in fundamental science and technology. With this multidisciplinary expertise, the Laboratory serves as a science and technology resource to the U.S. government and as a partner with industry and academia. This annual report discusses the following topics: (1) Advanced Sensors and Instrumentation; (2) Biological Sciences; (3) Chemistry; (4) Earth and Space Sciences; (5) Energy Supply and Use; (6) Engineering and Manufacturing Processes; (7) Materials Science and Technology; Mathematics and Computing Science; (8) Nuclear Science and Engineering; and (9) Physics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, D E; McGraw, J R
2006-02-02
Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that ''computational science has become critical to scientific leadership, economic competitiveness, and national security''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''hands and feet'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Office of the Director
As a national laboratory, Argonne concentrates on scientific and technological challenges that can only be addressed through a sustained, interdisciplinary focus at a national scale. Argonne's eight major initiatives, as enumerated in its strategic plan, are Hard X-ray Sciences, Leadership Computing, Materials and Molecular Design and Discovery, Energy Storage, Alternative Energy and Efficiency, Nuclear Energy, Biological and Environmental Systems, and National Security. The purposes of Argonne's Laboratory Directed Research and Development (LDRD) Program are to encourage the development of novel technical concepts, enhance the Laboratory's research and development (R and D) capabilities, and pursue its strategic goals. Projects are selected from proposals for creative and innovative R and D studies that require advance exploration before they are considered to be sufficiently developed to obtain support through normal programmatic channels. Among the aims of the projects supported by the LDRD Program are the following: establishment of engineering proof of principle, assessment of design feasibility for prospective facilities, development of instrumentation or computational methods or systems, and discoveries in fundamental science and exploratory development.
Applications of artificial neural networks in medical science.
Patel, Jigneshkumar L; Goyal, Ramesh K
2007-09-01
Computer technology has advanced tremendously, and interest in the potential use of 'Artificial Intelligence (AI)' in medicine and biological research has increased. One of the most interesting and extensively studied branches of AI is 'Artificial Neural Networks (ANNs)'. Basically, ANNs are mathematical algorithms executed by computers. ANNs learn from standard data and capture the knowledge contained in the data. Trained ANNs approach the functionality of a small biological neural cluster in a very fundamental manner. They are a digitized model of the biological brain and can detect complex nonlinear relationships between dependent and independent variables in data where the human brain may fail to do so. Nowadays, ANNs are widely used for medical applications in various disciplines of medicine, especially in cardiology. ANNs have been extensively applied in diagnosis, electronic signal analysis, medical image analysis and radiology. ANNs have been used by many authors for modeling in medicine and clinical research. Applications of ANNs are increasing in pharmacoepidemiology and medical data mining. In this paper, the authors summarize various applications of ANNs in medical science.
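As a minimal, self-contained illustration of the kind of model described above (a toy sketch on synthetic data, not a clinical application), the following trains a one-hidden-layer network to capture a nonlinear relationship between two input variables and a binary outcome:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two predictors, binary outcome with a nonlinear decision rule.
X = rng.normal(size=(400, 2))
y = ((X[:, 0] ** 2 + X[:, 1]) > 0.5).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer with 8 units, trained by plain gradient descent on cross-entropy.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.5

for step in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)          # hidden activations
    p = sigmoid(h @ W2 + b2)          # predicted probability of class 1
    # Backward pass (gradients of the mean cross-entropy loss)
    d_out = (p - y) / len(X)
    dW2 = h.T @ d_out; db2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)
    dW1 = X.T @ d_h; db1 = d_h.sum(axis=0)
    # Parameter update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

accuracy = ((p > 0.5) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```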
NASA Astrophysics Data System (ADS)
Walter, R. J.; Protack, S. P.; Harris, C. J.; Caruthers, C.; Kusterer, J. M.
2008-12-01
NASA's Atmospheric Science Data Center at the NASA Langley Research Center performs all of the science data processing for the Multi-angle Imaging SpectroRadiometer (MISR) instrument. MISR is one of the five remote sensing instruments flying aboard NASA's Terra spacecraft. From the time of Terra launch in December 1999 until February 2008, all MISR science data processing was performed on a Silicon Graphics, Inc. (SGI) platform. However, dramatic improvements in commodity computing technology coupled with steadily declining project budgets during that period eventually made transitioning MISR processing to a commodity computing environment both feasible and necessary. The Atmospheric Science Data Center has successfully ported the MISR science data processing environment from the SGI platform to a Linux cluster environment. There were a multitude of technical challenges associated with this transition. Even though the core architecture of the production system did not change, the manner in which it interacted with underlying hardware was fundamentally different. In addition, there are more potential throughput bottlenecks in a cluster environment than there are in a symmetric multiprocessor environment like the SGI platform and each of these had to be addressed. Once all the technical issues associated with the transition were resolved, the Atmospheric Science Data Center had a MISR science data processing system with significantly higher throughput than the SGI platform at a fraction of the cost. In addition to the commodity hardware, free and open source software such as S4PM, Sun Grid Engine, PostgreSQL and Ganglia play a significant role in the new system. Details of the technical challenges and resolutions, software systems, performance improvements, and cost savings associated with the transition will be discussed. The Atmospheric Science Data Center in Langley's Science Directorate leads NASA's program for the processing, archival and distribution of Earth science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The Data Center was established in 1991 to support NASA's Earth Observing System and the U.S. Global Change Research Program. It is unique among NASA data centers in the size of its archive, cutting edge computing technology, and full range of data services. For more information regarding ASDC data holdings, documentation, tools and services, visit http://eosweb.larc.nasa.gov
Single-molecule protein sequencing through fingerprinting: computational assessment
NASA Astrophysics Data System (ADS)
Yao, Yao; Docter, Margreet; van Ginkel, Jetty; de Ridder, Dick; Joo, Chirlmin
2015-10-01
Proteins are vital in all biological systems as they constitute the main structural and functional components of cells. Recent advances in mass spectrometry have brought the promise of complete proteomics by helping draft the human proteome. Yet, this commonly used protein sequencing technique has fundamental limitations in sensitivity. Here we propose a method for single-molecule (SM) protein sequencing. A major challenge lies in the fact that proteins are composed of 20 different amino acids, which demands 20 molecular reporters. We computationally demonstrate that it suffices to measure only two types of amino acids to identify proteins and suggest an experimental scheme using SM fluorescence. When achieved, this highly sensitive approach will result in a paradigm shift in proteomics, with major impact in the biological and medical sciences.
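The fingerprinting idea can be illustrated with a toy sketch: reduce each sequence to the ordered pattern of just two labeled amino-acid types and look that pattern up in a reference set. The reference sequences and the choice of labeled residues (cysteine and lysine) below are illustrative assumptions, not the authors' actual data or protocol.

```python
# Toy illustration of two-amino-acid fingerprinting for protein identification.
# The reference "proteome" and the labeled residues (C and K) are hypothetical;
# a real study would use a full proteome database.

REFERENCE = {
    "protein_A": "MKCLLVACWKK",
    "protein_B": "MAKCCGTWLKA",
    "protein_C": "MCCKALKWCTK",
}

def fingerprint(sequence, labeled=("C", "K")):
    """Reduce a sequence to the ordered pattern of the two labeled residue types."""
    return "".join(aa for aa in sequence if aa in labeled)

# Precompute fingerprints of every reference protein.
FP_INDEX = {}
for name, seq in REFERENCE.items():
    FP_INDEX.setdefault(fingerprint(seq), []).append(name)

def identify(observed_pattern):
    """Return the reference proteins whose fingerprint matches the observation."""
    return FP_INDEX.get(observed_pattern, [])

if __name__ == "__main__":
    # A hypothetical single-molecule read that reports only C/K positions in order.
    observed = fingerprint("MKCLLVACWKK")
    print(observed, "->", identify(observed))   # prints: KCCKK -> ['protein_A']
```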
Protein Science by DNA Sequencing: How Advances in Molecular Biology Are Accelerating Biochemistry.
Higgins, Sean A; Savage, David F
2018-01-09
A fundamental goal of protein biochemistry is to determine the sequence-function relationship, but the vastness of sequence space makes comprehensive evaluation of this landscape difficult. However, advances in DNA synthesis and sequencing now allow researchers to assess the functional impact of every single mutation in many proteins, but challenges remain in library construction and the development of general assays applicable to a diverse range of protein functions. This Perspective briefly outlines the technical innovations in DNA manipulation that allow massively parallel protein biochemistry and then summarizes the methods currently available for library construction and the functional assays of protein variants. Areas in need of future innovation are highlighted with a particular focus on assay development and the use of computational analysis with machine learning to effectively traverse the sequence-function landscape. Finally, applications in the fundamentals of protein biochemistry, disease prediction, and protein engineering are presented.
LIGO-India: expanding the international network of gravitational wave detectors
NASA Astrophysics Data System (ADS)
Iyer, Balasubramanian
2015-04-01
The first detection of Gravitational Waves (GW) by ground-based detectors will open up a fundamentally new observational window to the Universe with implications for astrophysics and eventually cosmology and fundamental physics. The realization of GW astronomy requires a global network of Advanced GW detectors including upcoming observatories like KAGRA (Japan) and LIGO-India to provide good sky localization of the GW sources. LIGO-India is expected to play a key role in locating and deciphering the sources contributing to the GW symphony. The current status of the LIGO-India project and the exciting future research opportunities of this ambitious Indo-US collaboration in science, technology, and computation will also be outlined. The author acknowledges CISA and APS for the award of an APS Beller Lectureship. BRI is supported by the AIRBUS Group Corporate Foundation through a visiting professorship, which is part of the ``Mathematics of Complex Systems'' chair at ICTS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, H.T.; Scriven, L.E.
1991-07-01
A major program of university research, longer-ranged and more fundamental in approach than industrial research, into basic mechanisms of enhancing petroleum recovery and into underlying physics, chemistry, geology, applied mathematics, computation, and engineering science has been built at Minnesota. The original focus was surfactant-based chemical flooding, but the approach taken was sufficiently fundamental that the research, longer-ranged than industrial efforts, has become quite multidirectional. Topics discussed are volume controlled porosimetry; fluid distribution and transport in porous media at low wetting phase saturation; molecular dynamics of fluids in ultranarrow pores; molecular dynamics and molecular theory of wetting and adsorption; new numerical methods to handle initial and boundary conditions in immiscible displacement; electron microscopy of surfactant fluid microstructure; low cost system for animating liquid crystallites viewed with polarized light; surfaces of constant mean curvature with prescribed contact angle.
In silico design of smart binders to anthrax PA
NASA Astrophysics Data System (ADS)
Sellers, Michael; Hurley, Margaret M.
2012-06-01
The development of smart peptide binders requires an understanding of the fundamental mechanisms of recognition, which has remained an elusive grail of the research community for decades. Recent advances in automated discovery and synthetic library science provide a wealth of information to probe fundamental details of binding and facilitate the development of improved models for a priori prediction of affinity and specificity. Here we present the modeling portion of an iterative experimental/computational study to produce high-affinity peptide binders to the Protective Antigen (PA) of Bacillus anthracis. The result is a general-purpose, HPC-oriented, Python-based toolkit built upon powerful third-party freeware, which is designed to provide a better understanding of peptide-protein interactions and ultimately predict and measure new smart peptide binder candidates. We present an improved simulation protocol with flexible peptide docking to the Anthrax Protective Antigen, reported within the context of experimental data presented in a companion work.
Design Process of a Goal-Based Scenario on Computing Fundamentals
ERIC Educational Resources Information Center
Beriswill, Joanne Elizabeth
2014-01-01
In this design case, an instructor developed a goal-based scenario (GBS) for undergraduate computer fundamentals students to apply their knowledge of computer equipment and software. The GBS, entitled the MegaTech Project, presented the students with descriptions of the everyday activities of four persons needing to purchase a computer system. The…
NASA Astrophysics Data System (ADS)
Shi, X.
2015-12-01
As NSF has indicated, "Theory and experimentation have for centuries been regarded as two fundamental pillars of science. It is now widely recognized that computational and data-enabled science forms a critical third pillar." Geocomputation is the third pillar of GIScience and geosciences. With the exponential growth of geodata, the challenge of scalable and high-performance computing for big data analytics becomes urgent, because many research activities are constrained by software or tools that cannot even complete the computation process. Heterogeneous geodata integration and analytics obviously magnify the complexity and operational time frame. Many large-scale geospatial problems may not be processable at all if the computer system does not have sufficient memory or computational power. Emerging computer architectures, such as Intel's Many Integrated Core (MIC) Architecture and Graphics Processing Unit (GPU), and advanced computing technologies provide promising solutions to employ massive parallelism and hardware resources to achieve scalability and high performance for data-intensive computing over large spatiotemporal and social media data. Exploring novel algorithms and deploying the solutions in a massively parallel computing environment to achieve the capability for scalable data processing and analytics over large-scale, complex, and heterogeneous geodata with consistent quality and high performance has been the central theme of our research team in the Department of Geosciences at the University of Arkansas (UARK). New multi-core architectures combined with application accelerators hold the promise to achieve scalability and high performance by exploiting task and data levels of parallelism that are not supported by conventional computing systems. Such a parallel or distributed computing environment is particularly suitable for large-scale geocomputation over big data as demonstrated by our prior work, while the potential of such advanced infrastructure remains unexplored in this domain. Within this presentation, our prior and on-going initiatives will be summarized to exemplify how we exploit multicore CPUs, GPUs, and MICs, and clusters of CPUs, GPUs and MICs, to accelerate geocomputation in different applications.
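As a small, generic illustration of the data-level parallelism discussed above (the tile decomposition and the per-tile statistic are placeholders, not the group's actual codes), a large raster can be split into tiles and processed across CPU cores:

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def tile_mean(args):
    """Compute a per-tile statistic; stands in for a heavier geospatial kernel."""
    tile_id, tile = args
    return tile_id, float(tile.mean())

def split_into_tiles(raster, tile_size):
    """Yield (tile_id, tile) blocks of a 2-D array."""
    rows, cols = raster.shape
    for i in range(0, rows, tile_size):
        for j in range(0, cols, tile_size):
            yield (i, j), raster[i:i + tile_size, j:j + tile_size]

if __name__ == "__main__":
    # Synthetic "raster" standing in for a large geospatial grid.
    raster = np.random.default_rng(1).random((4000, 4000))
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(tile_mean, split_into_tiles(raster, 1000)))
    print(len(results), "tiles processed; sample mean of tile (0, 0):", results[(0, 0)])
```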
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Ann E; Bland, Arthur S Buddy; Hack, James J
Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and where appropriate, changes in Center metrics were introduced. This report covers CY 2010 and CY 2011 Year to Date (YTD) that unless otherwise specified, denotes January 1, 2011 through June 30, 2011. User Support remains an important element of the OLCF operations, with the philosophy 'whatever it takes' to enable successful research. Impact of this center-wide activity is reflected by the user survey results that show users are 'very satisfied.' The OLCF continues to aggressively pursue outreach and training activities to promote awareness - and effective use - of U.S. leadership-class resources (Reference Section 2). The OLCF continues to meet and in many cases exceed DOE metrics for capability usage (35% target in CY 2010, delivered 39%; 40% target in CY 2011, 54% January 1, 2011 through June 30, 2011). The Schedule Availability (SA) and Overall Availability (OA) for Jaguar were exceeded in CY2010. Given the solution to the VRM problem the SA and OA for Jaguar in CY 2011 are expected to exceed the target metrics of 95% and 90%, respectively (Reference Section 3). Numerous and wide-ranging research accomplishments, scientific support, and technological innovations are more fully described in Sections 4 and 6 and reflect OLCF leadership in enabling high-impact science solutions and vision in creating an exascale-ready center. Financial Management (Section 5) and Risk Management (Section 7) are carried out using best practices approved of by DOE. The OLCF has a valid cyber security plan and Authority to Operate (Section 8).
The proposed metrics for 2012 are reflected in Section 9.
Physical Sciences Research Priorities and Plans in OBPR
NASA Technical Reports Server (NTRS)
Trinh, Eugene
2002-01-01
This paper presents viewgraphs of physical sciences research priorities and plans at the Office of Biological and Physical Sciences Research (OBPR). The topics include: 1) Sixth Microgravity Fluid Physics and Transport Phenomena Conference; 2) Beneficial Characteristics of the Space Environment; 3) Windows of Opportunity for Research Derived from Microgravity; 4) Physical Sciences Research Program; 5) Fundamental Research: Space-based Results and Ground-based Applications; 6) Nonlinear Oscillations; and 7) Fundamental Research: Applications to Mission-Oriented Research.
Sma3s: A universal tool for easy functional annotation of proteomes and transcriptomes.
Casimiro-Soriguer, Carlos S; Muñoz-Mérida, Antonio; Pérez-Pulido, Antonio J
2017-06-01
The falling cost of next-generation sequencing has led to an enormous growth in the number of sequenced genomes and transcriptomes, allowing wet labs to get the sequences from their organisms of study. To make the most of these data, one of the first things that should be done is the functional annotation of the protein-coding genes. This has traditionally been a slow and tedious step, as it can involve the characterization of thousands of sequences. Sma3s is an accurate computational tool for annotating proteins in an unattended way. Now, we have developed a completely new version, which includes functionalities that will be of utility for fundamental and applied science. Currently, the results provide functional categories such as biological processes, which are useful for both characterizing particular sequence datasets and comparing results from different projects. One of the most important innovations is that it now has low computational requirements, and the complete annotation of a simple proteome or transcriptome usually takes around 24 hours on a personal computer. Sma3s has been tested with a large number of complete proteomes and transcriptomes, and it has demonstrated its potential in health science and other specific projects. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
10 Tips to Reduce Your Chance of Losing Vision from the Most Common Cause of Blindness
Development of National Map ontologies for organization and orchestration of hydrologic observations
NASA Astrophysics Data System (ADS)
Lieberman, J. E.
2014-12-01
Feature layers in the National Map program (TNM) are a fundamental context for much of the data collection and analysis conducted by the USGS and other governmental and nongovernmental organizations. Their computational usefulness, though, has been constrained by the lack of formal relationships besides superposition between TNM layers, as well as limited means of representing how TNM datasets relate to additional attributes, datasets, and activities. In the field of Geospatial Information Science, there has been a growing recognition of the value of semantic representation and technology for addressing these limitations, particularly in the face of burgeoning information volume and heterogeneity. Fundamental to this approach is the development of formal ontologies for concepts related to that information that can be processed computationally to enhance creation and discovery of new geospatial knowledge. They offer a means of making much of the presently innate knowledge about relationships in and between TNM features accessible for machine processing and distributed computation. A full and comprehensive ontology of all knowledge represented by TNM features is still impractical. The work reported here involves elaboration and integration of a number of small ontology design patterns (ODPs) that represent limited, discrete, but commonly accepted and broadly applicable physical theories for the behavior of TNM features representing surface water bodies and landscape surfaces and the connections between them. These ontology components are validated through use in applications for discovery and aggregation of water science observational data associated with National Hydrography Data features, features from the National Elevation Dataset (NED) and Water Boundary Dataset (WBD) that constrain water occurrence in the continental US. These applications emphasize workflows which are difficult or impossible to automate using existing data structures. Evaluation of the usefulness of the developed ontology components includes both solicitation of feedback on prototype applications, and provision of a query/mediation service for feature-linked data to facilitate development of additional third-party applications.
Foundations in Science and Mathematics Program for Middle School and High School Students
NASA Astrophysics Data System (ADS)
Desai, Karna Mahadev; Yang, Jing; Hemann, Jason
2016-01-01
Foundations in Science and Mathematics (FSM) is a graduate-student-led summer program designed to help middle school and high school students strengthen their knowledge and skills in mathematics and science. FSM provides two-week-long courses over a broad spectrum of disciplines including astronomy, biology, chemistry, computer programming, geology, mathematics, and physics. Students can choose two types of courses: (1) courses that help students learn the fundamental concepts in basic sciences and mathematics (e.g., "Precalculus"); and (2) knowledge courses that might be excluded from formal schooling (e.g., "Introduction to Universe"). FSM has served over 500 students in the Bloomington, IN, community over six years by acquiring funding from Indiana University and the Indiana Space Grant Consortium. FSM offers graduate students the opportunity to obtain first-hand experience through independent teaching and curriculum design as well as leadership experience. We present the design of the program, review the achievements, and explore the challenges we face. We are open to collaboration with similar educational outreach programs. For more information, please visit http://www.indiana.edu/~fsm/.
Fundamental Stellar Properties of M-Dwarfs from the CHARA Array
NASA Astrophysics Data System (ADS)
Berger, D. H.; Gies, D. R.; McAlister, H. A.; ten Brummelaar, T. A.; Henry, T. J.; Sturmann, J.; Sturmann, L.; Turner, N. H.; Ridgway, S. T.; Aufdenberg, J. P.; Mérand, A. M.
2005-12-01
We report the angular diameters of six M dwarfs ranging in spectral type from M1.0 V to M3.0 V measured with Georgia State University's CHARA Array, a long-baseline optical interferometer located at Mount Wilson Observatory. Observations were made with the longest baselines in the near-infrared K'-band and yielded angular diameters less than one milliarcsecond. Using an iterative process combining parallaxes from the NStars program and photometrically derived bolometric luminosities and masses, we calculated effective temperatures, surface gravities, and stellar radii. Our results are consistent with other empirical measurements of M-dwarf radii, but we found that current models underestimate the true stellar radii by up to 15-20%. We suggest that theoretical models for low-mass stars may be lacking an opacity source that alters the computed stellar radii. Science operations at the Array are supported by the National Science Foundation through NSF Grant AST--0307562 and by Georgia State University through the College of Arts and Sciences and the Office of the Vice President for Research. Financial support for DHB was provided by the National Science Foundation through grant AST--0205297.
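For reference, the standard relations that connect a measured angular diameter and parallax to the derived radius, effective temperature, and surface gravity are (a textbook summary, not a quotation from the paper):

```latex
R = \frac{\theta\, d}{2},
\qquad
T_{\mathrm{eff}} = \left( \frac{L_{\mathrm{bol}}}{4\pi \sigma R^{2}} \right)^{1/4},
\qquad
g = \frac{G M}{R^{2}},
```

where theta is the angular diameter in radians, d the distance from the parallax, L_bol the bolometric luminosity, sigma the Stefan-Boltzmann constant, and M the photometrically estimated mass.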
A brief history of the most remarkable numbers e, i and γ in mathematical sciences with applications
NASA Astrophysics Data System (ADS)
Debnath, Lokenath
2015-08-01
This paper deals with a brief history of the most remarkable Euler numbers e, i and γ in mathematical sciences. Included are many properties of the constants e, i and γ and their applications in algebra, geometry, physics, chemistry, ecology, business and industry. Special attention is given to the growth and decay phenomena in many real-world problems including stability and instability of their solutions. Some specific and modern applications of logarithms, complex numbers and complex exponential functions to electrical circuits and mechanical systems are presented with examples. Included are the use of complex numbers and complex functions in the description and analysis of chaos and fractals with the aid of modern computer technology. In addition, the phasor method is described with examples of applications in engineering science. The major focus of this paper is to provide basic information through historical approach to mathematics teaching and learning of the fundamental knowledge and skills required for students and teachers at all levels so that they can understand the concepts of mathematics, and mathematics education in science and technology.
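For readers who want the constants at a glance, the standard definitions are (stated here for convenience, not drawn from the paper):

```latex
e = \lim_{n \to \infty}\left(1 + \frac{1}{n}\right)^{n} = \sum_{k=0}^{\infty}\frac{1}{k!} \approx 2.71828,
\qquad
i^{2} = -1,
\qquad
\gamma = \lim_{n \to \infty}\left(\sum_{k=1}^{n}\frac{1}{k} - \ln n\right) \approx 0.57722,
```

and they meet in Euler's formula e^{i\theta} = \cos\theta + i\sin\theta, the basis of the phasor method mentioned in the abstract.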
The Future of Pharmaceutical Manufacturing Sciences.
Rantanen, Jukka; Khinast, Johannes
2015-11-01
The entire pharmaceutical sector is in an urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial-scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. With that regard, state-of-art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular-based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed covering process measurement and process control solutions. Last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot-melt processing and printing-based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association.
Bartschat, Klaus; Kushner, Mark J.
2016-01-01
Electron collisions with atoms, ions, molecules, and surfaces are critically important to the understanding and modeling of low-temperature plasmas (LTPs), and so in the development of technologies based on LTPs. Recent progress in obtaining experimental benchmark data and the development of highly sophisticated computational methods is highlighted. With the cesium-based diode-pumped alkali laser and remote plasma etching of Si3N4 as examples, we demonstrate how accurate and comprehensive datasets for electron collisions enable complex modeling of plasma-using technologies that empower our high-technology–based society. PMID:27317740
NASA Astrophysics Data System (ADS)
Gong, Weiwei; Zhou, Xu
2017-06-01
In Computer Science, the Boolean Satisfiability Problem (SAT) is the problem of determining whether there exists an interpretation that satisfies a given Boolean formula. SAT was one of the first problems proven to be NP-complete, and it is also fundamental to artificial intelligence, algorithm design, and hardware design. This paper reviews the main SAT-solver algorithms of recent years, including serial SAT algorithms, parallel SAT algorithms, SAT algorithms based on GPUs, and SAT algorithms based on FPGAs. The development of SAT solving is analyzed comprehensively. Finally, several possible directions for future work on the SAT problem are proposed.
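To make the problem concrete, a minimal DPLL-style solver over clauses in conjunctive normal form can be sketched as follows; this is a pedagogical sketch only, far simpler than the serial, parallel, GPU, and FPGA solvers reviewed in the paper:

```python
def dpll(clauses, assignment=None):
    """Return a satisfying assignment (dict var -> bool) or None if unsatisfiable.
    Clauses are lists of nonzero ints; -v denotes the negation of variable v."""
    if assignment is None:
        assignment = {}
    # Simplify all clauses under the current partial assignment.
    simplified = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue                       # clause already satisfied
        reduced = [l for l in clause if abs(l) not in assignment]
        if not reduced:
            return None                    # clause falsified -> backtrack
        simplified.append(reduced)
    if not simplified:
        return assignment                  # all clauses satisfied
    # Unit propagation: a one-literal clause forces its variable.
    for clause in simplified:
        if len(clause) == 1:
            l = clause[0]
            return dpll(simplified, {**assignment, abs(l): l > 0})
    # Branch on the first unassigned variable.
    v = abs(simplified[0][0])
    for value in (True, False):
        result = dpll(simplified, {**assignment, v: value})
        if result is not None:
            return result
    return None

if __name__ == "__main__":
    # (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
    formula = [[1, -2], [2, 3], [-1, -3]]
    print(dpll(formula))                   # e.g. {1: True, 3: False, 2: True}
```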
Introduction to autonomous mobile robotics using Lego Mindstorms NXT
NASA Astrophysics Data System (ADS)
Akın, H. Levent; Meriçli, Çetin; Meriçli, Tekin
2013-12-01
Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the Lego Mindstorms NXT kits are used as the robot platform. The aims, scope and contents of the course are presented, and the design of the laboratory sessions as well as the term projects, which address several core problems of robotics and artificial intelligence simultaneously, are explained in detail.
New theory insights and experimental opportunities in Majorana wires
NASA Astrophysics Data System (ADS)
Alicea, Jason
Over the past decade, the quest for Majorana zero modes in exotic superconductors has undergone transformational advances on the design, fabrication, detection, and characterization fronts. The field now seems primed for a new era aimed at Majorana control and readout. This talk will survey intertwined theory and experimental developments that illuminate a practical path toward these higher-level goals. In particular, I will highlight near-term opportunities for testing fundamentals of topological quantum computing and longer-term strategies for building scalable hardware. Supported by the National Science Foundation (DMR-1341822), Institute for Quantum Information and Matter, and Walter Burke Institute at Caltech.
The quiet revolution of numerical weather prediction.
Bauer, Peter; Thorpe, Alan; Brunet, Gilbert
2015-09-03
Advances in numerical weather prediction represent a quiet revolution because they have resulted from a steady accumulation of scientific knowledge and technological advances over many years that, with only a few exceptions, have not been associated with the aura of fundamental physics breakthroughs. Nonetheless, the impact of numerical weather prediction is among the greatest of any area of physical science. As a computational problem, global weather prediction is comparable to the simulation of the human brain and of the evolution of the early Universe, and it is performed every day at major operational centres across the world.
Aurorasaurus Database of Real-Time, Soft-Sensor Sourced Aurora Data for Space Weather Research
NASA Astrophysics Data System (ADS)
Kosar, B.; MacDonald, E.; Heavner, M.
2017-12-01
Aurorasaurus is an innovative citizen science project focused on two fundamental objectives: collecting real-time, ground-based signals of auroral visibility from citizen scientists (soft sensors) and incorporating this new type of data into scientific investigations pertaining to the aurora. The project has been live since the Fall of 2014, and as of Summer 2017 the database comprised approximately 12,000 observations (5295 direct reports and 6413 verified tweets). In this presentation, we will focus on demonstrating the utility of these robust, science-quality data for space weather research needs. These data scale with the size of the event and are well suited to capture the largest, rarest events. Emerging state-of-the-art computational methods based on statistical inference, such as machine learning frameworks and data-model integration methods, can offer new insights that could potentially lead to better real-time assessment and space weather prediction when citizen science data are combined with traditional sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prakash, Arushi; Baer, Marcel D.; Mundy, Christopher J.
Peptoids are peptide-mimetic biopolymers that are easy to synthesize and adaptable for use in drugs, chemical scaffolds, and coatings. However, there is insufficient information about their structural preferences and interactions with the environment in various applications. We conducted a study to understand the fundamental differences between peptides and peptoids using molecular dynamics simulations with semi-empirical (PM6) and empirical (AMBER) potentials, in conjunction with metadynamics-enhanced sampling. From studies of single molecules in water and on surfaces, we found that sarcosine (model peptoid) is much more flexible than alanine (model peptide) in different environments. However, sarcosine and alanine interact similarly with a hydrophobic or a hydrophilic surface. Finally, this study highlights the conformational landscape of peptoids and the dominant interactions that drive peptoids towards these conformations. ACKNOWLEDGMENT: MD simulations and manuscript preparation were supported by the MS3 (Materials Synthesis and Simulation Across Scales) Initiative at Pacific Northwest National Laboratory (PNNL), a multi-program national laboratory operated by Battelle for the U.S. Department of Energy. CJM was supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences Division of Chemical Sciences, Geosciences, and Biosciences. MDB was supported by the US Department of Energy, Office of Basic Energy Sciences, Biomolecular Materials Program at PNNL. Computing resources were generously allocated by University of Washington's IT department and PNNL's Institutional Computing program. The authors gratefully acknowledge conversations with Dr. Kayla Sprenger, Josh Smith, and Dr. Yeneneh Yimer.
NASA Astrophysics Data System (ADS)
Maskey, Manil; Ramachandran, Rahul; Kuo, Kwo-Sen
2015-04-01
The Collaborative WorkBench (CWB) has been successfully developed to support collaborative science algorithm development. It incorporates many features that enable and enhance science collaboration, including the support for both asynchronous and synchronous modes of interactions in collaborations. With the former, members in a team can share a full range of research artifacts, e.g. data, code, visualizations, and even virtual machine images. With the latter, they can engage in dynamic interactions such as notification, instant messaging, file exchange, and, most notably, collaborative programming. CWB also implements behind-the-scene provenance capture as well as version control to relieve scientists of these chores. Furthermore, it has achieved a seamless integration between researchers' local compute environments and those of the Cloud. CWB has also been successfully extended to support instrument verification and validation. Adopted by almost every researcher, the current practice of downloading data to local compute resources for analysis results in much duplication and inefficiency. CWB leverages Cloud infrastructure to provide a central location for data used by an entire science team, thereby eliminating much of this duplication and waste. Furthermore, use of CWB in concert with this same Cloud infrastructure enables co-located analysis with data where opportunities of data-parallelism can be better exploited, thereby further improving efficiency. With its collaboration-enabling features apposite to steps throughout the scientific process, we expect CWB to fundamentally transform research collaboration and realize maximum science productivity.
Science in the Eyes of Preschool Children: Findings from an Innovative Research Tool
NASA Astrophysics Data System (ADS)
Dubosarsky, Mia D.
How do young children view science? Do these views reflect cultural stereotypes? When do these views develop? These fundamental questions in the field of science education have rarely been studied with the population of preschool children. One main reason is the lack of an appropriate research instrument that addresses preschool children's developmental competencies. An extensive body of research has pointed to the significance of early childhood experiences in developing positive attitudes and interests toward learning in general and the learning of science in particular. Theoretical and empirical research suggests that stereotypical views of science may be replaced by authentic views following inquiry science experience. However, no preschool science intervention program could be designed without a reliable instrument that provides baseline information about preschool children's current views of science. The current study presents preschool children's views of science as gathered from a pioneering research tool. This tool, in the form of a computer "game," does not require reading, writing, or expressive language skills and is operated by the children. The program engages children in several simple tasks involving picture recognition and yes/no answers in order to reveal their views about science. The study was conducted with 120 preschool children in two phases and found that by the age of 4 years, participants possess an emergent concept of science. Gender and school differences were detected. Findings from this interdisciplinary study will contribute to the fields of early childhood, science education, learning technologies, program evaluation, and early childhood curriculum development.
22 CFR 120.11 - Public domain.
Code of Federal Regulations, 2010 CFR
2010-04-01
.... Fundamental research is defined to mean basic and applied research in science and engineering where the... fundamental research in science and engineering at accredited institutions of higher learning in the U.S... distinguished from research the results of which are restricted for proprietary reasons or specific U.S...
NASA Astrophysics Data System (ADS)
Fisher, J. A.; Brewer, C.; O'Brien, G.
2017-12-01
Computing and programming are rapidly becoming necessary skills for earth and environmental scientists. Scientists in both academia and industry must be able to manipulate increasingly large datasets, create plots and 3-D visualisations of observations, and interpret outputs from complex numerical models, among other tasks. However, these skills are rarely taught as a compulsory part of undergraduate earth science curricula. In 2016, the School of Earth & Environmental Sciences at the University of Wollongong began a pilot program to integrate introductory programming and modelling skills into the required first-year core curriculum for all undergraduates majoring in earth and environmental science fields. Using Python, a popular teaching language also widely used by professionals, a set of guided exercises was developed. These exercises use interactive Jupyter Notebooks to introduce students to programming fundamentals and simple modelling problems relevant to the earth system, such as carbon cycling and population growth. The exercises are paired with peer review activities to expose students to the multitude of "correct" ways to solve computing problems. In the last weeks of the semester, students work in groups to creatively adapt their new-found skills to selected problems in earth system science. In this presentation, I will report on outcomes from delivering the new curriculum to the first two cohorts of 120-150 students, including details of the implementation and the impacts on both student aptitude and attitudes towards computing. While the first cohort clearly developed competency, survey results suggested a drop in student confidence over the course of the semester. To address this confidence gap for the second cohort, the in-class activities are now being supplemented with low-stakes open-book review quizzes that provide further practice with no time pressure. Research into the effectiveness of these review quizzes is ongoing and preliminary findings will be discussed, along with lessons learned in the process and plans for the future.
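The kind of guided exercise described above can be illustrated with a short sketch. This is an invented example in the spirit of the population-growth problems mentioned, not material taken from the Wollongong curriculum; the function name and parameter values are hypothetical.

import numpy as np
import matplotlib.pyplot as plt

def logistic_growth(p0, r, K, years, dt=0.1):
    """Integrate dP/dt = r * P * (1 - P/K) with a simple Euler step."""
    steps = int(years / dt)
    t = np.linspace(0.0, years, steps + 1)
    p = np.empty(steps + 1)
    p[0] = p0
    for i in range(steps):
        p[i + 1] = p[i] + dt * r * p[i] * (1.0 - p[i] / K)
    return t, p

t, p = logistic_growth(p0=1e3, r=0.05, K=1e5, years=200)
plt.plot(t, p)
plt.xlabel("Time (years)")
plt.ylabel("Population")
plt.title("Logistic population growth (Euler integration)")
plt.show()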
NASA Astrophysics Data System (ADS)
Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro
2012-06-01
ACAT 2011. This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. Fourteen invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch. Dr Liliana Teodorescu, Brunel University, ACAT group. The PDF also contains details of the workshop's committees and sponsors.
Fundamentals of Library Automation and Technology. Participant Workbook.
ERIC Educational Resources Information Center
Bridge, Frank; Walton, Robert
This workbook presents outlines of topics to be covered during a two-day workshop on the fundamentals for library automation. Topics for the first day include: (1) Introduction; (2) Computer Technology--A Historical Overview; (3) Evolution of Library Automation; (4) Computer Hardware Technology--An Introduction; (5) Computer Software…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cizewski, J.A., E-mail: cizewski@rutgers.edu
The Stewardship Science Academic Alliances (SSAA) were inaugurated in 2002 by the National Nuclear Security Administration of the U. S. Department of Energy. The purpose is to enhance connections between NNSA laboratories and the activities of university scientists and their students in research areas important to NNSA, including low-energy nuclear science. This paper highlights some of the ways that the SSAA fosters education and training of graduate students and postdoctoral scholars in low-energy nuclear science, preparing them for careers in fundamental and applied research and development.
BioSIGHT: Interactive Visualization Modules for Science Education
NASA Technical Reports Server (NTRS)
Wong, Wee Ling
1998-01-01
Redefining science education to harness emerging integrated media technologies with innovative pedagogical goals represents a unique challenge. The Integrated Media Systems Center (IMSC) is the only engineering research center in the area of multimedia and creative technologies sponsored by the National Science Foundation. The research program at IMSC is focused on developing advanced technologies that address human-computer interfaces, database management, and high-speed network capabilities. The BioSIGHT project at IMSC is a demonstration technology project in the area of education that seeks to address how such emerging multimedia technologies can make an impact on science education. The scope of this project will help solidify NASA's commitment to the development of innovative educational resources that promote science literacy for our students and the general population as well. These issues must be addressed as NASA marches towards the goal of enabling human space exploration that requires an understanding of life sciences in space. The IMSC BioSIGHT lab was established with the purpose of developing a novel methodology that will map a high school biology curriculum into a series of interactive visualization modules that can be easily incorporated into a space biology curriculum. Fundamental concepts in general biology must be mastered in order to allow a better understanding and application for space biology. Interactive visualization is a powerful component that can capture the students' imagination, facilitate their assimilation of complex ideas, and help them develop integrated views of biology. These modules will augment the role of the teacher and will establish the value of student-centered interactivity, both in an individual setting as well as in a collaborative learning environment. Students will be able to interact with the content material, explore new challenges, and perform virtual laboratory simulations. The BioSIGHT effort is truly cross-disciplinary in nature and requires expertise from many areas including Biology, Computer Science, Electrical Engineering, Education, and the Cognitive Sciences. The BioSIGHT team includes a scientific illustrator, an educational software designer, and computer programmers, as well as IMSC graduate and undergraduate students. Our collaborators include TERC, a research and education organization with extensive K-12 math and science curricula development, of Cambridge, MA; SRI International of Menlo Park, CA; and teachers and students from local area high schools (Newbury Park High School, USC's Family of Five schools, Chadwick School, and Pasadena Polytechnic High School).
Fundamental science behind today's important medicines.
Spector, Jonathan M; Harrison, Rosemary S; Fishman, Mark C
2018-04-25
Today's most transformative medicines exist because of fundamental discoveries that were made without regard to practical outcome and with their relevance to therapeutics only appearing decades later. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
22 CFR 120.11 - Public domain.
Code of Federal Regulations, 2012 CFR
2012-04-01
... libraries open to the public or from which the public can obtain documents; (5) Through patents available at... fundamental research in science and engineering at accredited institutions of higher learning in the U.S.... Fundamental research is defined to mean basic and applied research in science and engineering where the...
22 CFR 120.11 - Public domain.
Code of Federal Regulations, 2014 CFR
2014-04-01
... libraries open to the public or from which the public can obtain documents; (5) Through patents available at... fundamental research in science and engineering at accredited institutions of higher learning in the U.S.... Fundamental research is defined to mean basic and applied research in science and engineering where the...
22 CFR 120.11 - Public domain.
Code of Federal Regulations, 2011 CFR
2011-04-01
... libraries open to the public or from which the public can obtain documents; (5) Through patents available at... fundamental research in science and engineering at accredited institutions of higher learning in the U.S.... Fundamental research is defined to mean basic and applied research in science and engineering where the...
22 CFR 120.11 - Public domain.
Code of Federal Regulations, 2013 CFR
2013-04-01
... libraries open to the public or from which the public can obtain documents; (5) Through patents available at... fundamental research in science and engineering at accredited institutions of higher learning in the U.S.... Fundamental research is defined to mean basic and applied research in science and engineering where the...
Know Your Discipline: Teaching the Philosophy of Computer Science
ERIC Educational Resources Information Center
Tedre, Matti
2007-01-01
The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlson, Neil; Jibben, Zechariah; Brady, Peter
2017-06-28
Pececillo is a proxy-app for the open source Truchas metal processing code (LA-CC-15-097). It implements many of the physics models used in Truchas: free-surface, incompressible Navier-Stokes fluid dynamics (e.g., water waves); heat transport, material phase change, view factor thermal radiation; species advection-diffusion; quasi-static, elastic/plastic solid mechanics with contact; electromagnetics (Maxwell's equations). The models are simplified versions that retain the fundamental computational complexity of the Truchas models while omitting many non-essential features and modeling capabilities. The purpose is to expose Truchas algorithms in a greatly simplified context where computer science problems related to parallel performance on advanced architectures can be more easily investigated. While Pececillo is capable of performing simulations representative of typical Truchas metal casting, welding, and additive manufacturing simulations, it lacks many of the modeling capabilities needed for real applications.
How hierarchical is language use?
Frank, Stefan L.; Bod, Rens; Christiansen, Morten H.
2012-01-01
It is generally assumed that hierarchical phrase structure plays a central role in human language. However, considerations of simplicity and evolutionary continuity suggest that hierarchical structure should not be invoked too hastily. Indeed, recent neurophysiological, behavioural and computational studies show that sequential sentence structure has considerable explanatory power and that hierarchical processing is often not involved. In this paper, we review evidence from the recent literature supporting the hypothesis that sequential structure may be fundamental to the comprehension, production and acquisition of human language. Moreover, we provide a preliminary sketch outlining a non-hierarchical model of language use and discuss its implications and testable predictions. If linguistic phenomena can be explained by sequential rather than hierarchical structure, this will have considerable impact in a wide range of fields, such as linguistics, ethology, cognitive neuroscience, psychology and computer science. PMID:22977157
NASA Astrophysics Data System (ADS)
Horstemeyer, M. F.
This review of multiscale modeling covers a brief history of various multiscale methodologies related to solid materials and the associated experimental influences, the various influences of multiscale modeling on different disciplines, and some examples of multiscale modeling in the design of structural components. Although computational multiscale modeling methodologies were developed in the late twentieth century, the fundamental notions of multiscale modeling have been around since da Vinci studied different sizes of ropes. The recent rapid growth in multiscale modeling is the result of the confluence of parallel computing power, experimental capabilities to characterize structure-property relations down to the atomic level, and theories that admit multiple length scales. The now-ubiquitous research focused on multiscale modeling spans different disciplines (solid mechanics, fluid mechanics, materials science, physics, mathematics, biology, and chemistry), different regions of the world (most continents), and different length scales (from atoms to autos).
Efficient Variational Quantum Simulator Incorporating Active Error Minimization
NASA Astrophysics Data System (ADS)
Li, Ying; Benjamin, Simon C.
2017-04-01
One of the key applications for quantum computers will be the simulation of other quantum systems that arise in chemistry, materials science, etc., in order to accelerate the process of discovery. It is important to ask the following question: Can this simulation be achieved using near-future quantum processors, of modest size and under imperfect control, or must it await the more distant era of large-scale fault-tolerant quantum computing? Here, we propose a variational method involving closely integrated classical and quantum coprocessors. We presume that all operations in the quantum coprocessor are prone to error. The impact of such errors is minimized by boosting them artificially and then extrapolating to the zero-error case. In comparison to a more conventional optimized Trotterization technique, we find that our protocol is efficient and appears to be fundamentally more robust against error accumulation.
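As a rough illustration of the extrapolation idea summarized above (errors are deliberately boosted and the zero-error value inferred), the following sketch fits a polynomial to measurements taken at scaled noise levels. The noise model, numbers, and function names are invented for illustration and are not taken from the paper.

import numpy as np

def extrapolate_to_zero_error(scale_factors, measured_values, order=1):
    """Fit a polynomial in the noise-boost factor and evaluate it at zero."""
    coeffs = np.polyfit(scale_factors, measured_values, deg=order)
    return np.polyval(coeffs, 0.0)

rng = np.random.default_rng(1)
true_value = 1.0                             # ideal, noise-free expectation value
scales = np.array([1.0, 1.5, 2.0, 3.0])      # artificially boosted error levels
noisy = true_value - 0.12 * scales + rng.normal(0.0, 0.005, scales.size)

estimate = extrapolate_to_zero_error(scales, noisy)
print(f"measurement at base noise level : {noisy[0]:.3f}")
print(f"extrapolated zero-error estimate: {estimate:.3f}")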
A Research Agenda and Vision for Data Science
NASA Astrophysics Data System (ADS)
Mattmann, C. A.
2014-12-01
Big Data has emerged as a first-class citizen in the research community spanning disciplines in the domain sciences. Astronomy is pushing velocity with new ground-based instruments such as the Square Kilometre Array (SKA) and its unprecedented data rates (700 TB/sec!); Earth science is pushing the boundaries of volume, with the international Intergovernmental Panel on Climate Change (IPCC), climate modeling, and remote sensing communities growing total archive sizes into the exabyte scale; and airborne missions from NASA, such as the JPL Airborne Snow Observatory (ASO), are increasing velocity while decreasing the overall turnaround time required to receive products and to make them available to water managers and decision makers. Proteomics and the computational biology community are sequencing genomes and providing near-real-time answers to clinicians, researchers, and ultimately to patients, helping to process, understand, and create diagnoses. Data complexity is on the rise, and the norm is no longer 100s of metadata attributes, but thousands to hundreds of thousands, including complex interrelationships between data, metadata and knowledge. I published a vision for data science in Nature in 2013 that encapsulates four thrust areas and foci that I believe the computer science, Big Data, and data science communities need to attack over the next decade to make fundamental progress in the data volume, velocity and complexity challenges arising from the domain sciences such as those described above. These areas include: (1) rapid and unobtrusive algorithm integration; (2) intelligent and automatic data movement; (3) automated and rapid extraction of text, metadata and language from heterogeneous file formats; and (4) participation and people power via open source communities. In this talk I will revisit these four areas and describe current progress, future work, and challenges ahead as we move forward in this exciting age of Data Science.
Liang, Jie; Qian, Hong
2010-01-01
Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge of understanding macromolecular dynamics led the way for computations to be part of the tool set to study molecular biology. Twenty-five years ago, the demand from genome science inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand “complex behavior” and complexity theory, and from which important biological insight can be gained. PMID:24999297
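For readers unfamiliar with the Gillespie algorithm referenced above, the following is a minimal sketch applied to a toy birth-death reaction system (synthesis at constant rate k1, degradation at rate k2*n). The reactions and rate constants are invented for illustration and are far simpler than the cellular networks discussed in the paper.

import random

def gillespie_birth_death(n0, k1, k2, t_max):
    """Simulate dn: +1 at rate k1 (synthesis), -1 at rate k2*n (decay)."""
    t, n = 0.0, n0
    times, counts = [t], [n]
    while t < t_max:
        a1, a2 = k1, k2 * n                  # reaction propensities
        a_total = a1 + a2
        if a_total == 0.0:
            break
        t += random.expovariate(a_total)     # exponential waiting time to next event
        if random.random() * a_total < a1:
            n += 1                           # synthesis event
        else:
            n -= 1                           # degradation event
        times.append(t)
        counts.append(n)
    return times, counts

times, counts = gillespie_birth_death(n0=0, k1=10.0, k2=0.1, t_max=100.0)
print(f"final copy number: {counts[-1]} (stationary mean k1/k2 = 100)")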
Model reduction for agent-based social simulation: coarse-graining a civil violence model.
Zou, Yu; Fonoberov, Vladimir A; Fonoberova, Maria; Mezic, Igor; Kevrekidis, Ioannis G
2012-06-01
Agent-based modeling (ABM) constitutes a powerful computational tool for the exploration of phenomena involving emergent dynamic behavior in the social sciences. This paper demonstrates a computer-assisted approach that bridges the significant gap between the single-agent microscopic level and the macroscopic (coarse-grained population) level, where fundamental questions must be rationally answered and policies guiding the emergent dynamics devised. Our approach will be illustrated through an agent-based model of civil violence. This spatiotemporally varying ABM incorporates interactions between a heterogeneous population of citizens [active (insurgent), inactive, or jailed] and a population of police officers. Detailed simulations exhibit an equilibrium punctuated by periods of social upheavals. We show how to effectively reduce the agent-based dynamics to a stochastic model with only two coarse-grained degrees of freedom: the number of jailed citizens and the number of active ones. The coarse-grained model captures the ABM dynamics while drastically reducing the computation time (by a factor of approximately 20).
Model reduction for agent-based social simulation: Coarse-graining a civil violence model
NASA Astrophysics Data System (ADS)
Zou, Yu; Fonoberov, Vladimir A.; Fonoberova, Maria; Mezic, Igor; Kevrekidis, Ioannis G.
2012-06-01
Agent-based modeling (ABM) constitutes a powerful computational tool for the exploration of phenomena involving emergent dynamic behavior in the social sciences. This paper demonstrates a computer-assisted approach that bridges the significant gap between the single-agent microscopic level and the macroscopic (coarse-grained population) level, where fundamental questions must be rationally answered and policies guiding the emergent dynamics devised. Our approach will be illustrated through an agent-based model of civil violence. This spatiotemporally varying ABM incorporates interactions between a heterogeneous population of citizens [active (insurgent), inactive, or jailed] and a population of police officers. Detailed simulations exhibit an equilibrium punctuated by periods of social upheavals. We show how to effectively reduce the agent-based dynamics to a stochastic model with only two coarse-grained degrees of freedom: the number of jailed citizens and the number of active ones. The coarse-grained model captures the ABM dynamics while drastically reducing the computation time (by a factor of approximately 20).
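A schematic of the kind of two-variable coarse-grained stochastic model described above, tracking only the numbers of active and jailed citizens, might look like the following sketch. The transition probabilities are invented placeholders and are not the quantities identified in the paper.

import random

def step(active, jailed, total=1000, p_activate=0.02, p_arrest=0.05, p_release=0.01):
    """One coarse-grained update of the (active, jailed) pair; all rates are placeholders."""
    quiet = total - active - jailed
    new_active = sum(random.random() < p_activate for _ in range(quiet))
    arrests = sum(random.random() < p_arrest for _ in range(active))
    releases = sum(random.random() < p_release for _ in range(jailed))
    return active + new_active - arrests, jailed + arrests - releases

active, jailed = 10, 0
for _ in range(200):
    active, jailed = step(active, jailed)
print(f"after 200 steps: active={active}, jailed={jailed}")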
Computational intelligence from AI to BI to NI
NASA Astrophysics Data System (ADS)
Werbos, Paul J.
2015-05-01
This paper gives highlights of the history of the neural network field, stressing the fundamental ideas which have been in play. Early neural network research was motivated mainly by the goals of artificial intelligence (AI) and of functional neuroscience (biological intelligence, BI), but the field almost died due to frustrations articulated in the famous book Perceptrons by Minsky and Papert. When I found a way to overcome the difficulties by 1974, the community mindset was very resistant to change; it was not until 1987/1988 that the field was reborn in a spectacular way, leading to the organized communities now in place. Even then, it took many more years to establish crossdisciplinary research in the types of mathematical neural networks needed to really understand the kind of intelligence we see in the brain, and to address the most demanding engineering applications. Only through a new (albeit short-lived) funding initiative, funding crossdisciplinary teams of systems engineers and neuroscientists, were we able to fund the critical empirical demonstrations which put our old basic principle of "deep learning" firmly on the map in computer science. Progress has rightly been inhibited at times by legitimate concerns about the "Terminator threat" and other possible abuses of technology. This year, at SPIE, in the quantum computing track, we outline the next stage ahead of us in breaking out of the box, again and again, and rising to fundamental challenges and opportunities still ahead of us.
Analytical Computation of the Epidemic Threshold on Temporal Networks
NASA Astrophysics Data System (ADS)
Valdano, Eugenio; Ferreri, Luca; Poletto, Chiara; Colizza, Vittoria
2015-04-01
The time variation of contacts in a networked system may fundamentally alter the properties of spreading processes and affect the condition for large-scale propagation, as encoded in the epidemic threshold. Despite the great interest in the problem for the physics, applied mathematics, computer science, and epidemiology communities, a full theoretical understanding is still missing and currently limited to the cases where the time-scale separation holds between spreading and network dynamics or to specific temporal network models. We consider a Markov chain description of the susceptible-infectious-susceptible process on an arbitrary temporal network. By adopting a multilayer perspective, we develop a general analytical derivation of the epidemic threshold in terms of the spectral radius of a matrix that encodes both network structure and disease dynamics. The accuracy of the approach is confirmed on a set of temporal models and empirical networks and against numerical results. In addition, we explore how the threshold changes when varying the overall time of observation of the temporal network, so as to provide insights on the optimal time window for data collection of empirical temporal networked systems. Our framework is of both fundamental and practical interest, as it offers novel understanding of the interplay between temporal networks and spreading dynamics.
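A schematic sketch of the threshold computation summarized above, under the assumption that the SIS propagator over one period of the temporal network is the product of per-snapshot matrices (1 - mu) I + lam A_t, with the threshold located where its spectral radius crosses 1. The random snapshot network below is a stand-in for real temporal-network data.

import numpy as np

def propagator_spectral_radius(snapshots, lam, mu):
    """Spectral radius of prod_t [(1 - mu) I + lam A_t] over one period."""
    n = snapshots[0].shape[0]
    P = np.eye(n)
    for A in snapshots:
        P = ((1.0 - mu) * np.eye(n) + lam * A) @ P
    return max(abs(np.linalg.eigvals(P)))

rng = np.random.default_rng(0)
raw = [(rng.random((50, 50)) < 0.05).astype(float) for _ in range(20)]
snapshots = [np.triu(A, 1) + np.triu(A, 1).T for A in raw]   # undirected snapshots

mu = 0.3
for lam in np.linspace(0.05, 0.4, 8):
    rho = propagator_spectral_radius(snapshots, lam, mu)
    regime = "super-critical" if rho > 1.0 else "sub-critical"
    print(f"lambda = {lam:.2f}  spectral radius = {rho:.3f}  ({regime})")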
Dinov, Ivo D
2016-01-01
Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.
NASA Astrophysics Data System (ADS)
Tschopp, M. A.; Murdoch, H. A.; Kecskes, L. J.; Darling, K. A.
2014-06-01
It is a new beginning for innovative fundamental and applied science in nanocrystalline materials. Many of the processing and consolidation challenges that have haunted nanocrystalline materials are now more fully understood, opening the doors for bulk nanocrystalline materials and parts to be produced. While challenges remain, recent advances in experimental, computational, and theoretical capability have allowed for bulk specimens that have heretofore been pursued only on a limited basis. This article discusses the methodology for synthesis and consolidation of bulk nanocrystalline materials using mechanical alloying, the alloy development and synthesis process for stabilizing these materials at elevated temperatures, and the physical and mechanical properties of nanocrystalline materials with a focus throughout on nanocrystalline copper and a nanocrystalline Cu-Ta system, consolidated via equal channel angular extrusion, with properties rivaling that of nanocrystalline pure Ta. Moreover, modeling and simulation approaches as well as experimental results for grain growth, grain boundary processes, and deformation mechanisms in nanocrystalline copper are briefly reviewed and discussed. Integrating experiments and computational materials science for synthesizing bulk nanocrystalline materials can bring about the next generation of ultrahigh strength materials for defense and energy applications.
High-fidelity plasma codes for burn physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooley, James; Graziani, Frank; Marinak, Marty
Accurate predictions of equation of state (EOS), ionic and electronic transport properties are of critical importance for high-energy-density plasma science. Transport coefficients inform radiation-hydrodynamic codes and impact diagnostic interpretation, which in turn impacts our understanding of the development of instabilities, the overall energy balance of burning plasmas, and the efficacy of self-heating from charged-particle stopping. Important processes include thermal and electrical conduction, electron-ion coupling, inter-diffusion, ion viscosity, and charged particle stopping. However, uncertainties in these coefficients are not well established. Fundamental plasma science codes, also called high-fidelity plasma codes, are a relatively recent computational tool that augments both experimental data and theoretical foundations of transport coefficients. This paper addresses the current status of HFPC codes and their future development, and the potential impact they have in improving the predictive capability of the multi-physics hydrodynamic codes used in HED design.
Role of theory in space science
NASA Technical Reports Server (NTRS)
1983-01-01
The goal of theory is to understand how the fundamental laws of physics and chemistry give rise to the features of the universe. It is recommended that NASA establish independent theoretical research programs in planetary sciences and in astrophysics similar to the solar-system plasma-physics theory program, which is characterized by stable, long-term support for theorists in university departments, NASA centers, and other organizations engaged in research in topics relevant to present and future space-derived data. It is recommended that NASA keep these programs under review to gain full benefit from the resulting research and to assure opportunities for the inflow of new ideas and investigators. Also, provisions should be made by NASA for the computing needs of the theorists in the programs. Finally, it is recommended that NASA involve knowledgeable theorists in mission planning activities at all levels, from the formulation of long-term scientific strategies through the planning and operation of specific missions.
NASA Astrophysics Data System (ADS)
Vilotte, J. P.; Atkinson, M.; Spinuso, A.; Rietbrock, A.; Michelini, A.; Igel, H.; Frank, A.; Carpené, M.; Schwichtenberg, H.; Casarotti, E.; Filgueira, R.; Garth, T.; Germünd, A.; Klampanos, I.; Krause, A.; Krischer, L.; Leong, S. H.; Magnoni, F.; Matser, J.; Moguilny, G.
2015-12-01
Seismology addresses both fundamental problems in understanding the Earth's internal wave sources and structures and augmented societal applications, like earthquake and tsunami hazard assessment and risk mitigation, and puts a premium on open data accessible through the Federated Digital Seismological Networks. The VERCE project, "Virtual Earthquake and seismology Research Community e-science environment in Europe", has initiated a virtual research environment to support complex orchestrated workflows combining state-of-the-art wave simulation codes and data analysis tools on distributed computing and data infrastructures (DCIs) along with multiple sources of observational data and new capabilities to combine simulation results with observational data. The VERCE Science Gateway provides a view of all the available resources, supporting collaboration with shared data and methods, with data access controls. The mapping to DCIs handles identity management, authority controls, transformations between representations and controls, and access to resources. The framework for computational science that provides simulation codes, like SPECFEM3D, democratizes their use by getting data from multiple sources, managing Earth models and meshes, distilling them as input data, and capturing results with meta-data. The dispel4py data-intensive framework allows for developing data-analysis applications using Python and the ObsPy library, which can be executed on different DCIs. A set of tools allows coupling with seismology and external data services. Provenance-driven tools validate results and show relationships between data to facilitate method improvement. Lessons learned from VERCE training lead us to conclude that solid-Earth scientists could make significant progress by using the VERCE e-science environment. VERCE has already contributed to the European Plate Observation System (EPOS), and is part of the EPOS implementation phase. Its cross-disciplinary capabilities are being extended for the EPOS implementation phase.
NASA Technical Reports Server (NTRS)
Hall, Justin R.; Hastrup, Rolf C.
1990-01-01
The principal challenges in providing effective deep space navigation, telecommunications, and information management architectures and designs for Mars exploration support are presented. The fundamental objectives are to provide the mission with the means to monitor and control mission elements, obtain science, navigation, and engineering data, compute state vectors and navigate, and to move these data efficiently and automatically between mission nodes for timely analysis and decision making. New requirements are summarized, and related issues and challenges including the robust connectivity for manned and robotic links, are identified. Enabling strategies are discussed, and candidate architectures and driving technologies are described.
A web based tool for storing and visualising data generated within a smart home.
McDonald, H A; Nugent, C D; Moore, G; Finlay, D D; Hallberg, J
2011-01-01
There is a growing need to re-assess the current approaches available to researchers for storing and managing heterogeneous data generated within a smart home environment. In our current work we have developed the homeML Application, a web-based tool to support researchers engaged in the area of smart home research as they perform experiments. Within this paper the homeML Application is presented, which includes the fundamental components of the homeML Repository and the homeML Toolkit. Results from a usability study conducted by 10 computer science researchers are presented, the initial results of which have been positive.
Education Potential of the National Virtual Observatory
NASA Astrophysics Data System (ADS)
Christian, Carol
2006-12-01
Research in astronomy is blossoming with the availability of sophisticated instrumentation and tools aimed at breakthroughs in our understanding of the physical universe. Researchers can take advantage of the astronomical infrastructure, the National Virtual Observatory (NVO), for their investigations. As well, data and tools available to the public are increasing through the distributed resources of observatories, academic institutions, computing facilities and educational organizations. Because astronomy holds the public interest through engaging content and striking a chord with fundamental questions of human interest, it is a perfect context for science and technical education. Through partnerships we are cultivating, the NVO can be tuned for educational purposes.
NASA Astrophysics Data System (ADS)
Hall, Justin R.; Hastrup, Rolf C.
1990-10-01
The principal challenges in providing effective deep space navigation, telecommunications, and information management architectures and designs for Mars exploration support are presented. The fundamental objectives are to provide the mission with the means to monitor and control mission elements, obtain science, navigation, and engineering data, compute state vectors and navigate, and to move these data efficiently and automatically between mission nodes for timely analysis and decision making. New requirements are summarized, and related issues and challenges including the robust connectivity for manned and robotic links, are identified. Enabling strategies are discussed, and candidate architectures and driving technologies are described.
Study on the tumor-induced angiogenesis using mathematical models.
Suzuki, Takashi; Minerva, Dhisa; Nishiyama, Koichi; Koshikawa, Naohiko; Chaplain, Mark Andrew Joseph
2018-01-01
We studied angiogenesis using mathematical models describing the dynamics of tip cells. We reviewed the basic ideas of angiogenesis models and its numerical simulation technique to produce realistic computer graphics images of sprouting angiogenesis. We examined the classical model of Anderson-Chaplain using fundamental concepts of mass transport and chemical reaction with ECM degradation included. We then constructed two types of numerical schemes, model-faithful and model-driven ones, where new techniques of numerical simulation are introduced, such as transient probability, particle velocity, and Boolean variables. © 2017 The Authors. Cancer Science published by John Wiley & Sons Australia, Ltd on behalf of Japanese Cancer Association.
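For orientation, one common schematic form of the tip-cell (endothelial cell density) equation in Anderson-Chaplain-type models couples random motility with chemotaxis up tumour angiogenic factor (TAF) gradients and haptotaxis up fibronectin gradients; exact production and uptake terms vary between formulations, so the following should be read as an indicative sketch rather than the precise system studied in the paper.

\[
\frac{\partial n}{\partial t} = D_n \nabla^2 n \;-\; \nabla\cdot\big(\chi(c)\, n\, \nabla c\big) \;-\; \nabla\cdot\big(\rho_0\, n\, \nabla f\big),
\qquad
\frac{\partial c}{\partial t} = -\eta\, n\, c,
\qquad
\frac{\partial f}{\partial t} = \beta\, n \;-\; \gamma\, n\, f,
\]

where \(n\) is the tip-cell density, \(c\) the TAF concentration, and \(f\) the fibronectin (ECM) density; ECM degradation enters through the uptake term in the \(f\) equation.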
NASA Astrophysics Data System (ADS)
Karasik, Valeriy; Ryzhii, Viktor; Yurchenko, Stanislav
2014-03-01
The 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies' (RJUS TeraTech - 2013) Bauman Moscow State Technical University Moscow, Russia, 3-6 June, 2013 The 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies' (RJUS TeraTech - 2013) was held in Bauman Moscow State Technical University on 3-6 June 2013 and was devoted to modern problems of terahertz optical technologies. RJUS TeraTech 2013 was organized by Bauman Moscow State Technical University in cooperation with Tohoku University (Sendai, Japan) and University of Buffalo (The State University of New York, USA). The Symposium was supported by Bauman Moscow State Technical University (Moscow, Russia) and Russian Foundation for Basic Research (grant number 13-08-06100-g). RJUS TeraTech - 2013 became a foundation for sharing and discussing modern and promising achievements in fundamental and applied problems of terahertz optical technologies, devices based on graphene and graphene structures, and condensed matter of different nature. Among participants of RJUS TeraTech - 2013, there were more than 100 researchers and students from different countries. This volume contains proceedings of the 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies'. Valeriy Karasik, Viktor Ryzhii and Stanislav Yurchenko Bauman Moscow State Technical University Symposium chair Anatoliy A Aleksandrov, Rector of BMSTU Symposium co-chair Valeriy E Karasik, Head of the Research and Educational Center 'PHOTONICS AND INFRARED TECHNOLOGY' (Russia) Invited Speakers Taiichi Otsuji, Research Institute of Electrical Communication, Tohoku University, Sendai, Japan Akira Satou, Research Institute of Electrical Communication, Tohoku University, Sendai, Japan Michael Shur, Electrical, Computer and System Engineering and Physics, Applied Physics, and Astronomy, Rensselaer Polytechnic Institute, NY, USA Natasha Kirova, University Paris-Sud, France Andrei Sergeev, Department of Electrical Engineering, The University of Buffalo, The State University of New York, Buffalo, NY, USA Magnus Willander, Linkoping University (LIU), Department of Science and Technology, Linköping, Sweden Dmitry R Khohlov, Physical Faculty, Lomonosov Moscow State University, Russia Vladimir L Vaks, Institute for Physics of Microstructures of Russian Academy of Sciences, Russia
PREFACE: International Conference on Applied Sciences (ICAS2014)
NASA Astrophysics Data System (ADS)
Lemle, Ludovic Dan; Jiang, Yiwen
2015-06-01
The International Conference on Applied Sciences (ICAS2014) took place in Hunedoara, Romania from 2-4 October 2014 at the Engineering Faculty of Hunedoara. The conference takes place alternately in Romania and in P.R. China and is organized by "Politehnica" University of Timisoara, Romania, and Military Economics Academy of Wuhan, P.R. China, with the aim to serve as a platform for exchange of information between various areas of applied sciences and to promote the communication between scientists of different nations, countries and continents. The topics of the conference covered a comprehensive spectrum of issues: 1. Economical Sciences 2. Engineering Sciences 3. Fundamental Sciences 4. Medical Sciences The conference gathered qualified researchers whose expertise can be used to develop new engineering knowledge that has the potential for application in economics, defense, medicine, etc. There were nearly 100 registered participants from six countries, and four invited and 56 oral talks were delivered during the two days of the conference. Based on the work presented at the conference, selected papers are included in this volume of IOP Conference Series: Materials Science and Engineering. These papers present new research in the various fields of Materials Engineering, Mechanical Engineering, Computer Engineering, and Mathematical Engineering. It is our great pleasure to present this volume of IOP Conference Series: Materials Science and Engineering to the scientific community to promote further research in these areas. We sincerely hope that the papers published in this volume will contribute to the advancement of knowledge in their respective fields.
NASA Astrophysics Data System (ADS)
Griffiths, Robert B.
2001-11-01
Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. A comprehensive account, written by one of the main figures in the field; a paperback edition of a successful work on the philosophy of quantum mechanics.
Programs for Fundamentals of Chemistry.
ERIC Educational Resources Information Center
Gallardo, Julio; Delgado, Steven
This document provides computer programs, written in BASIC PLUS, for presenting fundamental or remedial college chemistry students with chemical problems in a computer assisted instructional program. Programs include instructions, a sample run, and 14 separate practice sessions covering: mathematical operations, using decimals, solving…
Factors influencing exemplary science teachers' levels of computer use
NASA Astrophysics Data System (ADS)
Hakverdi, Meral
This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability with computers. The teachers' use of computer-related applications/tools during class, and their personal self-efficacy, age, and gender, are highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.
Multi-scale computation methods: Their applications in lithium-ion battery research and development
NASA Astrophysics Data System (ADS)
Siqi, Shi; Jian, Gao; Yue, Liu; Yan, Zhao; Qu, Wu; Wangwei, Ju; Chuying, Ouyang; Ruijuan, Xiao
2016-01-01
Based upon advances in theoretical algorithms, modeling and simulations, and computer technologies, the rational design of materials, cells, devices, and packs in the field of lithium-ion batteries is being realized incrementally and will at some point trigger a paradigm revolution by combining calculations and experiments linked by a big shared database, enabling accelerated development of the whole industrial chain. Theory and multi-scale modeling and simulation, as supplements to experimental efforts, can help greatly to close some of the current experimental and technological gaps, as well as predict path-independent properties and help to fundamentally understand path-independent performance in multiple spatial and temporal scales. Project supported by the National Natural Science Foundation of China (Grant Nos. 51372228 and 11234013), the National High Technology Research and Development Program of China (Grant No. 2015AA034201), and Shanghai Pujiang Program, China (Grant No. 14PJ1403900).
Nonlinear Aerodynamics and the Design of Wing Tips
NASA Technical Reports Server (NTRS)
Kroo, Ilan
1991-01-01
The analysis and design of wing tips for fixed wing and rotary wing aircraft still remains part art, part science. Although the design of airfoil sections and basic planform geometry is well developed, the tip regions require more detailed consideration. This is important because of the strong impact of wing tip flow on wing drag; although the tip region constitutes a small portion of the wing, its effect on the drag can be significant. The induced drag of a wing is, for a given lift and speed, inversely proportional to the square of the wing span. Concepts are proposed as a means of reducing drag. Modern computational methods provide a tool for studying these issues in greater detail. The purpose of the current research program is to improve the understanding of the fundamental issues involved in the design of wing tips and to develop the range of computational and experimental tools needed for further study of these ideas.
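The span-squared statement above corresponds to the standard induced-drag relation, written here for orientation; \(e\) denotes the span efficiency factor and \(q\) the dynamic pressure.

\[
D_i = \frac{L^2}{q\, \pi\, b^2\, e}, \qquad q = \tfrac{1}{2}\rho V^2 ,
\]

so that, at fixed lift \(L\) and flight speed \(V\), the induced drag \(D_i\) scales as \(1/b^2\).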
NASA Astrophysics Data System (ADS)
Volonte, S.
2018-04-01
The Space Science Programme of ESA encompasses three broad areas of investigation, namely solar system science (the Sun, the planets and space plasmas), fundamental physics and space astronomy and astrophysics.
Research Objectives for Human Missions in the Proving Ground of Cis-Lunar Space
NASA Technical Reports Server (NTRS)
Niles, P. B.; Eppler, D. B.; Kennedy, K. J.; Lewis, R.; Spann, J. F.; Sullivan, T. A.
2016-01-01
Beginning as early as 2023, crewed missions beyond low Earth orbit will begin, enabled by the new capabilities of the SLS and Orion vehicles. This will initiate the "Proving Ground" phase of human exploration, with Mars as an ultimate destination. The primary goal of the Proving Ground is to demonstrate the capability of suitably long duration spaceflight without the need for continuous support from Earth, i.e., to become Earth independent. A major component of the Proving Ground phase is to conduct research activities aimed at accomplishing major objectives selected from a wide variety of disciplines including, but not limited to: Astronomy, Heliophysics, Fundamental Physics, Planetary Science, Earth Science, Human Systems, Fundamental Space Biology, Microgravity, and In Situ Resource Utilization. Mapping and prioritizing the most important objectives from these disciplines will provide a strong foundation for establishing the architecture to be utilized in the Proving Ground.
Macroscopic characterisations of Web accessibility
NASA Astrophysics Data System (ADS)
Lopes, Rui; Carriço, Luis
2010-12-01
The Web Science framework poses fundamental questions on the analysis of the Web, by focusing on how microscopic properties (e.g. at the level of a Web page or Web site) emerge into macroscopic properties and phenomena. One research topic on the analysis of the Web is Web accessibility evaluation, which centres on understanding how accessible a Web page is for people with disabilities. However, when framing Web accessibility evaluation within Web Science, we have found that existing research stays at the microscopic level. This article presents an experimental study on framing Web accessibility evaluation within Web Science's goals. This study resulted in novel accessibility properties of the Web not found at microscopic levels, as well as of Web accessibility evaluation processes themselves. We observed at large scale some of the empirical knowledge on how accessibility is perceived by designers and developers, such as the disparity of interpretations of accessibility evaluation tool warnings. We also found a direct relation between accessibility quality and Web page complexity. We provide a set of guidelines for designing Web pages, education on Web accessibility, as well as on the computational limits of large-scale Web accessibility evaluations.
Science and technology integration for increased human potential and societal outcomes.
Roco, Mihail C
2004-05-01
Unifying science based on the material unity of nature at the nanoscale provides a new foundation for knowledge, innovation, and integration of technology. Revolutionary and synergistic advances at the interfaces between previously separated fields of science, engineering and areas of relevance are ready to create nano-bio-info-cogno (NBIC) transforming tools. Developments in systems approach, mathematics, and computation in conjunction with NBIC allow us to understand the natural world and scientific research as closely coupled, complex, hierarchical entities. At this unique moment of scientific and technical achievement, improvement of human performance at individual and group levels, as well as development of suitable revolutionary products, becomes possible and these are primary goals for converging new technologies. NBIC addresses long-term advances in key areas of human activity, including working, learning, aging, group interaction, organizations, and human evolution (Roco and Bainbridge, 2003). Fundamentally new tools, technologies, and products will be integrated into individual and social human architecture. This introductory chapter of the Annals outlines research and education trends, funding activities, and the potential of development of revolutionary products and services.
The Now Age, New Space, and Transforming the Exploration of Geospace
NASA Astrophysics Data System (ADS)
Paxton, L. J.
2017-12-01
In this talk I will discuss: 1) changing our description of how and why we do Heliophysics (NASA) and Geospace Science (NSF) research; 2) how we can take advantage of the New Space industry capabilities; and 3) how and why we can use the technology that has begun the transformation of our society into the "Now Age". I will discuss trends that I see that enable, if we have the will, a fundamental revitalization of the science that we aspire to do. I will focus on our opportunities to revolutionize the exploration of geospace (the region below about 1000 km) and how that addresses fundamental questions about our place in the universe. Exploration of space, in particular exploration of geospace, is at a cusp - we can either attempt to continue to move forward using the same, tried and true techniques or we can embrace the "Now Age" and the capabilities enabled by the New Space industry to move forward to a fuller understanding of our world's place in the solar system. Heliophysics at NASA and Geospace Science at NSF can be recast as fundamental exploratory basic research that asks and answers questions that everyone can understand. We are in the Now Age because the human race has enabled and embraced a fundamentally different way of accessing information and, potentially, gaining knowledge. For the first time, we have the capability to provide essentially all of recorded human knowledge immediately and to anyone - and people want that access "now". Even in the scientific community we expect to be able to see the latest data right now. This is enabled by the internet and ubiquitous connectivity; low-cost data storage and memory; fast, low-cost computing; the means to visualize the information; advances in the way we store, catalog and retrieve information; and advances in modeling and simulation. Concomitant with the Now Age, and providing an impetus to do things "now", the New Space industry has enabled low-cost access to space and has embraced a vision of human presence in space that goes far beyond anything considered by NASA. To make all of these abstractions concrete, I will describe how we could reimagine a Heliophysics Roadmap space mission, such as Geospace Dynamics Constellation (GDC), by taking advantage of these capabilities. This test case is intended solely as a heuristic example rather than an alternative formulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crabtree, George; Glotzer, Sharon; McCurdy, Bill
This report is based on an SC Workshop on Computational Materials Science and Chemistry for Innovation held on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness.
The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: (1) Integration of synthesis, processing, characterization, theory, and simulation and modeling; many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. (2) Achieving/strengthening predictive capability in foundational challenge areas; predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. (3) Developing validated computational approaches that span vast differences in time and length scales; this fundamental computational challenge crosscuts all of the foundational challenge areas, and similarly challenging is the coupling of analytical data from the multiple instruments and techniques required to link these length and time scales. (4) Experimental validation and quantification of uncertainty in simulation and modeling; uncertainty quantification becomes increasingly challenging as simulations become more complex. (5) Robust and sustainable computational infrastructure, including software and applications; for modeling and simulation, software equals infrastructure, and to validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding, so an integrated approach for managing this infrastructure is essential. (6) Efficient transfer and incorporation of simulation-based engineering and science in industry; strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.
Multiuser Collaboration with Networked Mobile Devices
NASA Technical Reports Server (NTRS)
Tso, Kam S.; Tai, Ann T.; Deng, Yong M.; Becks, Paul G.
2006-01-01
In this paper we describe a multiuser collaboration infrastructure that enables multiple mission scientists to remotely and collaboratively interact with visualization and planning software, using wireless networked personal digital assistants (PDAs) and other mobile devices. During ground operations of planetary rover and lander missions, scientists need to meet daily to review downlinked data and plan science activities. For example, scientists use the Science Activity Planner (SAP) in the Mars Exploration Rover (MER) mission to visualize downlinked data and plan rover activities during the science meetings [1]. Computer displays are projected onto large screens in the meeting room to enable the scientists to view and discuss downlinked images and data displayed by SAP and other software applications. However, only one person can interact with the software applications because input to the computer is limited to a single mouse and keyboard. As a result, the scientists have to verbally express their intentions, such as selecting a target at a particular location on the Mars terrain image, to that person in order to interact with the applications. This constrains communication and limits the returns of science planning. Furthermore, ground operations for Mars missions are fundamentally constrained by the short turnaround time for science and engineering teams to process and analyze data, plan the next uplink, generate command sequences, and transmit the uplink to the vehicle [2]. Therefore, improving ground operations is crucial to the success of Mars missions. The multiuser collaboration infrastructure enables users to control software applications remotely and collaboratively using mobile devices. The infrastructure includes (1) human-computer interaction techniques to provide natural, fast, and accurate inputs, (2) a communications protocol to ensure reliable and efficient coordination of the input devices and host computers, (3) an application-independent middleware that maintains the states, sessions, and interactions of individual users of the software applications, and (4) an application programming interface to enable tight integration of applications and the middleware. The infrastructure is able to support any software application running under the Windows or Unix platforms. The resulting technologies are not only applicable to NASA mission operations but also useful in other situations such as design reviews, brainstorming sessions, and business meetings, as these can benefit from having the participants concurrently interact with the software applications (e.g., presentation applications and CAD design tools) to illustrate their ideas and provide inputs.
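As a rough sketch of the kind of coordination the middleware and communications protocol described above must provide, the example below routes input events from several mobile devices to a shared application through a per-application session registry. The message format, class names, and application name are hypothetical illustrations; the actual MER/SAP protocol is not specified in the abstract.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class InputEvent:
    user: str       # which scientist's device produced the event
    app: str        # target application, e.g. "SAP"
    action: str     # e.g. "select_target"
    payload: dict   # action-specific data such as terrain coordinates

@dataclass
class CollaborationHub:
    """Hypothetical application-independent middleware: keeps per-application
    sessions and fans incoming events out to the registered application handler."""
    handlers: Dict[str, Callable[[InputEvent], None]] = field(default_factory=dict)
    sessions: Dict[str, List[str]] = field(default_factory=dict)  # app -> joined users

    def register_app(self, app: str, handler: Callable[[InputEvent], None]) -> None:
        self.handlers[app] = handler

    def join(self, user: str, app: str) -> None:
        self.sessions.setdefault(app, []).append(user)

    def submit(self, event: InputEvent) -> None:
        # only users who joined the session may drive the application
        if event.user in self.sessions.get(event.app, []):
            self.handlers[event.app](event)

hub = CollaborationHub()
hub.register_app("SAP", lambda e: print(f"{e.user} -> {e.action} {e.payload}"))
hub.join("scientist_a", "SAP")
hub.submit(InputEvent("scientist_a", "SAP", "select_target", {"x": 12.4, "y": 7.9}))
```

In a real deployment the `submit` call would arrive over a reliable wireless transport rather than an in-process function call, which is where the coordination protocol described in the abstract comes in.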
NASA Astrophysics Data System (ADS)
Bender, Jason D.
Understanding hypersonic aerodynamics is important for the design of next-generation aerospace vehicles for space exploration, national security, and other applications. Ground-level experimental studies of hypersonic flows are difficult and expensive; thus, computational science plays a crucial role in this field. Computational fluid dynamics (CFD) simulations of extremely high-speed flows require models of chemical and thermal nonequilibrium processes, such as dissociation of diatomic molecules and vibrational energy relaxation. Current models are outdated and inadequate for advanced applications. We describe a multiscale computational study of gas-phase thermochemical processes in hypersonic flows, starting at the atomic scale and building systematically up to the continuum scale. The project was part of a larger effort centered on collaborations between aerospace scientists and computational chemists. We discuss the construction of potential energy surfaces for the N4, N2O2, and O4 systems, focusing especially on the multi-dimensional fitting problem. A new local fitting method named L-IMLS-G2 is presented and compared with a global fitting method. Then, we describe the theory of the quasiclassical trajectory (QCT) approach for modeling molecular collisions. We explain how we implemented the approach in a new parallel code for high-performance computing platforms. Results from billions of QCT simulations of high-energy N2 + N2, N2 + N, and N2 + O2 collisions are reported and analyzed. Reaction rate constants are calculated and sets of reactive trajectories are characterized at both thermal equilibrium and nonequilibrium conditions. The data shed light on fundamental mechanisms of dissociation and exchange reactions -- and their coupling to internal energy transfer processes -- in thermal environments typical of hypersonic flows. We discuss how the outcomes of this investigation and other related studies lay a rigorous foundation for new macroscopic models for hypersonic CFD. This research was supported by the Department of Energy Computational Science Graduate Fellowship and by the Air Force Office of Scientific Research Multidisciplinary University Research Initiative.
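To make the trajectory-to-rate-constant step concrete, the sketch below applies the textbook QCT estimator for a thermal rate coefficient, assuming impact parameters sampled uniformly in b² up to b_max and thermally sampled collision energies. The numbers and function name are illustrative assumptions, not results or code from this study.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def qct_thermal_rate(n_reactive, n_total, b_max, mu, T):
    """Thermal rate coefficient k(T) from quasiclassical trajectory counts.

    n_reactive, n_total : reactive and total trajectory counts in the batch
    b_max               : maximum impact parameter sampled, m
    mu                  : reduced mass of the colliding pair, kg
    T                   : temperature, K
    Returns k(T) in m^3/s per colliding pair.
    """
    mean_rel_speed = math.sqrt(8.0 * K_B * T / (math.pi * mu))
    reaction_prob = n_reactive / n_total
    return mean_rel_speed * math.pi * b_max**2 * reaction_prob

# illustrative numbers only (hypothetical batch for an N2 + N2-like pair)
mu_n2_n2 = 0.5 * 28.0 * 1.66054e-27  # reduced mass of two N2 molecules, kg
k = qct_thermal_rate(n_reactive=1200, n_total=5_000_000,
                     b_max=4.0e-10, mu=mu_n2_n2, T=10000.0)
print(f"k(T) ~ {k:.3e} m^3/s")
```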
Terahertz science and technology of carbon nanomaterials.
Hartmann, R R; Kono, J; Portnoi, M E
2014-08-15
The diverse applications of terahertz (THz) radiation and its importance to fundamental science make finding ways to generate, manipulate and detect THz radiation one of the key areas of modern applied physics. One approach is to utilize carbon nanomaterials, in particular, single-wall carbon nanotubes and graphene. Their novel optical and electronic properties offer much promise to the field of THz science and technology. This article describes the past, present, and future of THz science and technology of carbon nanotubes and graphene. We will review fundamental studies such as THz dynamic conductivity, THz nonlinearities and ultrafast carrier dynamics as well as THz applications such as THz sources, detectors, modulators, antennas and polarizers.
The Fundamental Neutron Physics Facilities at NIST.
Nico, J S; Arif, M; Dewey, M S; Gentile, T R; Gilliam, D M; Huffman, P R; Jacobson, D L; Thompson, A K
2005-01-01
The program in fundamental neutron physics at the National Institute of Standards and Technology (NIST) began nearly two decades ago. The Neutron Interactions and Dosimetry Group currently maintains four neutron beam lines dedicated to studies of fundamental neutron interactions. The neutrons are provided by the NIST Center for Neutron Research, a national user facility for studies that include condensed matter physics, materials science, nuclear chemistry, and biological science. The beam lines for fundamental physics experiments include a high-intensity polychromatic beam, a 0.496 nm monochromatic beam, a 0.89 nm monochromatic beam, and a neutron interferometer and optics facility. This paper discusses some of the parameters of the beam lines along with brief presentations of some of the experiments performed at the facilities.
Digital and biological computing in organizations.
Kampfner, Roberto R
2002-01-01
Michael Conrad unveiled many of the fundamental characteristics of biological computing. Underlying the behavioral variability and the adaptability of biological systems are these characteristics, including the ability of biological information processing to exploit quantum features at the atomic level, the powerful 3-D pattern recognition capabilities of macromolecules, the computational efficiency, and the ability to support biological function. Among many other things, Conrad formalized and explicated the underlying principles of biological adaptability, characterized the differences between biological and digital computing in terms of a fundamental tradeoff between adaptability and programmability of information processing, and discussed the challenges of interfacing digital computers and human society. This paper is about the encounter between biological and digital computing. The focus is on the nature of the biological information processing infrastructure of organizations and how it can be extended effectively with digital computing. In order to achieve this goal effectively, however, we need to properly embed digital computing into the information processing aspects of human and social behavior and intelligence, which are fundamentally biological. Conrad's legacy provides a firm, strong, and inspiring foundation for this endeavor.
Earth Sciences Division Research Summaries 2006-2007
DOE Office of Scientific and Technical Information (OSTI.GOV)
DePaolo, Donald; DePaolo, Donald
2008-07-21
Research in earth and atmospheric sciences has become increasingly important in light of the energy, climate change, and other environmental issues facing the United States and the world. The development of new energy resources other than fossil hydrocarbons, the safe disposal of nuclear waste and greenhouse gases, and a detailed understanding of the climatic consequences of our energy choices are all critical to meeting energy needs while ensuring environmental safety. The cleanup of underground contamination and the preservation and management of water supplies continue to provide challenges, as they will for generations into the future. To address the critical energy and environmental issues requires continuing advances in our knowledge of Earth systems and our ability to translate that knowledge into new technologies. The fundamental Earth science research common to energy and environmental issues largely involves the physics, chemistry, and biology of fluids in and on the Earth. To manage Earth fluids requires the ability to understand their properties and behavior at the most fundamental molecular level, as well as prediction, characterization, imaging, and manipulation of those fluids and their behavior in real Earth reservoirs. The broad range of disciplinary expertise, the huge range of spatial and time scales, and the need to integrate theoretical, computational, laboratory and field research represent both the challenge and the excitement of Earth science research. The Earth Sciences Division (ESD) of the Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab) is committed to addressing the key scientific and technical challenges that are needed to secure our energy future in an environmentally responsible way. Our staff of over 200 scientists, UC Berkeley faculty, support staff and guests perform world-acclaimed fundamental research in hydrogeology and reservoir engineering, geophysics and geomechanics, geochemistry, microbial ecology, climate systems, and environmental engineering. Building on this scientific foundation, we also perform applied earth science research and technology development to support DOE in a number of its program areas. We currently organize our efforts in the following Division Programs: Fundamental and Exploratory Research--fundamental research in geochemistry, geophysics, and hydrology to provide a basis for new and improved energy and environmental technologies; Climate and Carbon Sciences--carbon cycling in the terrestrial biosphere and oceans, and global and regional climate modeling, are the cornerstones of a major developing divisional research thrust related to understanding and mitigating the effects of increased greenhouse gas concentrations in the atmosphere; Energy Resources--collaborative projects with industry to develop or improve technologies for the exploration and production of oil, gas, and geothermal reservoirs, and for the development of bioenergy; Environmental Remediation and Water Resources--innovative technologies for locating, containing, and remediating metals, radionuclides, chlorinated solvents, and energy-related contaminants in soils and groundwaters; Geologic Carbon Sequestration--development and testing of methods for introducing carbon dioxide to subsurface geologic reservoirs, and predicting and monitoring its subsequent migration; and Nuclear Waste and Energy--theoretical, experimental, and simulation studies of the unsaturated zone at Yucca Mountain, Nevada.
These programs draw from each of ESD's disciplinary departments: Climate Science, Ecology, Geochemistry, Geophysics, and Hydrogeology. Short descriptions of these departments are provided as introductory material. In this document, we present summaries of selected current research projects. While it is not a complete accounting, the projects described here are representative of the nature and breadth of the ESD research effort. We are proud of our scientific accomplishments and we hope that you will find this material useful and exciting. A list of publications for the period from January 2006 to June 2007, along with a listing of our personnel, is also appended. Any comments on our research are appreciated and can be sent to me personally.
Assessing the Science Knowledge of University Students: Perils, Pitfalls and Possibilities
ERIC Educational Resources Information Center
Jones, Susan M.
2014-01-01
Science content knowledge is internationally regarded as a fundamentally important learning outcome for graduates of bachelor level science degrees: the Science Threshold Learning Outcomes (TLOs) recently adopted in Australia as a nationally agreed framework include "Science Knowledge" as TLO 2. Science knowledge is commonly assessed…
NASA Astrophysics Data System (ADS)
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-12-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are more interested in the use of computers than in programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were less involved in computing activities, whereas male students were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of playing with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.
Data Science and its Relationship to Big Data and Data-Driven Decision Making.
Provost, Foster; Fawcett, Tom
2013-03-01
Companies have realized they need to hire data scientists, academic institutions are scrambling to put together data-science programs, and publications are touting data science as a hot, even "sexy," career choice. However, there is confusion about what exactly data science is, and this confusion could lead to disillusionment as the concept diffuses into meaningless buzz. In this article, we argue that there are good reasons why it has been hard to pin down exactly what data science is. One reason is that data science is intricately intertwined with other concepts of growing importance, such as big data and data-driven decision making. Another reason is the natural tendency to associate what a practitioner does with the definition of the practitioner's field; this can result in overlooking the fundamentals of the field. We believe that trying to define the boundaries of data science precisely is not of the utmost importance. We can debate the boundaries of the field in an academic setting, but in order for data science to serve business effectively, it is important (i) to understand its relationships to other important related concepts, and (ii) to begin to identify the fundamental principles underlying data science. Once we embrace (ii), we can much better understand and explain exactly what data science has to offer. Furthermore, only once we embrace (ii) should we be comfortable calling it data science. In this article, we present a perspective that addresses all these concepts. We close by offering, as examples, a partial list of fundamental principles underlying data science.
ERIC Educational Resources Information Center
Bragesjo, Fredrik; Elzinga, Aant; Kasperowski, Dick
2012-01-01
The objective of this paper is to balance two major conceptual tendencies in science policy studies, continuity and discontinuity theory. While the latter argue for fundamental and distinct changes in science policy in the late 20th century, continuity theorists show how changes do occur but not as abrupt and fundamental as discontinuity theorists…
Synthesis and thermoelectric properties of Nd-single filled p-type skutterudites
NASA Astrophysics Data System (ADS)
Wu, Hong; Shaheen, Nusrat; Yang, Heng-Quan; Peng, Kun-Ling; Shen, Xing-Chen; Wang, Guo-Yu; Lu, Xu; Zhou, Xiao-Yuan
2018-04-01
Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 11674040, 11404044, 51472036, 51672270, and 51401202), the Fundamental Research Funds for the Central Universities (Grant No. 106112016CDJZR308808), the 100 Talent Program of the Chinese Academy of Sciences (Grant No. 2013-46), and the Project for Fundamental and Frontier Research in Chongqing, China (Grant No. CSTC2015JCYJBX0026).
Development of EarthCube Governance: An Agile Approach
NASA Astrophysics Data System (ADS)
Pearthree, G.; Allison, M. L.; Patten, K.
2013-12-01
Governance of geosciences cyberinfrastructure is a complex and essential undertaking, critical in enabling distributed knowledge communities to collaborate and communicate across disciplines, distances, and cultures. Advancing science with respect to "grand challenges," such as global climate change, weather prediction, and core fundamental science, depends not just on technical cyber systems, but also on social systems for strategic planning, decision-making, project management, learning, teaching, and building a community of practice. Simply put, a robust, agile technical system depends on an equally robust and agile social system. Cyberinfrastructure development is wrapped in social, organizational and governance challenges, which may significantly impede progress. An agile development process is underway for governance of transformative investments in geosciences cyberinfrastructure through the NSF EarthCube initiative. Agile development is iterative and incremental, and promotes adaptive planning and rapid and flexible response. Such iterative deployment across a variety of EarthCube stakeholders encourages transparency, consensus, accountability, and inclusiveness. A project Secretariat acts as the coordinating body, carrying out duties for planning, organizing, communicating, and reporting. A broad coalition of stakeholder groups comprises an Assembly (Mainstream Scientists, Cyberinfrastructure Institutions, Information Technology/Computer Sciences, NSF EarthCube Investigators, Science Communities, EarthCube End-User Workshop Organizers, Professional Societies) to serve as a preliminary venue for identifying, evaluating, and testing potential governance models. To offer an opportunity for broader end-user input, a crowd-sourcing approach will engage stakeholders not otherwise involved. An Advisory Committee from the Earth, ocean, atmosphere, social, computer and library sciences is guiding the process from a high-level policy point of view. Developmental evaluators from the social sciences embedded in the project provide real-time review and adjustments. While a large number of agencies and organizations have agreed to participate, in order to ensure an open and inclusive process, community-selected leaders yet to be identified will play key roles through an Assembly Advisory Council. Once consensus is reached on a governing framework, a community-selected demonstration governance pilot will help facilitate community convergence on system design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duignan, Timothy T.; Baer, Marcel D.; Schenter, Gregory K.
Determining the solvation free energies of single ions in water is one of the most fundamental problems in physical chemistry and yet many unresolved questions remain. In particular, the ability to decompose the solvation free energy into simple and intuitive contributions will have important implications for coarse grained models of electrolyte solution. Here, we provide rigorous definitions of the various types of single ion solvation free energies based on different simulation protocols. We calculate solvation free energies of charged hard spheres using density functional theory interaction potentials with molecular dynamics simulation (DFT-MD) and isolate the effects of charge and cavitation, comparing to the Born (linear response) model. We show that using uncorrected Ewald summation leads to highly unphysical values for the solvation free energy and that charging free energies for cations are approximately linear as a function of charge but that there is a small non-linearity for small anions. The charge hydration asymmetry (CHA) for hard spheres, determined with quantum mechanics, is much larger than for the analogous real ions. This suggests that real ions, particularly anions, are significantly more complex than simple charged hard spheres, a commonly employed representation. We would like to thank Thomas Beck, Shawn Kathmann, Richard Remsing and John Weeks for helpful discussions. Computing resources were generously allocated by PNNL's Institutional Computing program. This research also used resources of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. TTD, GKS, and CJM were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. MDB was supported by MS3 (Materials Synthesis and Simulation Across Scales) Initiative, a Laboratory Directed Research and Development Program at Pacific Northwest National Laboratory (PNNL). PNNL is a multi-program national laboratory operated by Battelle for the U.S. Department of Energy.
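For orientation, the Born (linear response) model mentioned above has a simple closed form; the snippet below evaluates it for a hypothetical monovalent ion-sized cavity. It is the standard reference formula only, not the DFT-MD protocol used in the paper.

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
N_A = 6.02214076e23          # Avogadro constant, 1/mol

def born_solvation_energy(q_units, radius_m, eps_r=78.4):
    """Born estimate of the ion solvation free energy, in J/mol.

    q_units  : ionic charge in units of e
    radius_m : Born (cavity) radius in metres
    eps_r    : relative permittivity of the solvent (water ~ 78.4)
    """
    q = q_units * E_CHARGE
    dG_per_ion = -(q**2 / (8.0 * math.pi * EPS0 * radius_m)) * (1.0 - 1.0 / eps_r)
    return dG_per_ion * N_A

# hypothetical monovalent cation with a 2.0 Angstrom Born radius
print(f"{born_solvation_energy(1, 2.0e-10) / 1000.0:.1f} kJ/mol")
```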
DOE Office of Scientific and Technical Information (OSTI.GOV)
De, K; Jha, S; Klimentov, A
2016-01-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in United States, Europe and Russia (in particular with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF), MIRA supercomputer at Argonne Leadership Computing Facilities (ALCF), Supercomputer at the National Research Center Kurchatov Institute, IT4 in Ostrava and others). Current approach utilizes modified PanDA pilot framework for job submission to the supercomputers batch queues and local data management, with light-weight MPI wrappers to run single threaded workloads in parallel on LCFs multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for ALICE and ATLAS experiments and it is in full production for the ATLAS experiment since September 2015. We will present our current accomplishments with running PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
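A minimal sketch of the "light-weight MPI wrapper" idea, assuming mpi4py and a stand-in payload command: each MPI rank launches one serial payload and rank 0 collects the per-rank return codes. This is an illustrative pattern under assumed names, not the actual PanDA pilot code.

```python
"""Minimal MPI fan-out wrapper: each rank runs one serial payload.

Launch with, e.g.:  mpirun -np 16 python mpi_wrapper.py
(The payload command and its "event range" argument are hypothetical.)
"""
import subprocess
import sys
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# each rank processes a different slice of work, identified here only by its rank
payload = ["echo", f"simulating event range {rank} of {size}"]  # stand-in for a real serial job
result = subprocess.run(payload, capture_output=True, text=True)
print(f"[rank {rank}/{size}] rc={result.returncode} out={result.stdout.strip()}")

# rank 0 gathers return codes so the wrapper can report a single job status
return_codes = comm.gather(result.returncode, root=0)
if rank == 0:
    ok = all(rc == 0 for rc in return_codes)
    print(f"job {'finished' if ok else 'failed'}; per-rank return codes: {return_codes}")
    sys.exit(0 if ok else 1)
```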
ERIC Educational Resources Information Center
Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2013-01-01
Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…
15 CFR 734.8 - Information resulting from fundamental research.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 15 Commerce and Foreign Trade 2 2012-01-01 2012-01-01 false Information resulting from fundamental... OF THE EXPORT ADMINISTRATION REGULATIONS § 734.8 Information resulting from fundamental research. (a... applied research in science and engineering, where the resulting information is ordinarily published and...
15 CFR 734.8 - Information resulting from fundamental research.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 15 Commerce and Foreign Trade 2 2014-01-01 2014-01-01 false Information resulting from fundamental... OF THE EXPORT ADMINISTRATION REGULATIONS § 734.8 Information resulting from fundamental research. (a... applied research in science and engineering, where the resulting information is ordinarily published and...
Creationism as Science: What Every Teacher-Scientist Should Know.
ERIC Educational Resources Information Center
Gatzke, Ken W.
1985-01-01
Addresses philosophical problems of the evolution/creationism debate (including underlying assumptions of creationism and nature of science), suggesting that creationism cannot be presented as science in science courses because it fails to qualify as a science. Prediction and explanation, absolute creationism, and a fundamental difficulty in…
Fixing convergence of Gaussian belief propagation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jason K; Bickson, Danny; Dolev, Danny
Gaussian belief propagation (GaBP) is an iterative message-passing algorithm for inference in Gaussian graphical models. It is known that when GaBP converges it converges to the correct MAP estimate of the Gaussian random vector and simple sufficient conditions for its convergence have been established. In this paper we develop a double-loop algorithm for forcing convergence of GaBP. Our method computes the correct MAP estimate even in cases where standard GaBP would not have converged. We further extend this construction to compute least-squares solutions of over-constrained linear systems. We believe that our construction has numerous applications, since the GaBP algorithm is linked to solution of linear systems of equations, which is a fundamental problem in computer science and engineering. As a case study, we discuss the linear detection problem. We show that using our new construction, we are able to force convergence of Montanari's linear detection algorithm, in cases where it would originally fail. As a consequence, we are able to increase significantly the number of users that can transmit concurrently.
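For readers unfamiliar with the link between GaBP and linear systems, the sketch below implements plain GaBP (with a sequential message-update schedule) for A x = b. It illustrates only the basic scheme, which converges for, e.g., tree-structured or diagonally dominant systems; it is not the double-loop convergence fix developed in the paper.

```python
import numpy as np

def gabp_solve(A, b, max_iter=200, tol=1e-10):
    """Plain Gaussian belief propagation for A x = b (A symmetric).

    A is interpreted as the precision matrix of a Gaussian graphical model and
    b as the potential vector; at a fixed point the posterior means equal the
    solution of the linear system.
    """
    n = len(b)
    P = np.zeros((n, n))   # P[i, j]: precision of the message i -> j
    mu = np.zeros((n, n))  # mu[i, j]: mean of the message i -> j
    for _ in range(max_iter):
        P_prev, mu_prev = P.copy(), mu.copy()
        for i in range(n):
            for j in range(n):
                if i == j or A[i, j] == 0.0:
                    continue
                # "cavity" precision and mean at node i, excluding j's message
                P_cav = A[i, i] + P[:, i].sum() - P[j, i]
                mu_cav = (b[i] + P[:, i] @ mu[:, i] - P[j, i] * mu[j, i]) / P_cav
                P[i, j] = -A[i, j] ** 2 / P_cav
                mu[i, j] = P_cav * mu_cav / A[i, j]
        if max(np.abs(P - P_prev).max(), np.abs(mu - mu_prev).max()) < tol:
            break
    # combine all incoming messages: posterior means = solution of A x = b
    diag = A.diagonal() + P.sum(axis=0)
    return (b + (P * mu).sum(axis=0)) / diag

# tree-structured (chain) system, where GaBP is exact
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])
b = np.array([1.0, 2.0, 3.0])
print(gabp_solve(A, b), np.linalg.solve(A, b))
```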
On Parallel Push-Relabel based Algorithms for Bipartite Maximum Matching
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langguth, Johannes; Azad, Md Ariful; Halappanavar, Mahantesh
2014-07-01
We study multithreaded push-relabel based algorithms for computing maximum cardinality matching in bipartite graphs. Matching is a fundamental combinatorial (graph) problem with applications in a wide variety of problems in science and engineering. We are motivated by its use in the context of sparse linear solvers for computing maximum transversal of a matrix. We implement and test our algorithms on several multi-socket multicore systems and compare their performance to state-of-the-art augmenting path-based serial and parallel algorithms using a testset comprised of a wide range of real-world instances. Building on several heuristics for enhancing performance, we demonstrate good scaling for the parallel push-relabel algorithm. We show that it is comparable to the best augmenting path-based algorithms for bipartite matching. To the best of our knowledge, this is the first extensive study of multithreaded push-relabel based algorithms. In addition to a direct impact on the applications using matching, the proposed algorithmic techniques can be extended to preflow-push based algorithms for computing maximum flow in graphs.
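For context, the sketch below is the simple serial augmenting-path (Kuhn's) algorithm that push-relabel codes are typically benchmarked against; it is a baseline illustration, not the parallel push-relabel method studied in the paper.

```python
def max_bipartite_matching(adj, n_right):
    """Serial augmenting-path (Kuhn's) algorithm for bipartite matching.

    adj[u] lists the right-side vertices adjacent to left-side vertex u.
    Returns the matching size and, for each right vertex, its matched left vertex.
    """
    match_right = [-1] * n_right  # match_right[v] = left vertex matched to v, or -1

    def try_augment(u, visited):
        for v in adj[u]:
            if v in visited:
                continue
            visited.add(v)
            # v is free, or its current partner can be re-matched elsewhere
            if match_right[v] == -1 or try_augment(match_right[v], visited):
                match_right[v] = u
                return True
        return False

    size = sum(try_augment(u, set()) for u in range(len(adj)))
    return size, match_right

# tiny example: 3 left vertices, 3 right vertices
adj = [[0, 1], [0], [1, 2]]
print(max_bipartite_matching(adj, n_right=3))  # -> (3, [1, 0, 2])
```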
Statistical benchmark for BosonSampling
NASA Astrophysics Data System (ADS)
Walschaers, Mattia; Kuipers, Jack; Urbina, Juan-Diego; Mayer, Klaus; Tichy, Malte Christopher; Richter, Klaus; Buchleitner, Andreas
2016-03-01
Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church-Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows one to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go far beyond mere bunching or anti-bunching effects.
Evanescent wave fluorescence biosensors: Advances of the last decade
Taitt, Chris Rowe; Anderson, George P.; Ligler, Frances S.
2015-01-01
Biosensor development has been a highly dynamic field of research and has progressed rapidly over the past two decades. The advances have accompanied the breakthroughs in molecular biology, nanomaterial sciences, and most importantly computers and electronics. The subfield of evanescent wave fluorescence biosensors has also matured dramatically during this time. Fundamentally, this review builds on our earlier 2005 review. While a brief mention of seminal early work will be included, this current review will focus on new technological developments as well as technology commercialized in just the last decade. Evanescent wave biosensors have found a wide array of applications ranging from clinical diagnostics to biodefense to food testing; advances in those applications and more are described herein. PMID:26232145
Academic computer science and gender: A naturalistic study investigating the causes of attrition
NASA Astrophysics Data System (ADS)
Declue, Timothy Hall
Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before they enter college, but it is at the college level that the "brain drain" is the most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture which is hostile to females. Two unanticipated themes emerged, relating to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males that adversely affects their ability to succeed in CS-I.
NASA Astrophysics Data System (ADS)
Klimentov, A.; De, K.; Jha, S.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Wells, J.; Wenaus, T.
2016-10-01
The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the Grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single threaded workloads in parallel on LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and it has been in full production for the ATLAS experiment since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
Computer-Game Construction: A Gender-Neutral Attractor to Computing Science
ERIC Educational Resources Information Center
Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan
2010-01-01
Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…
Engagement as a Threshold Concept for Science Education and Science Communication
ERIC Educational Resources Information Center
McKinnon, Merryn; Vos, Judith
2015-01-01
Science communication and science education have the same overarching aim--to engage their audiences in science--and both disciplines face similar challenges in achieving this aim. Knowing how to effectively engage their "audiences" is fundamental to the success of both. Both disciplines have well-developed research fields identifying…
Wichchukit, Sukanya; O'Mahony, Michael
2010-01-01
This article reviews a beneficial effect of technology transfer from Electrical Engineering to Food Sensory Science. Specifically, it reviews the recent adoption in Food Sensory Science of the receiver operating characteristic (ROC) curve, a tool that is incorporated in the theory of signal detection. Its use allows the information processing that takes place in the brain during sensory difference testing to be studied and understood. The review deals with how Signal Detection Theory, also called Thurstonian modeling, led to the adoption of a more sophisticated way of analyzing the data from sensory difference tests, by introducing the signal-to-noise ratio, d', as a fundamental measure of perceived small sensory differences. Generally, the method of computation of d' is a simple matter for some of the better known difference tests like the triangle, duo-trio and 2-AFC. However, there are occasions when these tests are not appropriate and other tests like the same-different and the A Not-A test are more suitable. Yet, for these, it is necessary to understand how the brain processes information during the test before d' can be computed. It is for this task that the ROC curve has a particular use.
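As a concrete illustration of the Thurstonian quantities discussed above, the snippet below computes d' from hit and false-alarm proportions for a yes/no (A Not-A) test and from the proportion correct in a 2-AFC test, using the standard relations; the panel numbers are hypothetical and the calculation is generic, not taken from the reviewed article.

```python
from scipy.stats import norm

def d_prime_yes_no(hit_rate, false_alarm_rate):
    """d' for a yes/no (A Not-A) task: z(H) - z(F)."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

def d_prime_2afc(prop_correct):
    """d' for a 2-AFC task: sqrt(2) * z(Pc)."""
    return 2 ** 0.5 * norm.ppf(prop_correct)

# hypothetical panel data: 70% "A" responses to A samples, 30% to Not-A samples
print(round(d_prime_yes_no(0.70, 0.30), 3))   # ~1.049
print(round(d_prime_2afc(0.75), 3))           # ~0.954
```

Tracing out the hit rate against the false-alarm rate as the response criterion varies is exactly what produces the ROC curve discussed in the review.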
Electron Correlation in Oxygen Vacancy in SrTiO3
NASA Astrophysics Data System (ADS)
Lin, Chungwei; Demkov, Alexander A.
2014-03-01
Oxygen vacancies are an important type of defect in transition metal oxides. In SrTiO3 they are believed to be the main donors in an otherwise intrinsic crystal. At the same time, a relatively deep gap state associated with the vacancy is widely reported. To explain this inconsistency we investigate the effect of electron correlation in an oxygen vacancy (OV) in SrTiO3. When taking correlation into account, we find that the OV-induced localized level can at most trap one electron, while the second electron occupies the conduction band. Our results offer a natural explanation of how the OV in SrTiO3 can produce a deep in-gap level (about 1 eV below the conduction band bottom) in photoemission, and at the same time be an electron donor. Our analysis implies an OV in SrTiO3 should be fundamentally regarded as a magnetic impurity, whose deep level is always partially occupied due to the strong Coulomb repulsion. An OV-based Anderson impurity model is derived, and its implications are discussed. This work was supported by Scientific Discovery through Advanced Computing (SciDAC) program funded by U.S. Department of Energy, Office of Science, Advanced Scientific Computing Research and Basic Energy Sciences under award number DESC0008877.
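For reference, the single-impurity Anderson model invoked in the closing sentences has the generic textbook form below; identifying the impurity level with the vacancy state and the bath with the Ti-derived conduction band is the assumed mapping, and the OV-specific parameters derived in the paper are not reproduced here.

```latex
% Single-impurity Anderson Hamiltonian: localized level eps_d with on-site
% repulsion U, hybridized (V_k) with a conduction band of dispersion eps_k.
H = \sum_{\sigma} \epsilon_d\, n_{d\sigma}
  + U\, n_{d\uparrow} n_{d\downarrow}
  + \sum_{k\sigma} \epsilon_k\, c^{\dagger}_{k\sigma} c_{k\sigma}
  + \sum_{k\sigma} \bigl( V_k\, c^{\dagger}_{k\sigma} d_{\sigma} + \mathrm{h.c.} \bigr)
```

A large U relative to the hybridization width is what pins the occupancy of the localized level near one electron, consistent with the picture of the vacancy as a magnetic impurity.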
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
NASA Astrophysics Data System (ADS)
Klimentov, A.; Buncic, P.; De, K.; Jha, S.; Maeno, T.; Mount, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Porter, R. J.; Read, K. F.; Vaniachine, A.; Wells, J. C.; Wenaus, T.
2015-05-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, O(10^3) users, and ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. We will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter
2012-01-01
The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberto, J.; Diaz de la Rubia, T.; Gibala, R.
2006-10-01
The global utilization of nuclear energy has come a long way from its humble beginnings in the first sustained nuclear reaction at the University of Chicago in 1942. Today, there are over 440 nuclear reactors in 31 countries producing approximately 16% of the electrical energy used worldwide. In the United States, 104 nuclear reactors currently provide 19% of electrical energy used nationally. The International Atomic Energy Agency projects significant growth in the utilization of nuclear power over the next several decades due to increasing demand for energy and environmental concerns related to emissions from fossil plants. There are 28 new nuclear plants currently under construction including 10 in China, 8 in India, and 4 in Russia. In the United States, there have been notifications to the Nuclear Regulatory Commission of intentions to apply for combined construction and operating licenses for 27 new units over the next decade. The projected growth in nuclear power has focused increasing attention on issues related to the permanent disposal of nuclear waste, the proliferation of nuclear weapons technologies and materials, and the sustainability of a once-through nuclear fuel cycle. In addition, the effective utilization of nuclear power will require continued improvements in nuclear technology, particularly related to safety and efficiency. In all of these areas, the performance of materials and chemical processes under extreme conditions is a limiting factor. The related basic research challenges represent some of the most demanding tests of our fundamental understanding of materials science and chemistry, and they provide significant opportunities for advancing basic science with broad impacts for nuclear reactor materials, fuels, waste forms, and separations techniques. Of particular importance is the role that new nanoscale characterization and computational tools can play in addressing these challenges. These tools, which include DOE synchrotron X-ray sources, neutron sources, nanoscale science research centers, and supercomputers, offer the opportunity to transform and accelerate the fundamental materials and chemical sciences that underpin technology development for advanced nuclear energy systems. The fundamental challenge is to understand and control chemical and physical phenomena in multi-component systems from femto-seconds to millennia, at temperatures to 1000 °C, and for radiation doses to hundreds of displacements per atom (dpa). This is a scientific challenge of enormous proportions, with broad implications in the materials science and chemistry of complex systems. New understanding is required for microstructural evolution and phase stability under relevant chemical and physical conditions, chemistry and structural evolution at interfaces, chemical behavior of actinide and fission-product solutions, and nuclear and thermomechanical phenomena in fuels and waste forms. First-principles approaches are needed to describe f-electron systems, design molecules for separations, and explain materials failure mechanisms. Nanoscale synthesis and characterization methods are needed to understand and design materials and interfaces with radiation, temperature, and corrosion resistance. Dynamical measurements are required to understand fundamental physical and chemical phenomena. New multiscale approaches are needed to integrate this knowledge into accurate models of relevant phenomena and complex systems across multiple length and time scales.
DePaolo, Donald J. (Director, Center for Nanoscale Control of Geologic CO2); NCGC Staff
2017-12-09
'Carbon in Underland' was submitted by the Center for Nanoscale Control of Geologic CO2 (NCGC) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was selected as one of five winners by a distinguished panel of judges for its 'entertaining animation and engaging explanations of carbon sequestration'. NCGC, an EFRC directed by Donald J. DePaolo at Lawrence Berkeley National Laboratory, is a partnership of scientists from seven institutions: LBNL (lead), Massachusetts Institute of Technology, Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, University of California, Davis, Ohio State University, and Washington University in St. Louis. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Nanoscale Control of Geologic CO2 is 'to use new investigative tools, combined with experiments and computer simulations, to build a fundamental understanding of molecular-to-pore-scale processes in fluid-rock systems, and to demonstrate the ability to control critical aspects of flow, transport, and mineralization in porous rock media as applied to geologic sequestration of CO2'. Research topics are: bio-inspired, CO2 (store), greenhouse gas, and interfacial characterization.
A Fundamental Methodology for Designing Management Information Systems for Schools.
ERIC Educational Resources Information Center
Visscher, Adrie J.
Computer-assisted school information systems (SISs) are developed and used worldwide; however, the literature on strategies for their design and development is lacking. This paper presents the features of a fundamental approach to systems design that proved to be successful when developing SCHOLIS, a computer-assisted SIS for Dutch secondary…
Future of fundamental discovery in US biomedical research
Levitt, Michael; Levitt, Jonathan M.
2017-01-01
Young researchers are crucially important for basic science as they make unexpected, fundamental discoveries. Since 1982, we find a steady drop in the number of grant-eligible basic-science faculty [principal investigators (PIs)] younger than 46. This fall occurred over a 32-y period when inflation-corrected congressional funds for NIH almost tripled. During this time, the PI success ratio (fraction of basic-science PIs who are R01 grantees) dropped for younger PIs (below 46) and increased for older PIs (above 55). This age-related bias seems to have caused the steady drop in the number of young basic-science PIs and could reduce future US discoveries in fundamental biomedical science. The NIH recognized this bias in its 2008 early-stage investigator (ESI) policy to fund young PIs at higher rates. We show this policy is working and recommend that it be enhanced by using better data. Together with the National Institute of General Medical Sciences (NIGMS) Maximizing Investigators’ Research Award (MIRA) program to reward senior PIs with research time in exchange for less funding, this may reverse a decades-long trend of more money going to older PIs. To prepare young scientists for increased demand, additional resources should be devoted to transitional postdoctoral fellowships already offered by NIH. PMID:28584129
A Financial Technology Entrepreneurship Program for Computer Science Students
ERIC Educational Resources Information Center
Lawler, James P.; Joseph, Anthony
2011-01-01
Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…
Fairness in Knowing: Science Communication and Epistemic Justice.
Medvecky, Fabien
2017-09-22
Science communication, as a field and as a practice, is fundamentally about knowledge distribution; it is about the access to, and the sharing of knowledge. All distribution (science communication included) brings with it issues of ethics and justice. Indeed, whether science communicators acknowledge it or not, they get to decide both which knowledge is shared (by choosing which topic is communicated), and who gets access to this knowledge (by choosing which audience it is presented to). As a result, the decisions of science communicators have important implications for epistemic justice: how knowledge is distributed fairly and equitably. This paper presents an overview of issues related to epistemic justice for science communication, and argues that there are two quite distinct ways in which science communicators can be just (or unjust) in the way they distribute knowledge. Both of these paths will be considered before concluding that, at least on one of these accounts, science communication as a field and as a practice is fundamentally epistemically unjust. Possible ways to redress this injustice are suggested.
Citizen(s') Science. A Response to "The Future of Citizen Science"
ERIC Educational Resources Information Center
Calabrese Barton, Angela M.
2012-01-01
Citizen science is fundamentally about participation within and for communities. Attempts to merge citizen science with schooling must call not only for a democratization of schooling and science but also for the democratization of the ways in which science is taken up by, with, and for citizen participants. Using this stance, along with critical…
Talking Science: Language and Learning in Science Classrooms
ERIC Educational Resources Information Center
Roth, Wolff-Michael
2005-01-01
This book is about the fundamental nature of talk in school science. Language as a formal system provides resources for conducting everyday affairs, including the doing of science. While writing science is one aspect, talking science may in fact constitute a much more important means by which people navigate and know the world--the very medium…
ERIC Educational Resources Information Center
Menekse, Muhsin
2015-01-01
While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…
Analysis and logical modeling of biological signaling transduction networks
NASA Astrophysics Data System (ADS)
Sun, Zhongyao
The study of network theory and its applications spans a multitude of seemingly disparate fields of science and technology: computer science, biology, social science, linguistics, etc. It is the intrinsic similarities embedded in the entities and the way they interact with one another in these systems that link them together. In this dissertation, I present, from both the aspect of theoretical analysis and the aspect of application, three projects which primarily focus on signal transduction networks in biology. In these projects, I assembled a network model through extensively perusing the literature, performed model-based simulations and validation, analyzed network topology, and proposed a novel network measure. The application of network modeling to the system of stomatal opening in plants revealed a fundamental question about the process that had been left unanswered for decades. The novel measure of the redundancy of signal transduction networks with Boolean dynamics, computed as the maximum node-independent elementary signaling mode set, accurately predicts the effect of single-node knockouts in such signaling processes. The three projects as an organic whole advance the understanding of a real system as well as the behavior of such network models, giving me an opportunity to take a glimpse at the dazzling facets of the immense world of network science.
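The dissertation's actual models are not reproduced in this abstract; as a minimal sketch of the kind of Boolean signaling dynamics and node-knockout analysis described, the following Python fragment uses a small hypothetical network (the node names and update rules are invented purely for illustration):

# Hypothetical Boolean signaling network (not the dissertation's model):
# signal S activates kinase K; K activates response R unless inhibitor I is present.
rules = {
    "S": lambda s: s["S"],                  # external signal, held fixed
    "I": lambda s: s["I"],                  # inhibitor, held fixed
    "K": lambda s: s["S"],                  # K follows S
    "R": lambda s: s["K"] and not s["I"],   # R requires K and absence of I
}

def step(state, knockout=None):
    """One synchronous Boolean update; a knocked-out node is forced OFF."""
    nxt = {node: rule(state) for node, rule in rules.items()}
    if knockout is not None:
        nxt[knockout] = False
    return nxt

def steady_state(state, knockout=None, max_steps=20):
    """Iterate the update until a fixed point is reached (or return the last state)."""
    for _ in range(max_steps):
        nxt = step(state, knockout)
        if nxt == state:
            return state
        state = nxt
    return state

start = {"S": True, "I": False, "K": False, "R": False}
print("wild type :", steady_state(dict(start)))
print("K knockout:", steady_state(dict(start), knockout="K"))

In the wild-type run the response node switches on, while knocking out K keeps it off; this is the kind of single-node knockout effect a redundancy measure over elementary signaling modes is meant to predict.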
A conceptual framework to support exposure science research ...
While knowledge of exposure is fundamental to assessing and mitigating risks, exposure information has been costly and difficult to generate. Driven by major scientific advances in analytical methods, biomonitoring, computational tools, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition that allows it to be more agile, predictive, and data- and knowledge-driven. A necessary element of this evolved paradigm is an organizational and predictive framework for exposure science that furthers the application of systems-based approaches. To enable such systems-based approaches, we proposed the Aggregate Exposure Pathway (AEP) concept to organize data and information emerging from an invigorated and expanding field of exposure science. The AEP framework is a layered structure that describes the elements of an exposure pathway, as well as the relationships between those elements. The basic building blocks of an AEP adopt the naming conventions used for Adverse Outcome Pathways (AOPs): Key Events (KEs) to describe the measurable, obligate steps through the AEP, and Key Event Relationships (KERs) to describe the linkages between KEs. Importantly, the AEP offers an intuitive approach to organize exposure information from sources to internal site of action, setting the stage for predicting stressor concentrations at an internal target site. These predicted concentrations can help inform the r
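As a minimal sketch of the layered structure just described (the key events below are hypothetical examples, not taken from the paper), an AEP can be held as a small directed graph whose nodes are KEs and whose edges are KERs:

# Hypothetical Aggregate Exposure Pathway: KEs as nodes, KERs as directed edges.
aep = {
    "chemical in consumer product": ["indoor air concentration"],
    "indoor air concentration": ["inhaled dose"],
    "inhaled dose": ["internal concentration at target site"],
    "internal concentration at target site": [],
}

def downstream_events(key_event, graph):
    """Walk the KERs to list every key event reachable from a starting KE."""
    seen, stack = [], [key_event]
    while stack:
        node = stack.pop()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.append(nxt)
                stack.append(nxt)
    return seen

print(downstream_events("chemical in consumer product", aep))

Organizing exposure data this way is what allows source-level measurements to be propagated toward the internal target-site concentrations the abstract mentions.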
NASA Technical Reports Server (NTRS)
Edgerton, V. R.; Roy, R. R.; Hodgson, J. A.; Day, M. K.; Weiss, J.; Harkema, S. J.; Dobkin, B.; Garfinkel, A.; Konigsberg, E.; Koslovskaya, I.
2000-01-01
Space programs support experimental investigations related to the unique environment of space and to the technological developments from many disciplines of both science and engineering that contribute to space studies. Furthermore, interactions between scientists, engineers and administrators, that are necessary for the success of any science mission in space, promote interdiscipline communication, understanding and interests which extend well beyond a specific mission. NASA-catalyzed collaborations have benefited the spinal cord rehabilitation program at UCLA in fundamental science and in the application of expertise and technologies originally developed for the space program. Examples of these benefits include: (1) better understanding of the role of load in maintaining healthy muscle and motor function, resulting in a spinal cord injury (SCI) rehabilitation program based on muscle/limb loading; (2) investigation of a potentially novel growth factor affected by spaceflight which may help regulate muscle mass; (3) development of implantable sensors, electronics and software to monitor and analyze long-term muscle activity in unrestrained subjects; (4) development of hardware to assist therapies applied to SCI patients; and (5) development of computer models to simulate stepping which will be used to investigate the effects of neurological deficits (muscle weakness or inappropriate activation) and to evaluate therapies to correct these deficiencies.
NASA Astrophysics Data System (ADS)
Metaxa, M.
Basic education is fundamental to higher education and scientific and technological literacy. We can confront the widespread adult ignorance and apathy about science and technology. Astronomy, an interdisciplinary science, enhances students' interest and overcomes educational problems. Three years ago, we developed astronomy education in these ways: 1. Summer School for School Students. (50 students from Athens came to the first Summer School in Astrophysics at the National Observatory, September 2-5, 1996, for lectures by professional astronomers and to become familiar with observatory instruments.) 2. Introducing Students to Research. (This teaches students more about science so they are more confident about it. Our students have won top prizes in European research contests for their studies of objects on Schmidt plates and computations on PCs.) 3. Hands-on Activities. (Very important because they bring students close to their natural environment. Activities are: variable-star observations (AAVSO), the Eratosthenes project, and solar-eclipse, sunspot and comet studies.) 4. Contact with Professional Astronomers and Institutes. (These help students reach their social environment and motivate them as "science carriers". We try to make contacts at astronomical events and through visits to appropriate institutions.) 5. Internet Programs. (Students learn about and familiarize themselves with their technological environment.) 6. Laboratory Exercises. (Students should do science, not just learn about it. We introduced the following lab exercises: supernova remnants, galaxy classification, both from Schmidt plates, and the celestial sphere.)
Probability for Weather and Climate
NASA Astrophysics Data System (ADS)
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation, or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather forecasting resembles interpolation in state space, whereas climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, and indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and the fundamental differences between ensemble designs that support decision making and those that advance science, are noted. It is argued that, just as no point forecast is complete without an estimate of its accuracy, no model-based probability forecast is complete without an estimate of its own irrelevance. The same nonlinearities that made the electronic computer so valuable link the selection and assimilation of observations, the formation of ensembles, the evolution of models, the casting of model simulations back into observables, and the presentation of this information to those who use it to take action or to advance science. Timescales of interest exceed the lifetime of a climate model and the career of a climate scientist, disarming the trichotomy that led to swift advances in weather forecasting. Providing credible, informative climate services is a more difficult task. In this context, the value of comparing the forecasts of simulation models not only with each other but also with the performance of simple empirical models, whenever possible, is stressed. The credibility of meteorology is based on its ability to forecast and explain the weather. The credibility of climatology will always be based on flimsier stuff. Solid insights of climate science may be obscured if the severe limits on our ability to see the details of the future, even probabilistically, are not communicated clearly.
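The ensemble strategies mentioned above can be illustrated with a toy sketch (the model, perturbation size, and threshold below are arbitrary choices, not anything from the abstract): initial conditions of the Lorenz-63 system are perturbed, each member is integrated forward, and the fraction of members crossing a threshold is read off as a crude probability forecast.

import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations (a toy 'weather model')."""
    x, y, z = state
    dxdt = sigma * (y - x)
    dydt = x * (rho - z) - y
    dzdt = x * y - beta * z
    return state + dt * np.array([dxdt, dydt, dzdt])

rng = np.random.default_rng(0)
n_members, n_steps = 100, 500
analysis = np.array([1.0, 1.0, 20.0])

# Monte Carlo ensemble: perturb the initial condition to mimic observation uncertainty.
members = analysis + 0.1 * rng.standard_normal((n_members, 3))
for _ in range(n_steps):
    members = np.array([lorenz_step(m) for m in members])

# Crude probability forecast: fraction of members with x above a chosen threshold.
print("forecast P(x > 0) =", np.mean(members[:, 0] > 0.0))

Such an ensemble explores sensitivity to initial-condition error; as the abstract stresses, it says nothing by itself about structural model error or about the forecast's own irrelevance.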
Probing the structure of multi-center molecules with odd–even high harmonics
NASA Astrophysics Data System (ADS)
Su, Ning; Yu, Shujuan; Li, Weiyan; Yang, Shiping; Chen, Yanjun
2018-05-01
Abstract not available. Project supported by the National Natural Science Foundation of China (Grant No. 91750111), the Youth Foundation of Hebei Province Education Department, China (Grant No. QN2017028), the Fundamental Research Funds for Hebei GEO University, China (Grant No. BQ2017047), the Natural Science Foundation of Hebei Province, China (Grant No. A2015205161), and the Fundamental Research Funds for the Central Universities, China (Grant No. SNNU.GK201801009).
Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?
ERIC Educational Resources Information Center
Schrock, John Richard
1984-01-01
Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" related to laboratory computer and "extended…
NASA Astrophysics Data System (ADS)
Crouch, Catherine H.; Heller, Kenneth
2014-05-01
We describe restructuring the introductory physics for life science students (IPLS) course to better support these students in using physics to understand their chosen fields. Our courses teach physics using biologically rich contexts. Specifically, we use examples in which fundamental physics contributes significantly to understanding a biological system to make explicit the value of physics to the life sciences. This requires selecting the course content to reflect the topics most relevant to biology while maintaining the fundamental disciplinary structure of physics. In addition to stressing the importance of the fundamental principles of physics, an important goal is developing students' quantitative and problem solving skills. Our guiding pedagogical framework is the cognitive apprenticeship model, in which learning occurs most effectively when students can articulate why what they are learning matters to them. In this article, we describe our courses, summarize initial assessment data, and identify needs for future research.
Web-based description of the space radiation environment using the Bethe-Bloch model
NASA Astrophysics Data System (ADS)
Cazzola, Emanuele; Calders, Stijn; Lapenta, Giovanni
2016-01-01
Space weather is a rapidly growing area of research not only in scientific and engineering applications but also in physics education and in the interest of the public. We focus especially on space radiation and its impact on space exploration. The topic is highly interdisciplinary, bringing together fundamental concepts of nuclear physics with aspects of radiation protection and space science. We give a new approach to presenting the topic by developing a web-based application that combines some of the fundamental concepts from these two fields into a single tool that can be used in the context of advanced secondary or undergraduate university education. We present DREADCode, an outreach or teaching tool to rapidly assess the current conditions of the radiation field in space. DREADCode uses the available data feeds from a number of ongoing space missions (ACE, GOES-13, GOES-15) to produce a first-order approximation of the radiation dose an astronaut would receive during a mission of exploration in deep space (i.e. far from the Earth's shielding magnetic field and from the radiation belts). DREADCode is based on an easy-to-use GUI available online from the European Space Weather Portal (www.spaceweather.eu/dreadcode). The core of the radiation transport computation, which produces the radiation dose from the fluence of radiation observed by the spacecraft fleet considered, is based on a relatively simple approximation: the Bethe-Bloch equation. DREADCode also assumes a simplified geometry and material configuration for the shields used to compute the dose. The approach is approximate and sacrifices some important physics on the altar of rapid execution time, which allows a real-time operation scenario. There is no intention here to produce an operational tool for use in space science and engineering. Rather, we present an educational tool at undergraduate level that uses modern web-based and programming methods to learn some of the most important concepts in the application of radiation protection to space weather problems.
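The abstract names the Bethe-Bloch equation without stating it; in its standard form (which corrections DREADCode actually retains is not specified here), the mean stopping power of a particle of charge number z and velocity βc in a medium of atomic number Z and mass number A is

\[
-\left\langle \frac{dE}{dx} \right\rangle
  = K z^{2}\,\frac{Z}{A}\,\frac{1}{\beta^{2}}
    \left[ \tfrac{1}{2}\ln\!\frac{2 m_{e} c^{2} \beta^{2} \gamma^{2} T_{\max}}{I^{2}}
           - \beta^{2} - \frac{\delta(\beta\gamma)}{2} \right],
\qquad K = 4\pi N_{A} r_{e}^{2} m_{e} c^{2} \approx 0.307\ \mathrm{MeV\,mol^{-1}\,cm^{2}},
\]

where \(T_{\max}\) is the maximum kinetic energy transferable to a single electron, \(I\) is the mean excitation energy of the medium, and \(\delta\) is the density-effect correction. Integrating the stopping power of the observed particle fluence through a simplified shield geometry yields the kind of first-order dose estimate the tool reports.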
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-01
...: Notice. SUMMARY: The National Institute of Environmental Health Sciences (NIEHS), a research institute of... translation. Research translation fosters the movement of fundamental science toward a useable end-product. It... Innovation Promote transdisciplinary science. SRP firmly supports transdisciplinary research--the synthesis...
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April, 1986 through September 30, 1986 is summarized.
78 FR 10180 - Annual Computational Science Symposium; Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-13
...] Annual Computational Science Symposium; Conference AGENCY: Food and Drug Administration, HHS. ACTION... Computational Science Symposium.'' The purpose of the conference is to help the broader community align and share experiences to advance computational science. At the conference, which will bring together FDA...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hules, John
This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review across the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.
Chemistry and Materials Science progress report, FY 1994. Revision 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-01-01
Thrust areas of the weapons-supporting research include surface science, fundamentals of the physics and processing of metals, energetic materials, etc. The laboratory-directed R&D includes director's initiatives, individual projects, and transactinium science studies.
ERIC Educational Resources Information Center
Stern, Luli; Roseman, Jo Ellen
2004-01-01
The transfer of matter and energy from one organism to another and between organisms and their physical setting is a fundamental concept in life science. Not surprisingly, this concept is common to the "Benchmarks for Science Literacy" (American Association for the Advancement of Science, [1993]), the "National Science Education Standards"…
Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations
ERIC Educational Resources Information Center
Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa
2013-01-01
The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…
A Web of Resources for Introductory Computer Science.
ERIC Educational Resources Information Center
Rebelsky, Samuel A.
As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…
NASA Technical Reports Server (NTRS)
1988-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April l, 1988 through September 30, 1988.
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.
New Pathways for Teaching Chemistry: Reflective Judgment in Science.
ERIC Educational Resources Information Center
Finster, David C.
1992-01-01
The reflective judgment model offers a rich context for analysis of science and science teaching. It provides deeper understanding of the scientific process and its critical thinking and reveals fundamental connections between science and the other liberal arts. Classroom techniques from a college chemistry course illustrate the utility of the…
NASA Astrophysics Data System (ADS)
Washington, W. M.
2010-12-01
The development of climate and earth system models has been regarded primarily as the making of scientific tools to study the complex nature of the Earth's climate. These models have a long history, starting with very simple physical models based on fundamental physics in the 1960s; over time they have become much more complex, with atmospheric, ocean, sea ice, land/vegetation, biogeochemical, glacial and ecological components. In the 1960s and 1970s these models were not used as decision-making tools but to answer fundamental scientific questions, such as what happens when the atmospheric carbon dioxide concentration increases or is doubled. They gave insights into the various interactions and were extensively compared with observations. It was realized that models of those earlier periods could give only first-order answers to many of the fundamental policy questions. As societal concerns about climate change rose, the policy questions of anthropogenic climate change became better defined; they were mostly concerned with the climate impacts of increasing greenhouse gases, aerosols, and land cover change. In the late 1980s, the United Nations set up the Intergovernmental Panel on Climate Change to perform assessments of the published literature. Thus, the development of climate and Earth system models became intimately linked to the need not only to improve our scientific understanding but also to answer fundamental policy questions. In order to meet this challenge, the models became more complex and realistic so that they could address these policy-oriented science questions, such as rising sea level. The presentation will discuss the past and future development of global climate and earth system models for science and policy purposes. Their interactions with economic integrated assessment models and with regional and specialized models, such as river transport or ecological components, will also be discussed. As an example of one development pathway, the NSF/Department of Energy supported Community Climate System and Earth System Models will be featured in the presentation. Computational challenges will also be part of the discussion.
High school computer science education paves the way for higher education: the Israeli case
NASA Astrophysics Data System (ADS)
Armoni, Michal; Gal-Ezer, Judith
2014-07-01
The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to computer science in high school and pursuing computing in higher education. We also examine the gender gap, in the context of high school computer science education. We show that in Israel, students who took the high-level computer science matriculation exam were more likely to pursue computing in higher education. Regarding the issue of gender, we will show that, in general, in Israel the difference between males and females who take computer science in high school is relatively small, and a larger, though still not very large difference exists only for the highest exam level. In addition, exposing females to high-level computer science in high school has more relative impact on pursuing higher education in computing.
Gupta, Rajiv; Jones, Stephen E; Mooyaart, Eline A Q; Pomerantz, Stuart R
2006-06-01
The development of multidetector row computed tomography (MDCT) now permits visualization of the entire vascular tree that is relevant for the management of stroke within 15 seconds. Advances in MDCT have brought computed tomography angiography (CTA) to the frontline in evaluation of stroke. CTA is a rapid and noninvasive modality for evaluating the neurovasculature. This article describes the role of CTA in the management of stroke. Fundamentals of contrast delivery, common pathologic findings, artifacts, and pitfalls in CTA interpretation are discussed.
Defining Computational Thinking for Mathematics and Science Classrooms
NASA Astrophysics Data System (ADS)
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-02-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.
Metamaterial, plasmonic and nanophotonic devices
NASA Astrophysics Data System (ADS)
Monticone, Francesco; Alù, Andrea
2017-03-01
The field of metamaterials has opened landscapes of possibilities in basic science, and a paradigm shift in the way we think about and design emergent material properties. In many scenarios, metamaterial concepts have helped overcome long-held scientific challenges, such as the absence of optical magnetism and the limits imposed by diffraction in optical imaging. As the potential of metamaterials, as well as their limitations, become clearer, these advances in basic science have started to make an impact on several applications in different areas, with far-reaching implications for many scientific and engineering fields. At optical frequencies, the alliance of metamaterials with the fields of plasmonics and nanophotonics can further advance the possibility of controlling light propagation, radiation, localization and scattering in unprecedented ways. In this review article, we discuss the recent progress in the field of metamaterials, with particular focus on how fundamental advances in this field are enabling a new generation of metamaterial, plasmonic and nanophotonic devices. Relevant examples include optical nanocircuits and nanoantennas, invisibility cloaks, superscatterers and superabsorbers, metasurfaces for wavefront shaping and wave-based analog computing, as well as active, nonreciprocal and topological devices. Throughout the paper, we highlight the fundamental limitations and practical challenges associated with the realization of advanced functionalities, and we suggest potential directions to go beyond these limits. Over the next few years, as new scientific breakthroughs are translated into technological advances, the fields of metamaterials, plasmonics and nanophotonics are expected to have a broad impact on a variety of applications in areas of scientific, industrial and societal significance.
A Novel Interdisciplinary Approach to Socio-Technical Complexity
NASA Astrophysics Data System (ADS)
Bassetti, Chiara
The chapter presents a novel interdisciplinary approach that integrates micro-sociological analysis into computer-vision and pattern-recognition modeling and algorithms, the purpose being to tackle socio-technical complexity at a systemic yet micro-grounded level. The approach is empirically grounded and both theoretically and analytically driven, yet systemic and multidimensional, semi-supervised and computable, and oriented towards large-scale applications. The chapter describes the proposed approach with particular attention to its sociological foundations and to its application to the analysis of a particular setting, i.e. sport-spectator crowds. Crowds, better defined as large gatherings, are almost ever-present in our societies, and capturing their dynamics is crucial. From social sciences to public safety management and emergency response, modeling and predicting large gatherings' presence and dynamics, thus possibly preventing critical situations and being able to properly react to them, is fundamental. This is where semi-automated technologies can make the difference. The work presented in this chapter is intended as a scientific step towards such an objective.
DataView: a computational visualisation system for multidisciplinary design and analysis
NASA Astrophysics Data System (ADS)
Wang, Chengen
2016-01-01
Rapidly processing raw data and effectively extracting underlying information from huge volumes of multivariate data have become essential to all decision-making processes in sectors like finance, government, medical care, climate analysis, industries, science, etc. Remarkably, visualisation is recognised as a fundamental technology that supports human comprehension, cognition and utilisation of burgeoning amounts of heterogeneous data. This paper presents a computational visualisation system, named DataView, which has been developed for graphically displaying and capturing outcomes of the multiphysics problem-solvers widely used in engineering fields. DataView is functionally composed of techniques for table/diagram representation and for the graphical illustration of scalar, vector and tensor fields. The field visualisation techniques are implemented on the basis of a range of linear and non-linear meshes, which flexibly adapts to the disparate data representation schemas adopted by a variety of disciplinary problem-solvers. The visualisation system has been successfully applied to a number of engineering problems, of which some illustrations are presented to demonstrate the effectiveness of the visualisation techniques.
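DataView itself is not documented in this abstract; purely as an illustration of what rendering a scalar and a vector field on an unstructured mesh involves, the following Python/matplotlib sketch (all data invented) triangulates scattered sample points and draws filled contours with an arrow overlay:

import numpy as np
import matplotlib.pyplot as plt
import matplotlib.tri as mtri

# Invented solver output: a scalar field and a vector field sampled at scattered points.
rng = np.random.default_rng(1)
x, y = rng.random(200), rng.random(200)
scalar = np.sin(4 * np.pi * x) * np.cos(4 * np.pi * y)
u, v = np.cos(4 * np.pi * x), -np.sin(4 * np.pi * y)

mesh = mtri.Triangulation(x, y)              # Delaunay triangulation of the sample points
plt.tricontourf(mesh, scalar, levels=20)     # filled contours of the scalar field
plt.colorbar(label="scalar field (arbitrary units)")
plt.quiver(x, y, u, v, scale=30)             # overlay the vector field as arrows
plt.savefig("field_view.png")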
ERIC Educational Resources Information Center
Martins Gomes, Diogo; McCauley, Veronica
2016-01-01
Science literacy has become socially and economically very important. European countries stress that science graduates are fundamental for economic growth. Nevertheless, there is a declining student participation in science. In response, there has been a call to change the way science is taught in schools, which focuses on inquiry methods rooted…
ERIC Educational Resources Information Center
What Works Clearinghouse, 2012
2012-01-01
"Great Explorations in Math and Science[R] (GEMS[R]) Space Science" is an instructional sequence for grades 3-5 that covers fundamental concepts, including planetary sizes and distance, the Earth's shape and movement, gravity, and moon phases and eclipses. Part of the "GEMS"[R] core curriculum, "GEMS[R] Space Science"…
ERIC Educational Resources Information Center
van Aalderen-Smeets, Sandra; Walma van der Molen, Juliette
2013-01-01
In this article, we present a valid and reliable instrument which measures the attitude of in-service and pre-service primary teachers toward teaching science, called the Dimensions of Attitude Toward Science (DAS) Instrument. Attention to the attitudes of primary teachers toward teaching science is of fundamental importance to the…
The High School Biology Textbook: A Changing Mosaic of Gender, Science, and Purpose.
ERIC Educational Resources Information Center
Bianchini, Julie
How can science be made more meaningful to all students? This paper approaches this question through an analysis of gender. It begins with a brief exploration of the fundamental mismatch between women and science as described by statistics on the success, interest, and participation of women in science; feminist critiques of science; and studies…
ERIC Educational Resources Information Center
Niaz, Mansoor
2010-01-01
Kuhn (1970) considered textbooks to be good "pedagogical vehicles" for the perpetuation of "normal science". Collins (2000) has pointed out a fundamental contradiction with respect to what science could achieve (create new knowledge) and how we teach science (authoritarian). Despite the reform efforts, students still have naive views about the…
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Bernier, David
2011-01-01
Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…
NASA Technical Reports Server (NTRS)
1989-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.
Mapping the Materials Genome through Combinatorial Informatics
NASA Astrophysics Data System (ADS)
Rajan, Krishna
2012-02-01
The recently announced White House Materials Genome Initiative provides an exciting challenge to the materials science community. To meet that challenge one needs to address a critical question, namely: what is the materials genome? Some guidance on how to answer this question can be gained by recognizing that a "gene" is a carrier of information. In the biological sciences, discovering how to manipulate these genes has generated exciting discoveries in fundamental molecular biology as well as significant advances in biotechnology. Scaling that up to molecular and cellular length scales and beyond has spawned, from genomics, fields such as proteomics, metabolomics and, essentially, systems biology. The "omics" approach requires that one discover and track these "carriers of information" and then correlate that information to predict behavior. A similar challenge lies in materials science, where there is a diverse array of modalities of materials "discovery", ranging from new materials chemistries and molecular arrangements with novel properties to the development and design of new micro- and mesoscale structures. Hence, to meaningfully adapt the spirit of "genomics"-style research in materials science, we need to first identify and map the "genes" across different materials science applications. On the experimental side, combinatorial experiments have opened a new approach to generate data in a high-throughput manner, but without a clear way to link that data to models, its full value is not realized. Hence, along with experimental and computational materials science, we need to add a "third leg" to our toolkit to make the "Materials Genome" a reality: the science of Materials Informatics. In this presentation we provide an overview of how information science coupled to materials science can in fact achieve the goal of mapping the "Materials Genome".
Reconceptualizing the Nature of Science for Science Education
NASA Astrophysics Data System (ADS)
Dagher, Zoubeida R.; Erduran, Sibel
2016-03-01
Two fundamental questions about science are relevant for science educators: (a) What is the nature of science? and (b) what aspects of nature of science should be taught and learned? They are fundamental because they pertain to how science gets to be framed as a school subject and determines what aspects of it are worthy of inclusion in school science. This conceptual article re-examines extant notions of nature of science and proposes an expanded version of the Family Resemblance Approach (FRA), originally developed by Irzik and Nola (International handbook of research in history, philosophy and science teaching. Springer, Dordrecht, pp 999-1021, 2014) in which they view science as a cognitive-epistemic and as an institutional-social system. The conceptual basis of the expanded FRA is described and justified in this article based on a detailed account published elsewhere (Erduran and Dagher in Reconceptualizing the nature of science for science education: scientific knowledge, practices and other family categories. Springer, Dordrecht, 2014a). The expanded FRA provides a useful framework for organizing science curriculum and instruction and gives rise to generative visual tools that support the implementation of a richer understanding of and about science. The practical implications for this approach have been incorporated into analysis of curriculum policy documents, curriculum implementation resources, textbook analysis and teacher education settings.
MO-DE-209-03: Assessing Image Quality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, W.
Digital Breast Tomosynthesis (DBT) is rapidly replacing mammography as the standard of care in breast cancer screening and diagnosis. DBT is a form of computed tomography, in which a limited set of projection images is acquired over a small angular range and reconstructed into tomographic data. The angular range varies from 15° to 50° and the number of projections varies between 9 and 25 projections, as determined by the equipment manufacturer. It is equally valid to treat DBT as the digital analog of classical tomography – that is, linear tomography. In fact, the name “tomosynthesis” stands for “synthetic tomography.” DBT shares many common features with classical tomography, including the radiographic appearance, dose, and image quality considerations. As such, both the science and practical physics of DBT systems are a hybrid between computed tomography and classical tomographic methods. In this lecture, we will explore the continuum from radiography to computed tomography to illustrate the characteristics of DBT. This lecture will consist of four presentations that will provide a complete overview of DBT, including a review of the fundamentals of DBT acquisition, a discussion of DBT reconstruction methods, an overview of dosimetry for DBT systems, and a summary of the underlying image theory of DBT, thereby relating image quality and dose. Learning Objectives: To understand the fundamental principles behind tomosynthesis image acquisition. To understand the fundamentals of tomosynthesis image reconstruction. To learn the determinants of image quality and dose in DBT, including measurement techniques. To learn the image theory underlying tomosynthesis, and the relationship between dose and image quality. ADM is a consultant to, and holds stock in, Real Time Tomography, LLC. ADM receives research support from Hologic Inc., Analogic Inc., and Barco NV.; ADM is a member of the Scientific Advisory Board for Gamma Medica Inc.; A. Maidment, Research Support, Hologic, Inc.; Research Support, Barco, Inc.; Scientific Advisory Board, Gamma Medica, Inc.; Scientific Advisory Board, Real-Time Tomography, LLC.; Shareholder, Real-Time Tomography, LLC; J. Mainprize, Our lab has a research agreement with GE Healthcare on various topics in digital mammography and digital tomosynthesis; W. Zhao, Research grant from Siemens Health Care.
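To make the "synthetic tomography" idea concrete, here is a toy shift-and-add reconstruction sketch in Python; the geometry is simplified to a parallel-shift model and all values are invented, whereas clinical systems use filtered backprojection or iterative methods:

import numpy as np

def shift_and_add(projections, angles_deg, plane_height_mm, detector_pitch_mm=0.1):
    """Toy shift-and-add reconstruction of a single plane.

    Each 1-D projection is shifted by the parallax that its acquisition angle
    produces for structures at the chosen height above the detector, then the
    shifted projections are averaged: the chosen plane reinforces while
    structures at other heights blur out.
    """
    recon = np.zeros_like(projections[0], dtype=float)
    for proj, angle in zip(projections, angles_deg):
        shift_mm = plane_height_mm * np.tan(np.radians(angle))  # lateral parallax
        shift_px = int(round(shift_mm / detector_pitch_mm))
        recon += np.roll(proj, -shift_px)
    return recon / len(projections)

# 15 made-up projections over a +/- 25 degree sweep, each 256 detector pixels wide.
angles = np.linspace(-25.0, 25.0, 15)
projections = [np.random.default_rng(i).random(256) for i in range(15)]
plane_20mm = shift_and_add(projections, angles, plane_height_mm=20.0)
print(plane_20mm.shape)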
MO-DE-209-01: Primer On Tomosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maidment, A.
2016-06-15
MO-DE-209-04: Radiation Dosimetry in Breast Tomosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sechopoulos, I.
2016-06-15
MO-DE-209-02: Tomosynthesis Reconstruction Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mainprize, J.
2016-06-15
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, W. E.
2004-08-16
Computational Science plays a major role in research and development in mathematics, science, engineering, and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Laboratory, on-campus research involving the participation of undergraduates, participation of undergraduates and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one to a master's degree program and two to doctoral degree programs).
NASA Astrophysics Data System (ADS)
Tang, William M., Dr.
2006-01-01
The second annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held from June 25-29, 2006 at the new Hyatt Regency Hotel in Denver, Colorado. This conference showcased outstanding SciDAC-sponsored computational science results achieved during the past year across many scientific domains, with an emphasis on science at scale. Exciting computational science that has been accomplished outside of the SciDAC program both nationally and internationally was also featured to help foster communication between SciDAC computational scientists and those funded by other agencies. This was illustrated by many compelling examples of how domain scientists collaborated productively with applied mathematicians and computer scientists to effectively take advantage of terascale computers (capable of performing trillions of calculations per second) not only to accelerate progress in scientific discovery in a variety of fields but also to show great promise for being able to utilize the exciting petascale capabilities in the near future. The SciDAC program was originally conceived as an interdisciplinary computational science program based on the guiding principle that strong collaborative alliances between domain scientists, applied mathematicians, and computer scientists are vital to accelerated progress and associated discovery on the world's most challenging scientific problems. Associated verification and validation are essential in this successful program, which was funded by the US Department of Energy Office of Science (DOE OS) five years ago. As is made clear in many of the papers in these proceedings, SciDAC has fundamentally changed the way that computational science is now carried out in response to the exciting challenge of making the best use of the rapid progress in the emergence of more and more powerful computational platforms. In this regard, Dr. Raymond Orbach, Energy Undersecretary for Science at the DOE and Director of the OS has stated: `SciDAC has strengthened the role of high-end computing in furthering science. It is defining whole new fields for discovery.' (SciDAC Review, Spring 2006, p8). Application domains within the SciDAC 2006 conference agenda encompassed a broad range of science including: (i) the DOE core mission of energy research involving combustion studies relevant to fuel efficiency and pollution issues faced today and magnetic fusion investigations impacting prospects for future energy sources; (ii) fundamental explorations into the building blocks of matter, ranging from quantum chromodynamics - the basic theory that describes how quarks make up the protons and neutrons of all matter - to the design of modern high-energy accelerators; (iii) the formidable challenges of predicting and controlling the behavior of molecules in quantum chemistry and the complex biomolecules determining the evolution of biological systems; (iv) studies of exploding stars for insights into the nature of the universe; and (v) integrated climate modeling to enable realistic analysis of earth's changing climate. Associated research has made it quite clear that advanced computation is often the only means by which timely progress is feasible when dealing with these complex, multi-component physical, chemical, and biological systems operating over huge ranges of temporal and spatial scales. 
Working with the domain scientists, applied mathematicians and computer scientists have continued to develop the discretizations of the underlying equations and the complementary algorithms to enable improvements in solutions on modern parallel computing platforms as they evolve from the terascale toward the petascale regime. Moreover, the associated tremendous growth of data generated from the terabyte to the petabyte range demands not only the advanced data analysis and visualization methods to harvest the scientific information but also the development of efficient workflow strategies which can deal with the data input/output, management, movement, and storage challenges. If scientific discovery is expected to keep apace with the continuing progression from tera- to petascale platforms, the vital alliance between domain scientists, applied mathematicians, and computer scientists will be even more crucial. During the SciDAC 2006 Conference, some of the future challenges and opportunities in interdisciplinary computational science were emphasized in the Advanced Architectures Panel and by Dr. Victor Reis, Senior Advisor to the Secretary of Energy, who gave a featured presentation on `Simulation, Computation, and the Global Nuclear Energy Partnership.' Overall, the conference provided an excellent opportunity to highlight the rising importance of computational science in the scientific enterprise and to motivate future investment in this area. As Michael Strayer, SciDAC Program Director, has noted: `While SciDAC may have started out as a specific program, Scientific Discovery through Advanced Computing has become a powerful concept for addressing some of the biggest challenges facing our nation and our world.' Looking forward to next year, the SciDAC 2007 Conference will be held from June 24-28 at the Westin Copley Plaza in Boston, Massachusetts. Chairman: David Keyes, Columbia University. The Organizing Committee for the SciDAC 2006 Conference would like to acknowledge the individuals whose talents and efforts were essential to the success of the meeting. Special thanks go to Betsy Riley for her leadership in building the infrastructure support for the conference, for identifying and then obtaining contributions from our corporate sponsors, for coordinating all media communications, and for her efforts in organizing and preparing the conference proceedings for publication; to Tim Jones for handling the hotel scouting, subcontracts, and exhibits and stage production; to Angela Harris for handling supplies, shipping, and tracking, poster sessions set-up, and for her efforts in coordinating and scheduling the promotional activities that took place during the conference; to John Bui and John Smith for their superb wireless networking and A/V set-up and support; to Cindy Latham for Web site design, graphic design, and quality control of proceedings submissions; and to Pamelia Nixon-Hartje of Ambassador for budget and quality control of catering. We are grateful for the highly professional dedicated efforts of all of these individuals, who were the cornerstones of the SciDAC 2006 Conference. Thanks also go to Angela Beach of the ORNL Conference Center for her efforts in executing the contracts with the hotel, Carolyn James of Colorado State for on-site registration supervision, Lora Wolfe and Brittany Hagen for administrative support at ORNL, and Dami Rich and Andrew Sproles for graphic design and production. 
We are also most grateful to the Oak Ridge National Laboratory, especially Jeff Nichols, and to our corporate sponsors, Data Direct Networks, Cray, IBM, SGI, and Institute of Physics Publishing for their support. We especially express our gratitude to the featured speakers, invited oral speakers, invited poster presenters, session chairs, and advanced architecture panelists and chair for their excellent contributions on behalf of SciDAC 2006. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas, Margaret Smith, and the production team of Institute of Physics Publishing, who worked tirelessly to publish the final conference proceedings in a timely manner. Finally, heartfelt thanks are extended to Michael Strayer, Associate Director for OASCR and SciDAC Director, and to the DOE program managers associated with SciDAC for their continuing enthusiasm and strong support for the annual SciDAC Conferences as a special venue to showcase the exciting scientific discovery achievements enabled by the interdisciplinary collaborations championed by the SciDAC program.
NASA Astrophysics Data System (ADS)
Koch, Melissa; Gorges, Torie
2016-10-01
Underrepresented populations such as women, African-Americans, and Latinos/as often come to STEM (science, technology, engineering, and mathematics) careers by less traditional paths than White and Asian males. To better understand how and why women might shift toward STEM, particularly computer science, careers, we investigated the education and career direction of afterschool facilitators, primarily women of color in their twenties and thirties, who taught Build IT, an afterschool computer science curriculum for middle school girls. Many of these women indicated that implementing Build IT had influenced their own interest in technology and computer science and in some cases had resulted in their intent to pursue technology and computer science education. We wanted to explore the role that teaching Build IT may have played in activating or reactivating interest in careers in computer science and to see whether in the years following implementation of Build IT, these women pursued STEM education and/or careers. We reached nine facilitators who implemented the program in 2011-12 or shortly after. Many indicated that while facilitating Build IT, they learned along with the participants, increasing their interest in and confidence with technology and computer science. Seven of the nine participants pursued further STEM or computer science learning or modified their career paths to include more of a STEM or computer science focus. Through interviews, we explored what aspects of Build IT influenced these facilitators' interest and confidence in STEM and when relevant their pursuit of technology and computer science education and careers.
The NASA computer science research program plan
NASA Technical Reports Server (NTRS)
1983-01-01
A taxonomy of computer science is included, and the state of the art in each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R&D, Space R&T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.
On teaching computer ethics within a computer science department.
Quinn, Michael J
2006-04-01
The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.
George E. Pake Prize Lecture: Physical Sciences Research at IBM: Still at the Cutting Edge
NASA Astrophysics Data System (ADS)
Theis, Thomas
2015-03-01
The information technology revolution is in its "build out" phase. The foundational scientific insights and hardware inventions are now many decades old. The microelectronics industry is maturing. An increasing fraction of the total research investment is in software and services, as applications of information technology transform every business and every sector of the public and private economy. Yet IBM Research continues to make substantial investments in hardware technology and the underlying physical sciences. While some of this investment is aimed at extending the established transistor technology, an increasing fraction is aimed at longer-term and possibly disruptive research - new devices for computing, such as tunneling field-effect transistors and nanophotonic circuits, and new architectures, such as neurosynaptic systems and quantum computing. This research investment is a bet that the old foundations of information technology are ripe for reinvention. After all, today's information technology devices and systems operate far from any fundamental limits on speed and energy efficiency. But how can IBM make risky long-term research investments in an era of global competition, with financial markets focused on the short term? One important answer is partnerships. Since its early days, IBM Research has pursued innovation in information technology and innovation in the ways it conducts the business of research. By continuously evolving new models for research and development partnerships, it has extended its global reach, increased its impact on IBM's customers, and expanded the breadth and depth of its research project portfolio. Research in the physical sciences has often led the way. Currently on assignment to the Semiconductor Research Corporation.
Empirical Determination of Competence Areas to Computer Science Education
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter; Seitz, Cornelia
2014-01-01
The authors discuss empirically determined competence areas to K-12 computer science education, emphasizing the cognitive level of competence. The results of a questionnaire with 120 professors of computer science serve as a database. By using multi-dimensional scaling and cluster analysis, four competence areas to computer science education…
Factors Influencing Exemplary Science Teachers' Levels of Computer Use
ERIC Educational Resources Information Center
Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen
2011-01-01
The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…
Preparing Future Secondary Computer Science Educators
ERIC Educational Resources Information Center
Ajwa, Iyad
2007-01-01
Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…
NASA Astrophysics Data System (ADS)
Carlowicz, Michael
If you have a computer and a grasp of algebra, you can learn physics. That is one of the messages behind the release of Physics—The Root Science, a new full-text version of a physics textbook available at no cost on the World Wide Web. The interactive textbook is the work of the International Institute of Theoretical and Applied Physics (IITAP) at Iowa State University, which was established in 1993 as a partnership with the United Nations Educational, Scientific and Cultural Organization (UNESCO). With subject matter equivalent to that of a 400-page volume, the text is designed to be completed in one school year. The textbook also will eventually include video clips of experiments and interactive learning modules, as well as links to appropriate cross-references about fundamental principles of physics.
Quantification of network structural dissimilarities.
Schieber, Tiago A; Carpi, Laura; Díaz-Guilera, Albert; Pardalos, Panos M; Masoller, Cristina; Ravetti, Martín G
2017-01-09
Identifying and quantifying dissimilarities among graphs is a fundamental and challenging problem of practical importance in many fields of science. Current methods of network comparison either extract only partial information or are computationally very demanding. Here we propose an efficient and precise measure for network comparison, which is based on quantifying differences among the distance probability distributions extracted from the networks. Extensive experiments on synthetic and real-world networks show that this measure returns non-zero values only when the graphs are non-isomorphic. Most importantly, the measure proposed here can identify and quantify structural topological differences that have a practical impact on the information flow through the network, such as the presence or absence of critical links that connect or disconnect connected components.
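As a rough illustration of the distance-distribution idea (and only a simplified proxy for the published measure, which also weights node-level distance distributions and centrality terms), the sketch below compares two graphs by the Jensen-Shannon divergence between their shortest-path-length distributions; the function names are ours.

import networkx as nx
import numpy as np
from scipy.spatial.distance import jensenshannon

def distance_distribution(g, max_len):
    """Probability distribution of finite shortest-path lengths 1..max_len."""
    counts = np.zeros(max_len)
    for _, lengths in nx.shortest_path_length(g):
        for d in lengths.values():
            if 0 < d <= max_len:
                counts[d - 1] += 1
    total = counts.sum()
    return counts / total if total else counts

def graph_dissimilarity(g1, g2):
    max_len = max(len(g1), len(g2))        # no shortest path can be longer than n - 1
    p = distance_distribution(g1, max_len)
    q = distance_distribution(g2, max_len)
    return jensenshannon(p, q)             # zero only when the two distributions coincide

if __name__ == "__main__":
    ring, star = nx.cycle_graph(20), nx.star_graph(19)
    print(graph_dissimilarity(ring, ring))  # ~0.0
    print(graph_dissimilarity(ring, star))  # clearly positive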
Computational fluid dynamics - An introduction for engineers
NASA Astrophysics Data System (ADS)
Abbott, Michael Barry; Basco, David R.
An introduction to the fundamentals of CFD for engineers and physical scientists is presented. The principal definitions, basic ideas, and most common methods used in CFD are presented, and the application of these methods to the description of free surface, unsteady, and turbulent flow is shown. Emphasis is on the numerical treatment of incompressible unsteady fluid flow with primary applications to water problems using the finite difference method. While traditional areas of application like hydrology, hydraulic and coastal engineering and oceanography get the main emphasis, newer areas of application such as medical fluid dynamics, bioengineering, and soil physics and chemistry are also addressed. The possibilities and limitations of CFD are pointed out along with the relations of CFD to other branches of science.
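As a minimal taste of the finite-difference treatment of unsteady flow that the book builds up (a sketch only: the scheme, parameters, and periodic boundary are chosen for brevity and are not taken from the book's worked examples), the snippet below advances a 1-D advection-diffusion equation with first-order upwind advection and central diffusion.

import numpy as np

def step(u, c, nu, dx, dt):
    """One explicit step of u_t + c u_x = nu u_xx with periodic boundaries (c > 0)."""
    adv = -c * (u - np.roll(u, 1)) / dx                              # first-order upwind
    dif = nu * (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2    # central diffusion
    return u + dt * (adv + dif)

if __name__ == "__main__":
    n_cells, length = 200, 1.0
    dx = length / n_cells
    x = np.linspace(0.0, length, n_cells, endpoint=False)
    u = np.exp(-200.0 * (x - 0.3) ** 2)                # initial pulse
    c, nu = 1.0, 1.0e-3
    dt = 0.4 * min(dx / c, dx**2 / (2.0 * nu))         # respect advective and diffusive limits
    for _ in range(500):
        u = step(u, c, nu, dx, dt)
    print(float(u.max()))                              # pulse has advected and spread slightly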
Generation of multiphoton entangled quantum states by means of integrated frequency combs.
Reimer, Christian; Kues, Michael; Roztocki, Piotr; Wetzel, Benjamin; Grazioso, Fabio; Little, Brent E; Chu, Sai T; Johnston, Tudor; Bromberg, Yaron; Caspani, Lucia; Moss, David J; Morandotti, Roberto
2016-03-11
Complex optical photon states with entanglement shared among several modes are critical to improving our fundamental understanding of quantum mechanics and have applications for quantum information processing, imaging, and microscopy. We demonstrate that optical integrated Kerr frequency combs can be used to generate several bi- and multiphoton entangled qubits, with direct applications for quantum communication and computation. Our method is compatible with contemporary fiber and quantum memory infrastructures and with chip-scale semiconductor technology, enabling compact, low-cost, and scalable implementations. The exploitation of integrated Kerr frequency combs, with their ability to generate multiple, customizable, and complex quantum states, can provide a scalable, practical, and compact platform for quantum technologies. Copyright © 2016, American Association for the Advancement of Science.
Multifrequency AFM: from origins to convergence.
Santos, Sergio; Lai, Chia-Yun; Olukan, Tuza; Chiesa, Matteo
2017-04-20
Since the inception of the atomic force microscope (AFM) in 1986, influential papers have been presented by the community and tremendous advances have been reported. Being able to routinely image conductive and non-conductive surfaces in air, liquid and vacuum environments with nanoscale, and sometimes atomic, resolution, the AFM has long been perceived by many as the instrument to unlock the nanoscale. From exploiting a basic form of Hooke's law to interpret AFM data, to interpreting a seeming zoo of maps in the more advanced multifrequency methods, the field has, however, reached an inflection point. Here, we discuss this evolution, from the fundamental dilemmas that arose in the beginning to the exploitation of computer science, from machine learning to big data, hoping to guide the newcomer and inspire the experimenter.
Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools
NASA Astrophysics Data System (ADS)
Boe, Bryce A.
There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
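For readers unfamiliar with what static analysis of a block-based program can look like, the sketch below counts block opcodes in a Scratch project. It is not Hairball's API (Hairball targeted earlier Scratch formats), and it assumes the Scratch 3 .sb3 layout, i.e. a project.json with a "targets" list whose entries hold a "blocks" mapping; those field names and the file name are assumptions made for illustration.

import json
import zipfile
from collections import Counter

def count_opcodes(sb3_path):
    """Count how often each block opcode appears across all sprites and the stage."""
    with zipfile.ZipFile(sb3_path) as archive:
        project = json.loads(archive.read("project.json"))
    counts = Counter()
    for target in project.get("targets", []):
        for block in target.get("blocks", {}).values():
            if isinstance(block, dict) and "opcode" in block:   # skip bare variable entries
                counts[block["opcode"]] += 1
    return counts

if __name__ == "__main__":
    counts = count_opcodes("example_project.sb3")   # hypothetical file name
    print(counts.most_common(10))
    # A curriculum check might, for example, require at least one loop block:
    print("uses repetition:", any(op.startswith("control_repeat") for op in counts))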
The Concept of Ideology in Analysis of Fundamental Questions in Science Education
NASA Astrophysics Data System (ADS)
Säther, Jostein
The use of the concept of `ideology' in the interpretation of science education curricula, textbooks and various practices is reviewed, and examples are given by referring to Norwegian curricula and textbooks. The term is proposed to be used in a broad sense for any kind of action-oriented theory based on a system of ideas, or any attempt to approach politics in the light of a system of ideas. Politics in this context concerns the shaping of education and is related to forces (i.e., hypothetical impacts of idea systems) which may legitimise, change, or criticise social practices. The focus is (although not in every case) on the hidden, unconscious and critical aspects. The notion of `ideological aspects' is proposed to relate to metaphysical-ontological, epistemological and axiological claims and connotations. Examples of educational issues are mentioned concerning, e.g., aims, compartmentalisation, integration, and fundamentally different ideas about truth, learning and man. Searching for a single, unifying concept for discussing all of science education's fundamental questions seems, however, to be in vain. Therefore a wide range of concepts seems necessary to deepen our understanding of "the fundamental questions".
Programmers, professors, and parasites: credit and co-authorship in computer science.
Solomon, Justin
2009-12-01
This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.
Increasing Diversity in Computer Science: Acknowledging, yet Moving Beyond, Gender
NASA Astrophysics Data System (ADS)
Larsen, Elizabeth A.; Stubbs, Margaret L.
Lack of diversity within the computer science field has, thus far, been examined most fully through the lens of gender. This article is based on a follow-on to Margolis and Fisher's (2002) study and includes interviews with 33 Carnegie Mellon University students from the undergraduate senior class of 2002 in the School of Computer Science. We found evidence of similarities among the perceptions of these women and men on definitions of computer science, explanations for the notoriously low proportion of women in the field, characterizations of a typical computer science student, impressions of recent curricular changes, a sense of the atmosphere/culture in the program, views of the Women@SCS campus organization, and suggestions for attracting and retaining well-rounded students in computer science. We conclude that efforts to increase diversity in the computer science field will benefit from a more broad-based approach that considers, but is not limited to, notions of gender difference.
Democratizing Computer Science
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Ryoo, Jean J.
2015-01-01
Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…
ERIC Educational Resources Information Center
Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu
2013-01-01
With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…
Computational structural mechanics engine structures computational simulator
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1989-01-01
The Computational Structural Mechanics (CSM) program at Lewis encompasses: (1) fundamental aspects for formulating and solving structural mechanics problems, and (2) development of integrated software systems to computationally simulate the performance/durability/life of engine structures.
Primary Teachers' Attitudes toward Science: A New Theoretical Framework
ERIC Educational Resources Information Center
van Aalderen-Smeets, Sandra I.; Walma van der Molen, Juliette H.; Asma, Lieke J. F.
2012-01-01
Attention to the attitudes of preservice and inservice primary teachers toward science is of fundamental importance to research on primary science education. However, progress in this field of research has been slow due to the poor definition and conceptualization of the construct of primary teachers' attitude toward science. This poor theoretical…
Tourist-Centric Citizen Science in Denali National Park and Preserve
ERIC Educational Resources Information Center
Fischer, Heather A.
2017-01-01
Citizen Science programs create a bi-directional flow of knowledge between scientists and citizen volunteers; this flow democratizes science in order to create an informed public (Bonney et al. 2014; Brown, Kelly, and Whitall 2014). This democratization is a fundamental part of creating a science that can address today's pressing environmental,…
ERIC Educational Resources Information Center
Örnek, Funda; Turkey, Kocaeli
2014-01-01
Current approaches in Science Education attempt to enable students to develop an understanding of the nature of science, develop fundamental scientific concepts, and develop the ability to structure, analyze, reason, and communicate effectively. Students pose, solve, and interpret scientific problems, and eventually set goals and regulate their…
Proceedings of the Workshop on the Scientific Applications of Clocks in Space
NASA Technical Reports Server (NTRS)
Maleki, Lute (Editor)
1997-01-01
The Workshop on Scientific Applications of Clocks in Space was held to bring together scientists and technologists interested in applications of ultrastable clocks for tests of fundamental theories and for other science investigations. Time and frequency are the most precisely determined of all physical parameters, and thus are the required tools for performing the most sensitive tests of physical theories. Space affords the opportunity to measure parameters inaccessible on Earth, and enables some of the most original and sensitive tests of fundamental theories. In the past few years, new developments in clock technologies have pointed to the opportunity for flying ultrastable clocks in support of science investigations of space missions. This development coincides with the new NASA paradigm for space flights, which relies on frequent, low-cost missions in place of the traditional infrequent and high-cost missions. The heightened interest in clocks in space is further advanced by new theoretical developments in various fields. For example, recent developments in certain Grand Unified Theory formalisms have vastly increased interest in fundamental tests of gravitation physics with clocks. The workshop included sessions on all related science, including relativity and gravitational physics, cosmology, orbital dynamics, radio science, geodynamics, and GPS science, as well as a session on advanced clock technology.
Computer Science and the Liberal Arts
ERIC Educational Resources Information Center
Shannon, Christine
2010-01-01
Computer science and the liberal arts have much to offer each other. Yet liberal arts colleges, in particular, have been slow to recognize the opportunity that the study of computer science provides for achieving the goals of a liberal education. After the precipitous drop in computer science enrollments during the first decade of this century,…
Marrying Content and Process in Computer Science Education
ERIC Educational Resources Information Center
Zendler, A.; Spannagel, C.; Klaudt, D.
2011-01-01
Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…
ERIC Educational Resources Information Center
Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.
2016-01-01
Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…
Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University
ERIC Educational Resources Information Center
Plane, Jandelyn
2010-01-01
This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…
Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.
ERIC Educational Resources Information Center
Turner, Judith Axler
1987-01-01
Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)
African-American males in computer science---Examining the pipeline for clogs
NASA Astrophysics Data System (ADS)
Stone, Daryl Bryant
The literature on African-American males (AAM) begins with a statement to the effect that "Today young Black men are more likely to be killed or sent to prison than to graduate from college." Why are the numbers of African-American male college graduates decreasing? Why are those enrolled in college not majoring in the science, technology, engineering, and mathematics (STEM) disciplines? This research explored why African-American males are not filling the well-recognized industry need for computer scientists and technologists by choosing college tracks to these careers. The literature on STEM disciplines focuses largely on women in STEM, as opposed to minorities, and within minorities, there is a noticeable research gap in addressing the needs and opportunities available to African-American males. The primary goal of this study was therefore to examine the computer science "pipeline" from the African-American male perspective. The method included distributing a "Computer Science Degree Self-Efficacy Scale" to five groups of African-American male students: (1) fourth graders, (2) eighth graders, (3) eleventh graders, (4) underclass undergraduate computer science majors, and (5) upperclass undergraduate computer science majors. In addition to a 30-question self-efficacy test, subjects from each group were asked to participate in a group discussion about "African-American males in computer science." The audio record of each group meeting provides qualitative data for the study. The hypotheses include the following: (1) There is no significant difference in "Computer Science Degree" self-efficacy between fourth and eighth graders. (2) There is no significant difference in "Computer Science Degree" self-efficacy between eighth and eleventh graders. (3) There is no significant difference in "Computer Science Degree" self-efficacy between eleventh graders and lower-level computer science majors. (4) There is no significant difference in "Computer Science Degree" self-efficacy between lower-level computer science majors and upper-level computer science majors. (5) There is no significant difference in "Computer Science Degree" self-efficacy between each of the five groups of students. Finally, the researcher selected African-American male students attending six primary schools, including the predominantly African-American elementary, middle and high school that the researcher attended during his own academic career. Additionally, a racially mixed elementary, middle and high school was selected from the same county in Maryland. Bowie State University provided both the underclass and upperclass computer science majors surveyed in this study. Of the five hypotheses, the sample provided enough evidence to support the claim that there are significant differences in "Computer Science Degree" self-efficacy between each of the five groups of students. ANOVA analyses by question and by total self-efficacy score provided further statistically significant results. Additionally, factor analysis and review of the qualitative data provided more insightful results. Overall, the data suggest that 'a clog' may exist at the middle school level and that students attending racially mixed schools were more confident in their computer, math and science skills. African-American males admit to spending lots of time on social networking websites and emailing, but are unaware of the skills and knowledge needed to study in the computing disciplines.
The majority of the subjects knew few, if any, AAMs in the 'computing discipline pipeline'. The collegiate African-American males in this study agreed that computer programming is a difficult area and serves as a 'major clog in the pipeline'.
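The group comparisons described above are one-way ANOVAs across the five groups; purely as an illustration of that analysis step (the scores below are simulated, not the study's data, and the group labels simply mirror the abstract), a minimal version might look like this:

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
groups = {
    "grade 4": rng.normal(100, 15, 40),
    "grade 8": rng.normal(95, 15, 40),
    "grade 11": rng.normal(98, 15, 40),
    "CS underclass": rng.normal(110, 15, 30),
    "CS upperclass": rng.normal(115, 15, 30),
}

f_stat, p_value = stats.f_oneway(*groups.values())   # one-way ANOVA across the five groups
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A significant result would typically be followed by pairwise comparisons (e.g., Tukey HSD)
# to locate which adjacent groups differ, mirroring the study's five hypotheses.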
The Promise of Exposure Science
Exposure science is the bedrock for protection of public health. It fundamentally informs decisions that relate to smart and sustainable design, prevention and mitigation of adverse exposures, and ultimately health protection.
Reconfigurability in MDO Problem Synthesis. Part 1
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia M.; Lewis, Robert Michael
2004-01-01
Integrating autonomous disciplines into a problem amenable to solution presents a major challenge in realistic multidisciplinary design optimization (MDO). We propose a linguistic approach to MDO problem description, formulation, and solution we call reconfigurable multidisciplinary synthesis (REMS). With assistance from computer science techniques, REMS comprises an abstract language and a collection of processes that provide a means for dynamic reasoning about MDO problems in a range of contexts. The approach may be summarized as follows. Description of disciplinary data according to the rules of a grammar, followed by lexical analysis and compilation, yields basic computational components that can be assembled into various MDO problem formulations and solution algorithms, including hybrid strategies, with relative ease. The ability to re-use the computational components is due to the special structure of the MDO problem. The range of contexts for reasoning about MDO spans tasks from error checking and derivative computation to formulation and reformulation of optimization problem statements. In highly structured contexts, reconfigurability can mean a straightforward transformation among problem formulations with a single operation. We hope that REMS will enable experimentation with a variety of problem formulations in research environments, assist in the assembly of MDO test problems, and serve as a pre-processor in computational frameworks in production environments. This paper, Part 1 of two companion papers, discusses the fundamentals of REMS. Part 2 illustrates the methodology in more detail.
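The abstract describes compiling disciplinary descriptions into computational components that can be reassembled into different problem formulations. The toy sketch below is not REMS and uses invented names; it only illustrates the flavor of reasoning such components enable, here a closure check that every disciplinary input is supplied by some output or design variable.

from dataclasses import dataclass

@dataclass
class Discipline:
    name: str
    inputs: set
    outputs: set

def unresolved_inputs(disciplines, design_variables):
    """Return the inputs that no discipline output or design variable provides."""
    produced = set(design_variables)
    for d in disciplines:
        produced |= d.outputs
    needed = set()
    for d in disciplines:
        needed |= d.inputs
    return needed - produced

if __name__ == "__main__":
    aero = Discipline("aero", inputs={"shape", "structural_deflection"}, outputs={"lift", "drag"})
    struct = Discipline("structures", inputs={"shape", "lift"}, outputs={"structural_deflection", "mass"})
    print(unresolved_inputs([aero, struct], design_variables={"shape"}))   # empty set: problem is closed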
Reconfigurability in MDO Problem Synthesis. Part 2
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia M.; Lewis, Robert Michael
2004-01-01
Integrating autonomous disciplines into a problem amenable to solution presents a major challenge in realistic multidisciplinary design optimization (MDO). We propose a linguistic approach to MDO problem description, formulation, and solution we call reconfigurable multidisciplinary synthesis (REMS). With assistance from computer science techniques, REMS comprises an abstract language and a collection of processes that provide a means for dynamic reasoning about MDO problems in a range of contexts. The approach may be summarized as follows. Description of disciplinary data according to the rules of a grammar, followed by lexical analysis and compilation, yields basic computational components that can be assembled into various MDO problem formulations and solution algorithms, including hybrid strategies, with relative ease. The ability to re-use the computational components is due to the special structure of the MDO problem. The range of contexts for reasoning about MDO spans tasks from error checking and derivative computation to formulation and reformulation of optimization problem statements. In highly structured contexts, reconfigurability can mean a straightforward transformation among problem formulations with a single operation. We hope that REMS will enable experimentation with a variety of problem formulations in research environments, assist in the assembly of MDO test problems, and serve as a pre-processor in computational frameworks in production environments. Part 1 of these two companion papers discusses the fundamentals of REMS. This paper, Part 2, illustrates the methodology in more detail.
Finite Dimensional Approximations for Continuum Multiscale Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berlyand, Leonid
2017-01-24
The completed research project concerns the development of novel computational techniques for modeling nonlinear multiscale physical and biological phenomena. Specifically, it addresses the theoretical development and applications of the homogenization theory (coarse graining) approach to calculating the effective properties of highly heterogeneous biological and bio-inspired materials with many spatial scales and nonlinear behavior. This theory studies properties of strongly heterogeneous media in problems arising in materials science, geoscience, biology, etc. Modeling of such media raises fundamental mathematical questions, primarily in partial differential equations (PDEs) and the calculus of variations, the subject of the PI's research. The focus of the completed research was on mathematical models of biological and bio-inspired materials with the common theme of multiscale analysis and coarse-grained computational techniques. Biological and bio-inspired materials offer the unique ability to create environmentally clean functional materials used for energy conversion and storage. These materials are intrinsically complex, with hierarchical organization occurring on many nested length and time scales. The potential to rationally design and tailor the properties of these materials for broad energy applications has been hampered by the lack of computational techniques able to bridge from the molecular to the macroscopic scale. The project addressed the challenge of computational treatment of such complex materials by developing a synergistic approach that combines innovative multiscale modeling/analysis techniques with high-performance computing.
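For readers new to homogenization, the simplest (linear, periodic) instance of computing "effective properties" has the textbook form below; this is only the standard illustrative case, whereas the project itself treats nonlinear, biological settings where the analysis is substantially harder.

\[
-\nabla\cdot\bigl(A(x/\varepsilon)\,\nabla u_\varepsilon\bigr) = f \ \text{in } \Omega,
\qquad
-\nabla\cdot\bigl(A^{*}\,\nabla u_0\bigr) = f,
\]
\[
A^{*} e_j = \int_{Y} A(y)\bigl(e_j + \nabla_y \chi_j(y)\bigr)\,dy,
\qquad
-\nabla_y\cdot\Bigl(A(y)\bigl(e_j + \nabla_y \chi_j\bigr)\Bigr) = 0 \ \text{in the unit cell } Y,
\]
where $u_\varepsilon \to u_0$ as $\varepsilon \to 0$ and the correctors $\chi_j$ are $Y$-periodic.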
Girls in computer science: A female only introduction class in high school
NASA Astrophysics Data System (ADS)
Drobnis, Ann W.
This study examined the impact of an all girls' classroom environment in a high school introductory computer science class on the student's attitudes towards computer science and their thoughts on future involvement with computer science. It was determined that an all girls' introductory class could impact the declining female enrollment and female students' efficacy towards computer science. This research was conducted in a summer school program through a regional magnet school for science and technology which these students attend during the school year. Three different groupings of students were examined for the research: female students in an all girls' class, female students in mixed-gender classes and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to obtain an understanding of the students' thoughts, preconceptions, attitude, knowledge of computer science, and future intentions around computer science, both in education and career. Students in all three groups were administered the ACCS prior to taking the class and upon completion of the class. In addition, students in the all girls' class wrote in a journal throughout the course, and some of those students were also interviewed upon completion of the course. The data was analyzed using quantitative and qualitative techniques. While there were no major differences found in the quantitative data, it was determined that girls in the all girls' class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.
Michael J. Furniss; Catherine F. Clifton; Kathryn L. Ronnenberg
2007-01-01
This conference was attended by nearly 450 Forest Service earth scientists representing hydrology, soil science, geology, and air. In addition to active members of the earth science professions, many retired scientists also attended and participated. These 60 peer-reviewed papers represent a wide spectrum of earth science investigation, experience, research, and...
Bringing computational science to the public.
McDonagh, James L; Barker, Daniel; Alderson, Rosanna G
2016-01-01
The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.
NASA Astrophysics Data System (ADS)
Troy, R. M.
2005-12-01
With ever-increasing amounts of Earth-Science funding being diverted to the war in Iraq, the Earth-Science community must now more than ever wring every bit of utility out of every dollar. We're not likely to get funded any projects perceived by others as "pie in the sky", so we have to look at already-funded programs within our community and direct new programs in a unifying direction. We have not yet begun the transition to a computationally unifying, general-purpose Earth Science computing paradigm, though it was proposed at the Fall 2002 AGU meeting in San Francisco, and perhaps earlier. Encouragingly, we do see a recognition that more commonality is needed, as various projects have as funded goals the addition of the processing and dissemination of new datatypes, or data-sets, if you prefer, to their existing repertoires. Unfortunately, the timelines projected for adding a datatype to an existing system are typically estimated at around two years each. Further, many organizations have the perception that they can only use their dollars to support exclusively their own needs, as they don't have the money to support the goals of others, thus overlooking opportunities to satisfy their own needs while at the same time aiding the creation of a global GeoScience cyber-infrastructure. While Computational Unification appears to be an unfunded, impossible dream, at least for now, individual projects can take steps that are compatible with a unified community and can help build one over time. This session explores these opportunities. The author will discuss the issues surrounding this topic, outlining alternative perspectives on the points of difficulty and proposing straightforward solutions that every Earth Science data processing system should consider. Sub-topics include distributed meta-data, distributed processing, distributed data objects, interdisciplinary concerns, and scientific defensibility, with an overall emphasis on how previously written processes and functions may be integrated into a system efficiently, with minimal effort, and with an eye toward an eventual Computational Unification of the Earth Sciences. Fundamental to such systems is meta-data, which describes not only the content of data but also how intricate relationships are represented and used to good advantage. Retrieval techniques will be discussed, including trade-offs between externally managed meta-data and embedded meta-data, how the two may be integrated, and how "simplifying assumptions" may or may not actually be helpful. The perspectives presented in this talk or poster session are based upon the experience of the Sequoia 2000 and BigSur research projects at the University of California, Berkeley, which sought to unify NASA's Mission To Planet Earth's EOS-DIS, and on-going experience developed by Science Tools corporation, of which the author is a principal. NOTE: These ideas are most easily shared in the form of a talk, and we suspect that this session will generate a lot of interest. We would therefore prefer to have this session accepted as a talk as opposed to a poster session.
Computer Science and Telecommunications Board summary of activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blumenthal, M.S.
1992-03-27
The Computer Science and Telecommunications Board (CSTB) considers technical and policy issues pertaining to computer science, telecommunications, and associated technologies. CSTB actively disseminates the results of its completed projects to those in a position to help implement their recommendations or otherwise use their insights. It provides a forum for the exchange of information on computer science, computing technology, and telecommunications. This report discusses the major accomplishments of CSTB.
Hispanic women overcoming deterrents to computer science: A phenomenological study
NASA Astrophysics Data System (ADS)
Herling, Lourdes
The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage of qualified graduates in computer science. The proportion of women and minorities in computer science is significantly lower than the proportion of the U.S. population they represent. Overall enrollment in computer science programs has continued to decline, with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines, what attracts women and minorities to computer science, and addressing the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender, as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determined whether being subjected to multiple marginalizations, female and Hispanic, played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field but also their persistence. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to the field. The aptitudes participants commonly believed are needed for success in computer science are the Twenty-First Century skills of problem solving, creativity, and critical thinking. While not all the participants had experience with computers or programming prior to attending college, experience played a role in the self-confidence of those who did.
Research in applied mathematics, numerical analysis, and computer science
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.
Science-Driven Computing: NERSC's Plan for 2006-2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Horst D.; Kramer, William T.C.; Bailey, David H.
NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.
ERIC Educational Resources Information Center
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-01-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…
Rigorous derivation of porous-media phase-field equations
NASA Astrophysics Data System (ADS)
Schmuck, Markus; Kalliadasis, Serafim
2017-11-01
The evolution of interfaces in Complex heterogeneous Multiphase Systems (CheMSs) plays a fundamental role in a wide range of scientific fields, such as thermodynamic modelling of phase transitions and materials science, and serves as a computational tool for interfacial flow studies and material design. Here, we focus on phase-field equations in CheMSs such as porous media. To the best of our knowledge, we present the first rigorous derivation of error estimates for fourth-order, upscaled, and nonlinear evolution equations. For CheMSs with heterogeneity ε, we obtain the convergence rate ε^(1/4), which governs the error between the solution of the new upscaled formulation and the solution of the microscopic phase-field problem. This error behaviour has recently been validated computationally. Due to the wide range of applications of phase-field equations, we expect this upscaled formulation to allow for new modelling, analytic, and computational perspectives for interfacial transport and phase transformations in CheMSs. This work was supported by EPSRC, UK, through Grant Nos. EP/H034587/1, EP/L027186/1, EP/L025159/1, EP/L020564/1, EP/K008595/1, and EP/P011713/1 and from ERC via Advanced Grant No. 247031.
Efficient computation of optimal actions.
Todorov, Emanuel
2009-07-14
Optimal choice of actions is a fundamental problem relevant to fields as diverse as neuroscience, psychology, economics, computer science, and control engineering. Despite this broad relevance the abstract setting is similar: we have an agent choosing actions over time, an uncertain dynamical system whose state is affected by those actions, and a performance criterion that the agent seeks to optimize. Solving problems of this kind remains hard, in part, because of overly generic formulations. Here, we propose a more structured formulation that greatly simplifies the construction of optimal control laws in both discrete and continuous domains. An exhaustive search over actions is avoided and the problem becomes linear. This yields algorithms that outperform Dynamic Programming and Reinforcement Learning, and thereby solve traditional problems more efficiently. Our framework also enables computations that were not possible before: composing optimal control laws by mixing primitives, applying deterministic methods to stochastic systems, quantifying the benefits of error tolerance, and inferring goals from behavioral data via convex optimization. Development of a general class of easily solvable problems tends to accelerate progress--as linear systems theory has done, for example. Our framework may have similar impact in fields where optimal choice of actions is relevant.
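The claim that "the problem becomes linear" refers to Todorov's linearly-solvable formulation, in which the exponentiated value function (the desirability z = exp(-v)) satisfies a linear Bellman equation, so no exhaustive search over actions is needed. A minimal sketch for a discrete first-exit problem, assuming passive transition probabilities P and state costs q; the function name and data layout are illustrative rather than taken from the paper:

```python
import numpy as np

def solve_lmdp(P, q, terminal):
    """Solve a first-exit linearly-solvable MDP (illustrative sketch).

    P        : (n, n) passive transition probabilities p(x'|x)
    q        : (n,) per-state costs
    terminal : (n,) boolean mask of absorbing states

    For the desirability z = exp(-v), the Bellman equation is linear:
        z[i] = exp(-q[i]) * sum_j P[i, j] * z[j]   for non-terminal i
        z[i] = exp(-q[i])                          for terminal i
    """
    n = len(q)
    z = np.exp(-q).astype(float)                  # terminal values; interior overwritten below
    M = np.diag(np.exp(-q)) @ P                   # combined cost/transition operator
    interior = ~terminal

    # Restrict the linear system (I - M) z = 0 to interior states,
    # moving the known terminal desirabilities to the right-hand side.
    A = np.eye(n)[np.ix_(interior, interior)] - M[np.ix_(interior, interior)]
    b = M[np.ix_(interior, terminal)] @ z[terminal]
    z[interior] = np.linalg.solve(A, b)

    return z, -np.log(z)                          # desirability and value function
```

The optimal controlled dynamics then weight the passive dynamics by the desirability of the successor state, u*(x'|x) proportional to p(x'|x) z(x'), which is why the exhaustive search over actions mentioned in the abstract can be avoided.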
Multilevel UQ strategies for large-scale multiphysics applications: PSAAP II solar receiver
NASA Astrophysics Data System (ADS)
Jofre, Lluis; Geraci, Gianluca; Iaccarino, Gianluca
2017-06-01
Uncertainty quantification (UQ) plays a fundamental part in building confidence in predictive science. Of particular interest is the case of modeling and simulating engineering applications where, due to the inherent complexity, many uncertainties naturally arise, e.g. domain geometry, operating conditions, errors induced by modeling assumptions, etc. In this regard, one of the pacing items, especially in high-fidelity computational fluid dynamics (CFD) simulations, is the large amount of computing resources typically required to propagate incertitude through the models. Upcoming exascale supercomputers will significantly increase the available computational power. However, UQ approaches cannot entrust their applicability only on brute force Monte Carlo (MC) sampling; the large number of uncertainty sources and the presence of nonlinearities in the solution will make straightforward MC analysis unaffordable. Therefore, this work explores the multilevel MC strategy, and its extension to multi-fidelity and time convergence, to accelerate the estimation of the effect of uncertainties. The approach is described in detail, and its performance demonstrated on a radiated turbulent particle-laden flow case relevant to solar energy receivers (PSAAP II: Particle-laden turbulence in a radiation environment). Investigation funded by DoE's NNSA under PSAAP II.
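The multilevel Monte Carlo strategy referred to above rests on the telescoping identity E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_{l-1}], which lets most samples be drawn from cheap low-fidelity models while only a few expensive high-fidelity corrections are computed. A generic sketch of that estimator follows; it is not the PSAAP II implementation, and the sampler interface and toy model are assumptions made purely for illustration:

```python
import numpy as np

def mlmc_estimate(sample_level, n_samples):
    """Generic multilevel Monte Carlo estimator (illustrative sketch).

    sample_level(l, n) : draws n correlated pairs (Q_l, Q_{l-1}) using the
                         same random inputs at levels l and l-1; for l == 0
                         the "coarse" values are zeros.
    n_samples          : samples to draw on each level, coarse to fine.
    """
    estimate = 0.0
    for level, n in enumerate(n_samples):
        fine, coarse = sample_level(level, n)
        estimate += np.mean(np.asarray(fine) - np.asarray(coarse))  # E[Q_l - Q_{l-1}]
    return estimate

# Toy sampler: a quantity of interest whose model error shrinks with level.
def sample_level(level, n):
    rng = np.random.default_rng(seed=level)
    x = rng.normal(size=n)                                   # shared random inputs
    fine = x**2 + rng.normal(scale=2.0**(-level), size=n)
    if level == 0:
        coarse = np.zeros(n)
    else:
        coarse = x**2 + rng.normal(scale=2.0**(-(level - 1)), size=n)
    return fine, coarse

# Many cheap coarse samples, few expensive fine ones.
print(mlmc_estimate(sample_level, n_samples=[4000, 400, 40]))
```

The extensions to multi-fidelity and time convergence mentioned in the abstract follow the same pattern, with levels defined by model fidelity or time-step refinement rather than mesh resolution alone.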
NASA Astrophysics Data System (ADS)
Yang, Zhi-wei; Hao, Dong-xiao; Che, Yi-zhuo; Yang, Jia-hui; Zhang, Lei; Zhang, Sheng-li
2018-01-01
Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 11374237 and 11504287), the Fundamental Research Funds for the Central Universities, China, the China Postdoctoral Science Foundation (Grant No. 2017M613147), and the Shaanxi Province Postdoctoral Science Foundation, China.
ERIC Educational Resources Information Center
Heisenberg, Werner
1973-01-01
Discusses the influence of tradition in science on selection of scientific problems and methods and on the use of concepts as tools for research work. Indicates that future research studies will be directed toward the change of fundamental concepts in such fields as astrophysics, molecular biology, and environmental science. (CC)
Opportunities for Computational Discovery in Basic Energy Sciences
NASA Astrophysics Data System (ADS)
Pederson, Mark
2011-03-01
An overview of the broad-ranging support of computational physics and computational science within the Department of Energy Office of Science will be provided. Computation as the third branch of physics is supported by all six offices (Advanced Scientific Computing, Basic Energy, Biological and Environmental, Fusion Energy, High-Energy Physics, and Nuclear Physics). Support focuses on hardware, software, and applications. Most opportunities within the fields of condensed-matter physics, chemical physics, and materials sciences are supported by the Office of Basic Energy Sciences (BES) or through partnerships between BES and the Office for Advanced Scientific Computing. Activities include radiation sciences, catalysis, combustion, materials in extreme environments, energy-storage materials, light-harvesting and photovoltaics, solid-state lighting, and superconductivity. A summary of two recent reports by the computational materials and chemical communities on the role of computation during the next decade will be provided. In addition to materials and chemistry challenges specific to energy sciences, issues identified include a focus on the role of the domain scientist in integrating, expanding, and sustaining applications-oriented capabilities on evolving high-performance computing platforms and on the role of computation in accelerating the development of innovative technologies.
Research | Computational Science | NREL
NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. These computational science capabilities enable high-impact research. Some recent examples…
An integration of integrated information theory with fundamental physics
Barrett, Adam B.
2014-01-01
To truly eliminate Cartesian ghosts from the science of consciousness, we must describe consciousness as an aspect of the physical. Integrated Information Theory states that consciousness arises from intrinsic information generated by dynamical systems; however, existing formulations of this theory are not applicable to standard models of fundamental physical entities. Modern physics has shown that fields are fundamental entities, and in particular that the electromagnetic field is fundamental. Here I hypothesize that consciousness arises from information intrinsic to fundamental fields. This hypothesis unites fundamental physics with what we know empirically about the neuroscience underlying consciousness, and it bypasses the need to consider quantum effects. PMID:24550877
ERIC Educational Resources Information Center
van Eijck, Michiel; Roth, Wolff-Michael
2009-01-01
Bringing a greater number of students into science is one of, if not the most fundamental goals of science education for "all", especially for heretofore-neglected groups of society such as women and Aboriginal students. Providing students with opportunities to experience how science really is enacted--i.e., "authentic science"--has been advocated…
NASA Astrophysics Data System (ADS)
Chu, X.
2011-12-01
This study, funded by the NSF CAREER program, focuses on developing new methods to quantify microtopography-controlled overland flow processes and integrating the cutting-edge hydrologic research with all-level education and outreach activities. To achieve the educational goal, an interactive teaching-learning software package has been developed. This software, with enhanced visualization capabilities, integrates the new modeling techniques, computer-guided learning processes, and education-oriented tools in a user-friendly interface. Both Windows-based and web-based versions have been developed. The software is specially designed for three major user levels: elementary level (Level 1: K-12 and outreach education), medium level (Level 2: undergraduate education), and advanced level (Level 3: graduate education). Depending on the levels, users are guided to different educational systems. Each system consists of a series of mini "libraries" featured with movies, pictures, and documentation that cover fundamental theories, varying scale experiments, and computer modeling of overland flow generation, surface runoff, and infiltration processes. Testing and practical use of this educational software in undergraduate and graduate teaching demonstrate its effectiveness to promote students' learning and interest in hydrologic sciences. This educational software also has been used as a hydrologic demonstration tool for K-12 students and Native American students through the Nurturing American Tribal Undergraduate Research Education (NATURE) program and Science, Technology, Engineering and Mathematics (STEM) outreach activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dincă, Mircea; Léonard, François
Metal–organic frameworks (MOFs), with their crystalline nanoporous three-dimensional structures, have emerged as unique multifunctional materials that combine high porosity with catalytic, photophysical, or other properties to reveal new fundamental science and applications. Because MOFs are composed of organic molecules linking metal centers in ways that are not usually conducive to the formation of free-charge carriers or low-energy charge-transport pathways, they are typically insulators. Accordingly, applications so far have harnessed the unique structural properties and porosity of MOFs, which depend only to a small extent on the ability to manipulate their electronic structure. An exciting new area has emerged due to the recent demonstration of MOFs with controlled electronic and optical properties, which is enabling new fundamental science and opens up the possibility of applications in electronics and photonics. This article presents an overview of the fundamental science issues related to controlling electronic and optical properties of MOFs, and how research groups worldwide have been exploring such properties for electronics, thermoelectrics, photophysics, and charge storage.
Nearfield acoustic holography. I - Theory of generalized holography and the development of NAH
NASA Technical Reports Server (NTRS)
Maynard, J. D.; Williams, E. G.; Lee, Y.
1985-01-01
Because its underlying principles are so fundamental, holography has been studied and applied in many areas of science. Recently, a technique has been developed which takes the maximum advantage of the fundamental principles and extracts much more information from a hologram than is customarily associated with such a measurement. In this paper the fundamental principles of holography are reviewed, and a sound radiation measurement system, called nearfield acoustic holography (NAH), which fully exploits the fundamental principles, is described.
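For planar measurement geometries, the reconstruction step underlying NAH is angular-spectrum back-propagation of the measured pressure hologram toward the source plane. The sketch below shows only that core step, with illustrative function names and arguments; the amplification of evanescent components is what gives NAH resolution beyond the acoustic wavelength, and a practical system would add regularization, which is omitted here:

```python
import numpy as np

def nah_backpropagate(p_holo, dx, freq, dz, c=343.0):
    """Back-propagate a planar pressure hologram by a distance dz (sketch).

    p_holo : 2-D complex pressure measured on the hologram plane
    dx     : grid spacing of the measurement array [m]
    freq   : acoustic frequency [Hz]
    dz     : distance from hologram plane back toward the source [m]
    """
    k = 2.0 * np.pi * freq / c
    ny, nx = p_holo.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dx)
    KX, KY = np.meshgrid(kx, ky)

    # kz is real for propagating components and imaginary for evanescent ones.
    kz = np.sqrt((k**2 - KX**2 - KY**2).astype(complex))

    P = np.fft.fft2(p_holo)
    P_back = P * np.exp(-1j * kz * dz)   # inverse propagator; evanescent terms grow
    return np.fft.ifft2(P_back)
```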
Research Trend of Physical Skill Science --Towards Elucidation of Physical Skill--
NASA Astrophysics Data System (ADS)
Furukawa, Koichi; Ueno, Ken; Ozaki, Tomonobu; Kamisato, Shihoko; Kawamoto, Ryuji; Shibuya, Koji; Shiratori, Naruhiko; Suwa, Masaki; Soga, Masato; Taki, Hirokazu; Fujinami, Tsutomu; Hori, Satoshi; Motomura, Yoichi; Morita, Souhei
Physical skills and language skills are both fundamental intelligent abilities of human beings. In this paper, we focus our attention on sophisticated physical skills such as playing sports and playing musical instruments, and introduce research activities aimed at elucidating and verbalizing them. This research area has been launched recently. We introduce approaches from physical modeling, measurement and data analysis, cognitive science, and human interfaces. We also discuss issues such as skill acquisition and its support systems. Furthermore, we consider the fundamental issue of individual differences, which arises in every application of skill elucidation. Finally, we introduce several attempts at skill elucidation in the fields of dancing, manufacturing, playing string instruments, sports science, and medical care.
NASA Astrophysics Data System (ADS)
Chen, Jean Chi-Jen
Physics is fundamental for science, engineering, medicine, and for understanding many phenomena encountered in people's daily lives. The purpose of this study was to investigate the relationships between student success in college-level introductory physics courses and various educational and background characteristics. The primary variables of this study were gender, high school mathematics and science preparation, preference and perceptions of learning physics, and performance in introductory physics courses. Demographic characteristics considered were age, student grade level, parents' occupation and level of education, high school senior grade point average, and educational goals. A Survey of Learning Preference and Perceptions was developed to collect the information for this study. A total of 267 subjects enrolled in six introductory physics courses, four algebra-based and two calculus-based, participated in the study conducted during Spring Semester 2002. The findings from the algebra-based physics courses indicated that participant's educational goal, high school senior GPA, father's educational level, mother's educational level, and mother's occupation in the area of science, engineering, or computer technology were positively related to performance while participant age was negatively related. Biology preparation, mathematics preparation, and additional mathematics and science preparation in high school were also positively related to performance. The relationships between the primary variables and performance in calculus-based physics courses were limited to high school senior year GPA and high school physics preparation. Findings from all six courses indicated that participant's educational goal, high school senior GPA, father's educational level, and mother's occupation in the area of science, engineering, or computer technology, high school preparation in mathematics, biology, and the completion of additional mathematics and science courses were positively related to performance. No significant performance differences were found between male and female students. However, there were significant gender differences in physics learning perceptions. Female participants tended to try to understand physics materials and relate the physics problems to real world situations while their male counterparts tended to rely on rote learning and equation application. This study found that participants performed better by trying to understand the physics material and relate physics problems to real world situations. Participants who relied on rote learning did not perform well.
Trends in Practical Work in German Science Education
ERIC Educational Resources Information Center
di Fuccia, David; Witteck, Torsten; Markic, Silvija; Eilks, Ingo
2012-01-01
By the 1970s a fundamental shift had taken place in German science education. This was a shift away from the learning of more-or-less isolated facts and facets in Biology, Chemistry, and Physics towards a restructuring of science teaching along the general principles of the respective science domains. The changes included also the addition of…
The Impact of Project 2061 on Science Education in Northeastern Louisiana Classrooms.
ERIC Educational Resources Information Center
Webb, Paula Bauer; Pugh, Ava F.
Project 2061, a broad-based science reform movement, was launched by the American Association for the Advancement of Science, the Carnegie Corporation of New York, and the Andrew W. Mellon Foundation to define the fundamental science and mathematics American students should know. A second phase of Project 2061 translated the defined learning goals…
Still More Science Activities. 20 Exciting Activities To Do!
ERIC Educational Resources Information Center
Smithsonian Institution, Washington, DC.
Science and technology affect every facet of human life. By the 21st century, society will demand that all of its citizens possess basic competencies in the fundamentals of science and the use of technology. As science increasingly becomes the dominant subject of the work place, it is important to begin developing within children an understanding…
ERIC Educational Resources Information Center
Boyle, Timothy J.; Sears, Jeremiah M.; Hernandez-Sanchez, Bernadette A.; Casillas, Maddison R.; Nguyen, Thao H.
2017-01-01
The Chemistry Science Investigation: Dognapping Workshop was designed to (i) target and inspire fourth grade students to view themselves as "Junior Scientists" before their career decisions are solidified; (ii) enable hands-on experience in fundamental scientific concepts; (iii) increase public interaction with science, technology,…
ERIC Educational Resources Information Center
Scott, Philip H.; Mortimer, Eduardo F.; Aguiar, Orlando G.
2006-01-01
In this paper, we draw upon a framework for analyzing the discursive interactions of science classrooms (Mortimer & Scott, 2003, "Meaning Making in Secondary Science Classrooms," Maidenhead, UK: Open University Press), to probe the movement between authoritative and dialogic discourse in a Brazilian high school science class. More…
The Development of the Nature of Science View Scale (NOSvs) at University Level
ERIC Educational Resources Information Center
Temel, Senar; Sen, Senol; Özcan, Özgür
2018-01-01
Background: Determining individuals' views of the nature of science is quite important for researchers since it is both a component of scientific literacy and a fundamental aim of science education. Purpose: This study aims to develop a NOSvs for assessing prospective teachers' views of the nature of science and to analyse their psychometric…
Bourdieu and Science Studies: Toward a Reflexive Sociology
ERIC Educational Resources Information Center
Hess, David J.
2011-01-01
Two of Bourdieu's fundamental contributions to science studies--the reflexive analysis of the social and human sciences and the concept of an intellectual field--are used to frame a reflexive study of the history and social studies of science and technology as an intellectual field in the United States. The universe of large, Ph.D.-granting…
More Science Activities. 20 Exciting Experiments To Do!
ERIC Educational Resources Information Center
Smithsonian Institution, Washington, DC.
Science and technology affect every facet of human life. By the 21st century, society will demand that all of its citizens possess basic competencies in the fundamentals of science and the use of technology. As science increasingly becomes the dominant subject of the work place, it is important to begin developing within children an understanding…
NASA's computer science research program
NASA Technical Reports Server (NTRS)
Larsen, R. L.
1983-01-01
Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.
Chemistry and materials science progress report, FY 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-07-01
Research is reported in the areas of surface science, fundamentals of the physics and processing of metals, energetic materials, transactinide materials and properties and other indirectly related areas of weapons research.
Girls Save the World through Computer Science
ERIC Educational Resources Information Center
Murakami, Christine
2011-01-01
It's no secret that fewer and fewer women are entering computer science fields. Attracting high school girls to computer science is only part of the solution. Retaining them while they are in higher education or the workforce is also a challenge. To solve this, there is a need to show girls that computer science is a wide-open field that offers…
ERIC Educational Resources Information Center
Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2015-01-01
The aim of this study was to explore Taiwanese college students' conceptions of and approaches to learning computer science and then explore the relationships between the two. Two surveys, Conceptions of Learning Computer Science (COLCS) and Approaches to Learning Computer Science (ALCS), were administered to 421 college students majoring in…
Hispanic Women Overcoming Deterrents to Computer Science: A Phenomenological Study
ERIC Educational Resources Information Center
Herling, Lourdes
2011-01-01
The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the…
ERIC Educational Resources Information Center
Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang
2015-01-01
This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…
Quality knowledge of science through virtual laboratory as an element of visualization
NASA Astrophysics Data System (ADS)
Rizman Herga, Natasa
This doctoral dissertation discusses the use of a virtual laboratory for learning and teaching chemical concepts in seventh-grade primary school science classes. The dissertation has a two-part structure. The first, theoretical part presents a general platform for teaching science in elementary school, covering teaching forms and methods, with experimental work highlighted among modern approaches. Particular emphasis is placed on the use of new technologies in education and on virtual laboratories. Scientific findings on the importance of visualizing science concepts, and on the triple nature of their understanding, are presented. These findings form the foundation of the empirical research presented in the second part of the dissertation, whose purpose was to examine the effectiveness of using a virtual laboratory for teaching and learning the chemical content of science, in terms of pupils' knowledge and interest. A didactic experiment was designed in which 225 pupils participated, divided into experimental and control groups. Prior to its execution, existing school practice among science and chemistry teachers was analysed in terms of: (1) the inclusion of experimental work as a fundamental method of actively learning chemical content, (2) the use of visualization methods in the classroom, and (3) the use of a virtual laboratory. The main findings of this survey, carried out in the 2012/2013 school year with 48 science and chemistry teachers, are that teachers often include experimental work when teaching chemical content and use a variety of visualization methods when presenting science concepts, in particular computer animation and simulation. Using a virtual laboratory as a new strategy for teaching and learning chemical content is not common, because teachers lack the didactic skills needed to use virtual reality technology. Based on the didactic experiment, carried out over two school years (2012/2013 and 2013/2014) in ten primary schools, the effectiveness of teaching supported by a virtual laboratory was analyzed. The empirical findings reveal that the use of a virtual laboratory has a strong impact on pupils' knowledge and interest. At the end of the experiment, pupils in the experimental group had greater knowledge of the chemical content of science and a higher interest in learning science content, and the virtual laboratory also improved the sustainability of the acquired knowledge. The didactic experiment determined that a virtual laboratory enables quality learning and teaching of the chemical content of science because it allows: (1) experimental work as an active learning method, (2) the visualization of abstract concepts and phenomena, (3) dynamic submicroscopic presentations, (4) integration of all three levels of a chemical concept as a whole, and (5) positive impacts on pupils' interest, knowledge, and sustainability of acquired knowledge.
Physical Science Day: Design, Implementation, and Assessment
ERIC Educational Resources Information Center
Zeng, Liang; Cunningham, Mark A.; Tidrow, Steven C.; Smith, K. Christopher; Contreras, Jerry
2016-01-01
Physical Science Day at The University of Texas--Pan American (UTPA), in collaboration with the Edinburg Consolidated Independent School District, has been designed, developed and implemented to address an identified fundamental shortcoming in our educational process within this primarily (90+%) Hispanic serving border region. Physical Science Day…
Stationary Engineering. Science Manual--2.
ERIC Educational Resources Information Center
Frost, Harold J.; Steingress, Frederick M.
This second-year student manual contains 140 brief related science lessons applying science and math to trade activities in the field of stationary engineering. The lessons are organized into 16 units: (1) Introduction to Stationary Engineering, (2) Engineering Fundamentals, (3) Steam Boilers, (4) Boiler Fittings, (5) Boilerroom System, (6)…
The emergence of time's arrows and special science laws from physics
Loewer, Barry
2012-01-01
In this paper, I will argue that there is an important connection between two questions concerning how certain features of the macro world emerge from the laws and processes of fundamental microphysics and suggest an approach to answering these questions. The approach involves a kind of emergence but quite different from ‘top-down’ emergence discussed at the conference, for which an earlier version of this paper was written. The two questions are (i) How do ‘the arrows of time’ emerge from microphysics? (ii) How do macroscopic special science laws and causation emerge from microphysics? Answering these questions is especially urgent for those who, like myself, think that a certain version of physicalism, which I call ‘micro-physical completeness’ (MC), is true. According to MC, there are fundamental dynamical laws that completely govern (deterministically or probabilistically) the evolution of all micro-physical events, and there are no additional ontologically independent dynamical or causal special science laws. In other words, there is no ontologically independent ‘top-down’ causation. Of course, MC does not imply that physicists now or ever will know or propose the complete laws of physics, or that, even if the complete laws were known, we would know how special science properties and laws reduce to laws and properties of fundamental physics. Rather, MC is a contingent metaphysical claim about the laws of our world. After a discussion of the two questions, I will argue that the key to showing how it is possible for the arrows of time and the special science laws to emerge from microphysics is a certain account of how thermodynamics is related to fundamental dynamical laws. PMID:23386956
Computer Literacy Project. A General Orientation in Basic Computer Concepts and Applications.
ERIC Educational Resources Information Center
Murray, David R.
This paper proposes a two-part, basic computer literacy program for university faculty, staff, and students with no prior exposure to computers. The program described would introduce basic computer concepts and computing center service programs and resources; provide fundamental preparation for other computer courses; and orient faculty towards…
Non-Determinism: An Abstract Concept in Computer Science Studies
ERIC Educational Resources Information Center
Armoni, Michal; Gal-Ezer, Judith
2007-01-01
Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…
An Investigation of Primary School Science Teachers' Use of Computer Applications
ERIC Educational Resources Information Center
Ocak, Mehmet Akif; Akdemir, Omur
2008-01-01
This study investigated the level and frequency of science teachers' use of computer applications as an instructional tool in the classroom. The manner and frequency of science teachers' use of computer, their perceptions about integration of computer applications, and other factors contributed to changes in their computer literacy are…
Methodical Approaches to Teaching of Computer Modeling in Computer Science Course
ERIC Educational Resources Information Center
Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina
2015-01-01
The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling stems from the fact that current trends toward strengthening the general educational and worldview functions of computer science call for additional research into the…
NASA Astrophysics Data System (ADS)
Richard, G. A.
2003-12-01
Major research facilities and organizations provide an effective venue for developing partnerships with educational organizations in order to offer a wide variety of educational programs, because they constitute a base where the culture of scientific investigation can flourish. The Consortium for Materials Properties Research in Earth Sciences (COMPRES) conducts education and outreach programs through the Earth Science Educational Resource Center (ESERC), in partnership with other groups that offer research and education programs. ESERC initiated its development of education programs in 1994 under the administration of the Center for High Pressure Research (CHiPR), which was funded as a National Science Foundation Science and Technology Center from 1991 to 2002. Programs developed during ESERC's association with CHiPR and COMPRES have targeted a wide range of audiences, including pre-K, K-12 students and teachers, undergraduates, and graduate students. Since 1995, ESERC has offered inquiry-based programs to Project WISE (Women in Science and Engineering) students at a high school and undergraduate level. Activities have included projects that investigated earthquakes, high pressure mineral physics, and local geology. Through a practicum known as Project Java, undergraduate computer science students have developed interactive instructional tools for several of these activities. For K-12 teachers, a course on Long Island geology is offered each fall, which includes an examination of the role that processes in the Earth's interior have played in the geologic history of the region. ESERC has worked with Stony Brook's Department of Geosciences faculty to offer courses on natural hazards, computer modeling, and field geology to undergraduate students, and on computer programming for graduate students. Each summer, a four-week residential college-level environmental geology course is offered to rising tenth graders from the Brentwood, New York schools in partnership with Stony Brook's Department of Technology and Society. During the academic year, a college-level Earth science course is offered to tenth graders from Sayville, New York. In both programs, students conduct research projects as one of their primary responsibilities. In collaboration with the Museum of Long Island Natural Sciences on the Stony Brook campus, two programs have been developed that enable visiting K-12 school classes to investigate earthquakes and phenomena that operate in the Earth's deep interior. From 1997 to 1999, the weekly activity-based Science Enrichment for the Early Years (SEEY) program, focusing on common Earth materials and fundamental Earth processes, was conducted at a local pre-K school. Since 2002, ESERC has worked with the Digital Library for Earth System Education (DLESE) to organize the Skills Workshops for their Annual Meeting and with EarthScope for the development of their Education and Outreach Program Plan. Future education programs and tools developed through COMPRES partnerships will place an increased emphasis on deep Earth materials and phenomena.
Climate Modeling Computing Needs Assessment
NASA Astrophysics Data System (ADS)
Petraska, K. E.; McCabe, J. D.
2011-12-01
This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.
Profile fragment: research areas include feature extraction, human-computer interaction, and physics-based modeling; graduate degrees in computer science from the University of Colorado at Boulder and a B.S. in computer science from the New Mexico Institute of Mining and Technology.
1992-01-01
The IML-1 mission was the first in a series of Shuttle flights dedicated to fundamental materials and life sciences research with the international partners. The participating space agencies included: NASA, the 14-nation European Space Agency (ESA), the Canadian Space Agency (CSA), the French National Center of Space Studies (CNES), the German Space Agency and the German Aerospace Research Establishment (DAR/DLR), and the National Space Development Agency of Japan (NASDA). Dedicated to the study of life and materials sciences in microgravity, the IML missions explored how life forms adapt to weightlessness and investigated how materials behave when processed in space. Both life and materials sciences benefited from the extended periods of microgravity available inside the Spacelab science module in the cargo bay of the Space Shuttle Orbiter. In this photograph, Commander Ronald J. Grabe works with the Mental Workload and Performance Evaluation Experiment (MWPE) in the IML-1 module. This experiment was designed as a result of difficulty experienced by crewmembers working at a computer station on a previous Space Shuttle mission. The problem was due to the workstation's design being based on Earthbound conditions with the operator in a typical one-G standing position. Information gained from this experiment was used to design workstations for future Spacelab missions and the International Space Station. Managed by the Marshall Space Flight Center, IML-1 was launched on January 22, 1992 aboard the Space Shuttle Orbiter Discovery (STS-42 mission).
NASA Astrophysics Data System (ADS)
Perry, S.; Jordan, T.
2006-12-01
Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.
Computer-aided design and computer science technology
NASA Technical Reports Server (NTRS)
Fulton, R. E.; Voigt, S. J.
1976-01-01
A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
Klimentov, A.; Buncic, P.; De, K.; ...
2015-05-22
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, O(10^3) users, and ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. Finally, we will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
NASA Astrophysics Data System (ADS)
Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang
2015-07-01
This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.
SOLAS Science and the Environmental Impacts of Geoengineering
NASA Astrophysics Data System (ADS)
Boyd, P.; Law, C. S.
2016-02-01
SOLAS (Surface Ocean Lower Atmosphere Study) has played a major role in establishing the elemental and ecosystem responses in the in situ mesoscale iron addition experiments. The outcomes of these experiments have included a Summary for Policymakers and an amendment on ocean fertilisation in the London Convention on marine dumping, which have informed both the debate and international regulation on this potential geoengineering approach. As part of Future Earth the next ten years of SOLAS Science will develop understanding and fundamental science in 5 major themes, including Greenhouse Gases and the Ocean, Interconnections between Aerosol, Clouds and Ecosystems, and Ocean Biogeochemical Controls on Atmospheric Chemistry. This poster will review the SOLAS science areas that provide fundamental knowledge on processes and ecosystem impacts, which is required for the robust assessment of potential Solar Radiation Management and Carbon Dioxide Removal techniques.
Fundamental approaches in molecular biology for communication sciences and disorders.
Bartlett, Rebecca S; Jetté, Marie E; King, Suzanne N; Schaser, Allison; Thibeault, Susan L
2012-08-01
This contemporary tutorial will introduce general principles of molecular biology, common deoxyribonucleic acid (DNA), ribonucleic acid (RNA), and protein assays and their relevance in the field of communication sciences and disorders. Over the past 2 decades, knowledge of the molecular pathophysiology of human disease has increased at a remarkable pace. Most of this progress can be attributed to concomitant advances in basic molecular biology and, specifically, the development of an ever-expanding armamentarium of technologies for analysis of DNA, RNA, and protein structure and function. Details of these methodologies, their limitations, and examples from the communication sciences and disorders literature are presented. Results/Conclusions: The use of molecular biology techniques in the fields of speech, language, and hearing sciences is increasing, facilitating the need for an understanding of molecular biology fundamentals and common experimental assays.
ICASE Computer Science Program
NASA Technical Reports Server (NTRS)
1985-01-01
The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.
Oliveira, Joseph S [Richland, WA; Jones-Oliveira, Janet B [Richland, WA; Bailey, Colin G [Wellington, NZ; Gull, Dean W [Seattle, WA
2008-07-01
One embodiment of the present invention includes a computer operable to represent a physical system with a graphical data structure corresponding to a matroid. The graphical data structure corresponds to a number of vertices and a number of edges that each correspond to two of the vertices. The computer is further operable to define a closed pathway arrangement with the graphical data structure and identify each different one of a number of fundamental cycles by evaluating a different respective one of the edges with a spanning tree representation. The fundamental cycles each include three or more of the vertices.
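The spanning-tree evaluation described in this abstract is the classical fundamental-cycle construction: every edge left out of a spanning tree closes exactly one cycle, namely that edge plus the unique tree path between its endpoints. A minimal sketch of the construction follows, using an illustrative data layout rather than the patent's own graphical data structure:

```python
from collections import defaultdict

def fundamental_cycles(vertices, edges):
    """Return a fundamental cycle basis of an undirected graph (sketch)."""
    # Build adjacency lists and a spanning forest via depth-first search.
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    parent = {}
    for root in vertices:
        if root in parent:
            continue
        parent[root] = None
        stack = [root]
        while stack:
            u = stack.pop()
            for v in adj[u]:
                if v not in parent:
                    parent[v] = u
                    stack.append(v)

    tree_edges = {frozenset((v, p)) for v, p in parent.items() if p is not None}

    def tree_path(u, v):
        # Unique path between u and v in the spanning tree, via their LCA.
        anc_u = []
        x = u
        while x is not None:
            anc_u.append(x)
            x = parent[x]
        anc_set = set(anc_u)
        path_v = []
        x = v
        while x not in anc_set:
            path_v.append(x)
            x = parent[x]
        lca = x
        return anc_u[:anc_u.index(lca) + 1] + list(reversed(path_v))

    cycles = []
    for u, v in edges:
        if frozenset((u, v)) not in tree_edges:
            cycles.append(tree_path(u, v))   # the chord (u, v) closes this cycle
    return cycles

# Example: a square with one diagonal has cycle rank E - V + 1 = 2,
# so two fundamental cycles are returned.
print(fundamental_cycles([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)]))
```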
ERIC Educational Resources Information Center
US Department of Energy, 2007
2007-01-01
The Department of Energy's (DOE) Office of Science is among the world's premier supporters of basic research. The Office of Science enables the U.S. to maintain its competitive edge by funding science that can transform its energy future, supports its national security and seeks to understand the fundamentals of matter and energy itself. To do…
What Constitutes Science and Scientific Evidence: Roles of Null Hypothesis Testing
ERIC Educational Resources Information Center
Chang, Mark
2017-01-01
We briefly discuss the philosophical basis of science, causality, and scientific evidence, by introducing the hidden but most fundamental principle of science: the similarity principle. The principle's use in scientific discovery is illustrated with Simpson's paradox and other examples. In discussing the value of null hypothesis statistical…
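As a concrete reference point for the null hypothesis testing discussed here, the snippet below runs a standard two-sample test on simulated data; it is not drawn from the article and is included only to make the procedure under discussion explicit:

```python
import numpy as np
from scipy import stats

# H0: the two groups share the same mean. A small p-value is conventionally
# read as evidence against H0; this is the inferential move the article examines.
rng = np.random.default_rng(0)
control = rng.normal(loc=0.0, scale=1.0, size=50)
treated = rng.normal(loc=0.5, scale=1.0, size=50)

t_stat, p_value = stats.ttest_ind(treated, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```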
Science 20-30: Program of Studies.
ERIC Educational Resources Information Center
Alberta Dept. of Education, Edmonton. Curriculum Branch.
Presented in both English and French, Science 20-30 is an integrated academic program in Alberta, Canada that helps students better understand and apply fundamental concepts and skills common to biology, chemistry, physics, and the Earth sciences. The major goals of the program are: (1) to develop in students an understanding of the…
Science 10: Course of Studies.
ERIC Educational Resources Information Center
Alberta Dept. of Education, Edmonton. Curriculum Branch.
Presented in both English and French, Science 10 is an integrated academic course that helps students in Alberta, Canada better understand and apply fundamental concepts and skills common to biology, chemistry, physics, and the Earth sciences. The major goals of the program are: (1) to develop in students an understanding of the interconnecting…
Bug Talk: A Learning Module on Insect Communication
ERIC Educational Resources Information Center
Bergman, Daniel J.
2008-01-01
The study of insects (entomology) can be used to stimulate students' interest in science and nature. It can develop students' understanding of fundamental science concepts, awareness of interdisciplinary connections, and mastery of science process skills. This teaching module provides opportunities for middle school students (Grades 5-8) to learn…
The Rise and Fall of the Schools Science Project in East Africa.
ERIC Educational Resources Information Center
Lillis, Kevin M.; Lowe, John
1987-01-01
Reviews problems associated with implementing the School Science Project (SSP) in Kenya, and the subsequent downfall of the project in the late 1970s. Concludes that the SSP experience raises fundamental questions about secondary science education in Kenya and the rest of the developing world. (BSR)
Truth in Packaging: Teaching Controversial Topics to Undergraduates in the Human Sciences.
ERIC Educational Resources Information Center
Fredericks, Marcel; Miller, Steven I.
1993-01-01
Argues that the behavioral or "human" sciences are fundamentally different in scope and intent from the natural sciences. Describes the use of controversial topics in undergraduate courses and provides a four-step process. Recommends using Karl Popper's falsification theory to help students think critically about issues. (CFR)
Organizing High School Biology Experiences around Contemporary Bioethical Issues: An STS Approach.
ERIC Educational Resources Information Center
Dass, Pradeep Maxwell
1997-01-01
The need for a citizenry capable of comprehending and tackling contemporary issues related to science and technology demands science education experiences that are fundamentally different from traditional experiences in school science. Argues that high school biology experiences organized around contemporary bioethical issues can meet this need.…
Dan Says - Continuum Magazine | NREL
Good science will reward you with unexpected insights, some more profound than you ever could have imagined. The insights we gain through the basic science we perform are essential to our applied technology R&D and to answering the fundamental questions of science. Their colleagues forge those new insights into workable…
From frill to fundamental: The growing importance of science to the tree care industry
E.G. McPherson
2009-01-01
One goal of the ISA's Science and Research Committee is to increase appreciation and investment in research. This article will explain why your commitment to science and research is vital to your professional growth and the future of arboriculture and urban forestry.
An Academic/Vocational Curriculum Partnership: Home Economics and Science.
ERIC Educational Resources Information Center
Smith, Frances M.; Hausafus, Cheryl O.
1993-01-01
Proposes a middle-school curriculum that integrates two diverse disciplines (home economics and science), incorporates social issues, and deals with fundamental concerns of young adolescents. Three major areas are included in the framework: food additives for appeal, science of textile fibers, and chemistry of household cleaning. All should be taught by…
Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State
ERIC Educational Resources Information Center
Lewis, Colleen Marie
2012-01-01
To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…
NASA Astrophysics Data System (ADS)
Maechling, P. J.; Jordan, T. H.; Minster, B.; Moore, R.; Kesselman, C.; SCEC ITR Collaboration
2004-12-01
The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, the Incorporated Research Institutions for Seismology, and the U.S. Geological Survey, is developing the Southern California Earthquake Center Community Modeling Environment (CME) under a five-year grant from the National Science Foundation's Information Technology Research (ITR) Program jointly funded by the Geosciences and Computer and Information Science & Engineering Directorates. The CME system is an integrated geophysical simulation modeling framework that automates the process of selecting, configuring, and executing models of earthquake systems. During the Project's first three years, we have performed fundamental geophysical and information technology research and have also developed substantial system capabilities, software tools, and data collections that can help scientists perform systems-level earthquake science. The CME system provides collaborative tools to facilitate distributed research and development. These collaborative tools are primarily communication tools, providing researchers with access to information in ways that are convenient and useful. The CME system provides collaborators with access to significant computing and storage resources. The computing resources of the Project include in-house servers, Project allocations on the USC High Performance Computing Linux Cluster, as well as allocations on NPACI supercomputers and the TeraGrid. The CME system provides access to SCEC community geophysical models such as the Community Velocity Model, Community Fault Model, Community Crustal Motion Model, and the Community Block Model. The organizations that develop these models often provide access to them so it is not necessary to use the CME system to access these models. However, in some cases, the CME system supplements the SCEC community models with utility codes that make it easier to use or access these models. In some cases, the CME system also provides alternatives to the SCEC community models. The CME system hosts a collection of community geophysical software codes. These codes include seismic hazard analysis (SHA) programs developed by the SCEC/USGS OpenSHA group. Also, the CME system hosts anelastic wave propagation codes including Kim Olsen's Finite Difference code and Carnegie Mellon's Hercules Finite Element tool chain. The CME system can execute a workflow, that is, a series of geophysical computations using the output of one processing step as the input to a subsequent step. Our workflow capability utilizes grid-based computing software that can submit calculations to a pool of computing resources as well as data management tools that help us maintain an association between data files and metadata descriptions of those files. The CME system maintains, and provides access to, a collection of valuable geophysical data sets. The current CME Digital Library holdings include a collection of 60 ground motion simulation results calculated by a SCEC/PEER working group and a collection of Green's functions calculated for 33 TriNet broadband receiver sites in the Los Angeles area.
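To make the workflow idea concrete, the following is a minimal Python sketch, not the SCEC CME software: it chains processing steps so that each step's output file becomes the next step's input, and keeps a small metadata record alongside each file. The function names, the metadata format, and the two-step example are assumptions for illustration only.

    # Illustration only: a toy workflow runner, not the SCEC CME software.
    # Each step reads the previous step's output file, writes its own output,
    # and records a small JSON metadata description alongside the file.
    import json, pathlib, time

    def run_step(name, func, input_path, workdir="."):
        output_path = pathlib.Path(workdir) / f"{name}.out"
        output_path.write_text(func(pathlib.Path(input_path).read_text()))
        meta = {"step": name, "input": str(input_path),
                "output": str(output_path), "created": time.time()}
        output_path.with_suffix(".meta.json").write_text(json.dumps(meta))
        return output_path

    def run_workflow(initial_input, steps, workdir="."):
        # Feed each step's output to the next step, in order.
        path = initial_input
        for name, func in steps:
            path = run_step(name, func, path, workdir)
        return path

    # Hypothetical two-step chain standing in for, e.g., a mesh-preparation
    # step followed by a wave-propagation run.
    pathlib.Path("site.txt").write_text("site parameters")
    result = run_workflow("site.txt", [("prepare", str.upper),
                                       ("simulate", lambda s: s[::-1])])
    print(result.read_text())

In a grid-based setting, run_step would submit each calculation to a pool of computing resources rather than run it locally, but the chaining and metadata bookkeeping follow the same pattern.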
Scientific Computing Strategic Plan for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiting, Eric Todd
Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.
Changing how and what children learn in school with computer-based technologies.
Roschelle, J M; Pea, R D; Hoadley, C M; Gordin, D N; Means, B M
2000-01-01
Schools today face ever-increasing demands in their attempts to ensure that students are well equipped to enter the workforce and navigate a complex world. Research indicates that computer technology can help support learning, and that it is especially useful in developing the higher-order skills of critical thinking, analysis, and scientific inquiry. But the mere presence of computers in the classroom does not ensure their effective use. Some computer applications have been shown to be more successful than others, and many factors influence how well even the most promising applications are implemented. This article explores the various ways computer technology can be used to improve how and what children learn in the classroom. Several examples of computer-based applications are highlighted to illustrate ways technology can enhance how children learn by supporting four fundamental characteristics of learning: (1) active engagement, (2) participation in groups, (3) frequent interaction and feedback, and (4) connections to real-world contexts. Additional examples illustrate ways technology can expand what children learn by helping them to understand core concepts in subjects like math, science, and literacy. Research indicates, however, that the use of technology as an effective learning tool is more likely to take place when embedded in a broader education reform movement that includes improvements in teacher training, curriculum, student assessment, and a school's capacity for change. To help inform decisions about the future role of computers in the classroom, the authors conclude that further research is needed to identify the uses that most effectively support learning and the conditions required for successful implementation.
A Cognitive Model for Problem Solving in Computer Science
ERIC Educational Resources Information Center
Parham, Jennifer R.
2009-01-01
According to industry representatives, computer science education needs to emphasize the processes involved in solving computing problems rather than their solutions. Most of the current assessment tools used by universities and computer science departments analyze student answers to problems rather than investigating the processes involved in…
Approaches to Classroom-Based Computational Science.
ERIC Educational Resources Information Center
Guzdial, Mark
Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…
Defining Computational Thinking for Mathematics and Science Classrooms
ERIC Educational Resources Information Center
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-01-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new…
NASA Center for Computational Sciences: History and Resources
NASA Technical Reports Server (NTRS)
2000-01-01
The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.
Institute for Computer Applications in Science and Engineering (ICASE)
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period April 1, 1983 through September 30, 1983 is summarized.
Computers in Science: Thinking Outside the Discipline.
ERIC Educational Resources Information Center
Hamilton, Todd M.
2003-01-01
Describes the Computers in Science course, which integrates computer-related techniques into the science disciplines of chemistry, physics, biology, and Earth science. Uses a team teaching approach and teaches students how to solve chemistry problems with spreadsheets, identify minerals with X-rays, and perform chemical and force analysis. (Contains 14…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-28
... NATIONAL SCIENCE FOUNDATION Advisory Committee for Computer and Information Science and Engineering; Cancellation of Meeting SUMMARY: As a result of the impact of the recent government shutdown, the... Committee for Computer and Information Science and Engineering meeting. The public notice for this committee...
Exemplary Science Teachers' Use of Technology
ERIC Educational Resources Information Center
Hakverdi-Can, Meral; Dana, Thomas M.
2012-01-01
The purpose of this study is to examine exemplary science teachers' level of computer use, their knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, how often they required their students to use those applications in or for their science class…
PREFACE: 3rd Iberian Meeting on Aerosol Science and Technology (RICTA 2015)
NASA Astrophysics Data System (ADS)
Orza, J. A. G.; Costa, M. J.
2015-12-01
The Third Iberian Meeting on Aerosol Science and Technology (RICTA 2015) was held in the city of Elche (province of Alicante, Spain) from 29 June to 1 July 2015, at Centro de Congresos Ciutat d'Elx. This event was organized and hosted by the Statistical and Computational Physics Laboratory (SCOLAb) of Universidad Miguel Hernández under the auspices of AECyTA, the Spanish Association for Aerosol Science and Technology Research. As in previous editions, the participation of young researchers was especially welcome, with the organization of the VI Summer School on Aerosol Science and Technology and awards for the best poster and PhD thesis, in recognition of outstanding research or presentations focusing on aerosols, during the early stage of their scientific career. RICTA 2015 aims to present the latest research and advances on the field of aerosols, as well as fostering interaction among the Portuguese and Spanish communities. The meeting gathered over 70 participants from 7 different countries, covering a wide range of aerosol science and technology. It included invited lectures, keynote talks, and several specialized sessions on different issues related to atmospheric aerosols, radiation, instrumentation, fundamental aerosol science, bioaerosols and health effects. The editors would like to express their sincere gratitude to all the participants, in particular, those who contributed to this special issue by submitting their papers to convey the current science discussed at RICTA 2015. In this special issue a series of peer-reviewed papers that cover a wide range of topics are presented: aerosol formation, emission, as well as aerosol composition in terms of physical and optical properties, spatial/temporal distribution of aerosol parameters, aerosol modeling and atmospheric effects, as well as instrumentation devoted to aerosol measurements. Finally, we also thank the referees for their valuable revision of these papers.
Knowledge-based assistance for science visualization and analysis using large distributed databases
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.
1993-01-01
Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) with a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.
Knowledge-based assistance for science visualization and analysis using large distributed databases
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.
1992-01-01
Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on the concept called the Data Hub. With the Data Hub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) with a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.
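As a purely illustrative Python sketch of the "uniform and standard" data-service idea described above, not the actual Data Hub design, one interface could serve both computation and interactive nodes for retrieval and update, with concrete back ends hidden behind it; the class and method names below are hypothetical.

    # Hypothetical sketch of a uniform data-service interface; not Data Hub code.
    from abc import ABC, abstractmethod

    class DataService(ABC):
        """One interface for every node, whatever the underlying store is."""
        @abstractmethod
        def retrieve(self, key: str) -> bytes: ...
        @abstractmethod
        def update(self, key: str, value: bytes) -> None: ...

    class InMemoryService(DataService):
        # Stand-in back end; a real system might wrap a remote archive instead.
        def __init__(self):
            self._store = {}
        def retrieve(self, key):
            return self._store[key]
        def update(self, key, value):
            self._store[key] = value

    service: DataService = InMemoryService()
    service.update("dataset/42", b"observations")
    print(service.retrieve("dataset/42"))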
National Plant Genome Initiative: 2003-2008
2003-01-01
Covers crops including maize, wheat, barley, and sorghum. New fundamental science discoveries include: (1) the structure and organization of centromeres in higher plants… (National Plant Genome Initiative: 2003-2008; National Science and Technology Council, Committee on Science; National Science Foundation; January 2003.)
ERIC Educational Resources Information Center
Gottfried, Adele Eskeles; Preston, Kathleen Suzanne Johnson; Gottfried, Allen W.; Oliver, Pamella H.; Delany, Danielle E.; Ibrahim, Sirena M.
2016-01-01
Curiosity is fundamental to scientific inquiry and pursuance. Parents are important in encouraging children's involvement in science. This longitudinal study examined pathways from parental stimulation of children's curiosity per se to their science acquisition (SA). A latent variable of SA was indicated by the inter-related variables of high…
Reading for Meaning: The Foundational Knowledge Every Teacher of Science Should Have
ERIC Educational Resources Information Center
Patterson, Alexis; Roman, Diego; Friend, Michelle; Osborne, Jonathan; Donovan, Brian
2018-01-01
Reading is fundamental to science and not an adjunct to its practice. In other words, understanding the meaning of the various forms of written discourse employed in the creation, discussion, and communication of scientific knowledge is inherent to how science works. The language used in science, however, sets up a barrier that, in order to be…
ERIC Educational Resources Information Center
DiLullo, Camille; Morris, Harry J.; Kriebel, Richard M.
2009-01-01
Understanding the relevance of basic science knowledge in the determination of patient assessment, diagnosis, and treatment is critical to good medical practice. One method often used to direct students in the fundamental process of integrating basic science and clinical information is problem-based learning (PBL). The faculty facilitated small…
Fundamental insights into interfacial catalysis.
Gong, Jinlong; Bao, Xinhe
2017-04-03
Surface and interfacial catalysis plays a vital role in chemical industries, electrochemistry and photochemical reactions. The challenges of modern chemistry are to optimize chemical reaction processes and understand the detailed mechanisms of chemical reactions. Since the early 1960s, the foundation of surface science systems has allowed the study of surface and interfacial phenomena at the atomic/molecular level, and thus brought a number of significant developments to fundamental and technological processes, such as catalysis, materials science and biochemistry, just to name a few. This themed issue describes the recent advances and developments in the fundamental understanding of surface and interfacial catalysis, encompassing areas of knowledge from metal to metal oxide, carbide, graphene, hexagonal boron nitride, and transition metal dichalcogenides under ultrahigh vacuum conditions, as well as under realistic reaction conditions.
ERIC Educational Resources Information Center
Science and Children, 1990
1990-01-01
Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor";…
NASA Astrophysics Data System (ADS)
Iadecola, Thomas; Schuster, Thomas; Chamon, Claudio
The possibility that anyons -- quantum particles other than fermions or bosons -- can emerge in condensed matter systems has motivated generations of physicists. In addition to being of fundamental scientific importance, so-called non-Abelian anyons are particularly sought-after for potential applications to quantum computing. However, experimental evidence of anyons in electronic systems remains inconclusive. We propose to demonstrate non-Abelian braiding by injecting coherent states of light into "topological guided modes" in specially fabricated photonic waveguide arrays. These modes are photonic analogues of topological zero modes in electronic systems. Light traveling inside spatially well-separated topological guided modes can be braided, leading to the accumulation of non-Abelian phases. We propose an optical interference experiment to probe this non-Abelian braiding directly. T.I. is supported by a National Science Foundation Graduate Research Fellowship under Grant No. DGE-1247312.
Thermodynamics of freezing and melting
Pedersen, Ulf R.; Costigliola, Lorenzo; Bailey, Nicholas P.; Schrøder, Thomas B.; Dyre, Jeppe C.
2016-01-01
Although the freezing of liquids and melting of crystals are fundamental for many areas of the sciences, even simple properties like the temperature–pressure relation along the melting line cannot be predicted today. Here we present a theory in which properties of the coexisting crystal and liquid phases at a single thermodynamic state point provide the basis for calculating the pressure, density and entropy of fusion as functions of temperature along the melting line, as well as the variation along this line of the reduced crystalline vibrational mean-square displacement (the Lindemann ratio), and the liquid's diffusion constant and viscosity. The framework developed, which applies for the sizable class of systems characterized by hidden scale invariance, is validated by computer simulations of the standard 12-6 Lennard-Jones system. PMID:27530064
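For reference, the "standard 12-6 Lennard-Jones system" mentioned above is defined by the textbook pair potential below; this is the conventional form of the interaction and is not reproduced from the paper itself:

    v(r) = 4\varepsilon \left[ \left( \frac{\sigma}{r} \right)^{12} - \left( \frac{\sigma}{r} \right)^{6} \right]

where \varepsilon sets the energy scale and \sigma the length scale of the interaction.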
Creativity in tuberculosis research and discovery.
Younga, Douglas; Verreck, Frank A W
2012-03-01
The remarkable advances in TB vaccinology over the last decade have been driven by a pragmatic approach to moving candidates along the development pipeline to clinical trials, fuelled by encouraging data on protection in animal models. Efficacy data from Phase IIb trials of the first generation of new candidates are anticipated over the next 1-2 years. As outlined in the TB Vaccines Strategic Blueprint, to exploit this information and to inspire design of next generation candidates, it is important that this empirical approach is complemented by progress in understanding of fundamental immune mechanisms and improved translational modalities. Current trends towards improved experimental and computational approaches for studying biological complexity will be an important element in the developing science of TB vaccinology. Copyright © 2012 Elsevier Ltd. All rights reserved.
VanderWeele, Tyler J.; Staudt, Nancy
2014-01-01
In this paper we introduce methodology—causal directed acyclic graphs—that empirical researchers can use to identify causation, avoid bias, and interpret empirical results. This methodology has become popular in a number of disciplines, including statistics, biostatistics, epidemiology and computer science, but has yet to appear in the empirical legal literature. Accordingly we outline the rules and principles underlying this new methodology and then show how it can assist empirical researchers through both hypothetical and real-world examples found in the extant literature. While causal directed acyclic graphs are certainly not a panacea for all empirical problems, we show they have potential to make the most basic and fundamental tasks, such as selecting covariate controls, relatively easy and straightforward. PMID:25685055
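As a toy illustration of how a causal DAG can make covariate selection "relatively easy and straightforward," the Python sketch below (not taken from the paper; the variable names are hypothetical) represents a small DAG as an adjacency map and flags common ancestors of treatment and outcome as candidate controls. This is a deliberate simplification of the full backdoor criterion, intended only to convey the idea.

    # Toy sketch only (not from the paper): a causal DAG as an adjacency map,
    # with common ancestors of treatment and outcome flagged as candidate
    # covariate controls.  This simplifies the backdoor criterion.
    def ancestors(dag, node):
        """All ancestors of `node` in a DAG given as {parent: [children, ...]}."""
        parents = {}
        for p, children in dag.items():
            for c in children:
                parents.setdefault(c, set()).add(p)
        found, stack = set(), [node]
        while stack:
            for p in parents.get(stack.pop(), ()):
                if p not in found:
                    found.add(p)
                    stack.append(p)
        return found

    # Hypothetical example: socioeconomic status (SES) influences both the
    # treatment (attending law school) and the outcome (later earnings).
    dag = {
        "SES": ["LawSchool", "Earnings"],
        "LawSchool": ["Earnings"],
    }
    controls = ancestors(dag, "LawSchool") & ancestors(dag, "Earnings")
    print("Candidate covariate controls:", controls)   # -> {'SES'}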
NASA Technical Reports Server (NTRS)
Estes, J. E.; Eisgruber, L.
1981-01-01
Important points presented and recommendations made at an information and decision processes workshop held in Asilomar, California; at a data and information performance workshop held in Houston, Texas; and at a data base use and management workshop held near San Jose, California are summarized. Issues raised at a special session of the Soil Conservation Society of America's remote sensing for resource management conference in Kansas City, Missouri are also highlighted. The goals, status and activities of the NASA program definition study of basic research requirements, the necessity of making the computer science community aware of user needs with respect to information related to renewable resources, performance parameters and criteria for judging federal information systems, and the requirements and characteristics of scientific data bases are among the topics reported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chantler, C.T.
2003-01-24
Richard Deslattes passed away on 16 May 2001 after a life dedicated to fundamental metrology. Although the themes of calibrating light, matter, and fundamental constants provide three guiding principles through his career, the wide-ranging nature of his interests is reflected in over 165 refereed publications, several of which are cited over 100 times. He has left an enduring legacy to science.
An Overview of NASA's Intelligent Systems Program
NASA Technical Reports Server (NTRS)
Cooke, Daniel E.; Norvig, Peter (Technical Monitor)
2001-01-01
NASA and the Computer Science research community are poised to enter a critical era, one in which, it seems, each needs the other. Market forces, driven by the immediate economic viability of computer science research results, place Computer Science in a relatively novel position. These forces impact how research is done and could, in the worst case, drive the field away from significant innovation, opting instead for incremental advances that result in greater stability in the marketplace. NASA, however, requires significant advances in computer science research in order to accomplish the exploration and science agenda it has set out for itself. NASA may indeed be poised to advance computer science research in this century much the way it advanced aero-based research in the last.
ERIC Educational Resources Information Center
Boody, Charles G., Ed.
1986-01-01
Six articles on music and computing address development of computer-based music technology, computer assisted instruction (CAI) in ear training and music fundamentals, a machine-independent data structure for musical pitch relationship representation, touch tablet input device in a melodic dictation CAI game, and systematic evaluation strategies…
A Review of Models for Teacher Preparation Programs for Precollege Computer Science Education.
ERIC Educational Resources Information Center
Deek, Fadi P.; Kimmel, Howard
2002-01-01
Discusses the need for adequate precollege computer science education and focuses on the issues of teacher preparation programs and requirements needed to teach high school computer science. Presents models of teacher preparation programs and compares state requirements with Association for Computing Machinery (ACM) recommendations. (Author/LRW)
A DDC Bibliography on Computers in Information Sciences. Volume II. Information Sciences Series.
ERIC Educational Resources Information Center
Defense Documentation Center, Alexandria, VA.
The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 239 annotated references grouped under three major headings: Artificial and Programming Languages, Computer Processing of Analog Data, and Computer Processing of Digital Data. The references…
Making Advanced Computer Science Topics More Accessible through Interactive Technologies
ERIC Educational Resources Information Center
Shao, Kun; Maher, Peter
2012-01-01
Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…
ASCR Workshop on Quantum Computing for Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aspuru-Guzik, Alan; Van Dam, Wim; Farhi, Edward
This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.
BIOCOMPUTATION: some history and prospects.
Cull, Paul
2013-06-01
At first glance, biology and computer science are diametrically opposed sciences. Biology deals with carbon based life forms shaped by evolution and natural selection. Computer Science deals with electronic machines designed by engineers and guided by mathematical algorithms. In this brief paper, we review biologically inspired computing. We discuss several models of computation which have arisen from various biological studies. We show what these have in common, and conjecture how biology can still suggest answers and models for the next generation of computing problems. We discuss computation and argue that these biologically inspired models do not extend the theoretical limits on computation. We suggest that, in practice, biological models may give more succinct representations of various problems, and we mention a few cases in which biological models have proved useful. We also discuss the reciprocal impact of computer science on biology and cite a few significant contributions to biological science. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
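As one concrete example of the biologically inspired models of computation such reviews discuss, here is a minimal genetic algorithm in Python. The task (maximizing the number of 1s in a bit string) and all parameter values are illustrative assumptions, not drawn from the paper.

    # Illustration only: a minimal genetic algorithm, one family of
    # biologically inspired models of computation.  The task and parameters
    # are hypothetical: evolve a bit string with as many 1s as possible.
    import random

    def evolve(length=20, pop_size=30, generations=50, mutation_rate=0.02):
        pop = [[random.randint(0, 1) for _ in range(length)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=sum, reverse=True)        # fitness = number of 1s
            survivors = pop[: pop_size // 2]       # selection
            children = []
            while len(survivors) + len(children) < pop_size:
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, length)  # single-point crossover
                child = [bit ^ 1 if random.random() < mutation_rate else bit
                         for bit in a[:cut] + b[cut:]]   # mutation
                children.append(child)
            pop = survivors + children
        return max(pop, key=sum)

    best = evolve()
    print(sum(best), "ones out of", len(best))

Selection, crossover, and mutation mirror the evolutionary mechanisms the review credits as the inspiration for this class of algorithms.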