ERIC Educational Resources Information Center
Jacobson, Michael J.; Taylor, Charlotte E.; Richards, Deborah
2016-01-01
In this paper, we propose computational scientific inquiry (CSI) as an innovative model for learning important scientific knowledge and new practices for "doing" science. This approach involves the use of a "game-like" virtual world for students to experience virtual biological fieldwork in conjunction with using an agent-based…
Democratizing Computer Science
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Ryoo, Jean J.
2015-01-01
Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…
ERIC Educational Resources Information Center
Salameh, Ibrahim Abdul Ghani; Khawaldeh, Mohammad Falah Ali
2014-01-01
The study aimed at identifying the attitudes of students of the Basic Sciences College at the World Islamic Sciences and Education University toward teaching a health and sport course using computer technology as a teaching method, and at identifying the impact of the variables of academic level and gender on those attitudes. The study…
Girls Save the World through Computer Science
ERIC Educational Resources Information Center
Murakami, Christine
2011-01-01
It's no secret that fewer and fewer women are entering computer science fields. Attracting high school girls to computer science is only part of the solution. Retaining them while they are in higher education or the workforce is also a challenge. To solve this, there is a need to show girls that computer science is a wide-open field that offers…
Hispanic Women Overcoming Deterrents to Computer Science: A Phenomenological Study
ERIC Educational Resources Information Center
Herling, Lourdes
2011-01-01
The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the…
Design Principles for "Thriving in Our Digital World": A High School Computer Science Course
ERIC Educational Resources Information Center
Veletsianos, George; Beth, Bradley; Lin, Calvin; Russell, Gregory
2016-01-01
"Thriving in Our Digital World" is a technology-enhanced dual enrollment course introducing high school students to computer science through project- and problem-based learning. This article describes the evolution of the course and five lessons learned during the design, development, implementation, and iteration of the course from its…
Students Develop Real-World Web and Pervasive Computing Systems.
ERIC Educational Resources Information Center
Tappert, Charles C.
In the academic year 2001-2002, Pace University (New York) Computer Science and Information Systems (CSIS) students developed real-world Web and pervasive computing systems for actual customers. This paper describes the general use of team projects in CSIS at Pace University, the real-world projects from this academic year, the benefits of…
[Photo: person viewing a 3D visualization of a wind turbine]
NREL Computational Science: challenges in fields ranging from condensed matter physics and nonlinear dynamics to computational fluid dynamics. NREL is also home to the most energy-efficient data center in the world, featuring Peregrine, the…
Representing, Running, and Revising Mental Models: A Computational Model
ERIC Educational Resources Information Center
Friedman, Scott; Forbus, Kenneth; Sherin, Bruce
2018-01-01
People use commonsense science knowledge to flexibly explain, predict, and manipulate the world around them, yet we lack computational models of how this commonsense science knowledge is represented, acquired, utilized, and revised. This is an important challenge for cognitive science: Building higher order computational models in this area will…
NASA Technical Reports Server (NTRS)
1987-01-01
The Research Institute for Advanced Computer Science (RIACS) was established at the NASA Ames Research Center in June of 1983. RIACS is privately operated by the Universities Space Research Association (USRA), a consortium of 64 universities with graduate programs in the aerospace sciences, under several Cooperative Agreements with NASA. RIACS's goal is to provide preeminent leadership in basic and applied computer science research as partners in support of NASA's goals and missions. In pursuit of this goal, RIACS contributes to several of the grand challenges in science and engineering facing NASA: flying an airplane inside a computer; determining the chemical properties of materials under hostile conditions in the atmospheres of earth and the planets; sending intelligent machines on unmanned space missions; creating a one-world network that makes all scientific resources, including those in space, accessible to all the world's scientists; providing intelligent computational support to all stages of the process of scientific investigation from problem formulation to results dissemination; and developing accurate global models for climatic behavior throughout the world. In working with these challenges, we seek novel architectures, and novel ways to use them, that exploit the potential of parallel and distributed computation and make possible new functions that are beyond the current reach of computing machines. The investigation includes pattern computers as well as the more familiar numeric and symbolic computers, and it includes networked systems of resources distributed around the world. We believe that successful computer science research is interdisciplinary: it is driven by (and drives) important problems in other disciplines. We believe that research should be guided by a clear long-term vision with planned milestones. And we believe that our environment must foster and exploit innovation. Our activities and accomplishments for the calendar year 1987 and our plans for 1988 are reported.
Real-World Neuroimaging Technologies
2013-05-10
…system enables long-term wear of up to 10 consecutive hours of operation time. The system's wireless technologies, light weight (200 g), and dry sensors … brain activity in real-world scenarios. Index terms: behavioral science, biomarkers, body sensor networks, brain-computer interaction, brain-computer interfaces, data acquisition, electroencephalography monitoring, translational…
ERIC Educational Resources Information Center
Przybylla, Mareen; Romeike, Ralf
2014-01-01
Physical computing covers the design and realization of interactive objects and installations and allows students to develop concrete, tangible products of the real world, which arise from the learners' imagination. This can be used in computer science education to provide students with interesting and motivating access to the different topic…
Neal Lane: Science in a Flat World
Lane, Neal
2017-12-22
Lane discusses the changes that have taken place in the world since World War II that have made it "flatter," referring to Thomas L. Friedman's book, The World is Flat. Friedman's main premise is that inexpensive telecommunications is bringing about unhampered international competition, the demise of economic stability, and a trend toward outsourcing services, such as computer programming, engineering and science research.
Materials inspired by mathematics.
Kotani, Motoko; Ikeda, Susumu
2016-01-01
Our world is transforming into an interacting system of the physical world and the digital world. What will be the materials science in the new era? With the rising expectations of the rapid development of computers, information science and mathematical science including statistics and probability theory, 'data-driven materials design' has become a common term. There is knowledge and experience gained in the physical world in the form of know-how and recipes for the creation of material. An important key is how we establish vocabulary and grammar to translate them into the language of the digital world. In this article, we outline how materials science develops when it encounters mathematics, showing some emerging directions.
ERIC Educational Resources Information Center
What Works Clearinghouse, 2012
2012-01-01
"Technology Enhanced Elementary and Middle School Science" ("TEEMSS") is a physical science curriculum for grades 3-8 that utilizes computers, sensors, and interactive models to support investigations of real-world phenomena. Through 15 inquiry-based instructional units, students interact with computers, gather and analyze…
ERIC Educational Resources Information Center
Haberman, Bruria; Yehezkel, Cecile
2008-01-01
The rapid evolution of the computing domain has posed challenges in bridging the gap between school and the contemporary world of computing with respect to content, learning culture, and professional norms. We believe that the interaction of high-school students who major in computer science or software engineering with leading…
Computational Science and Innovation
NASA Astrophysics Data System (ADS)
Dean, D. J.
2011-09-01
Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.
Deda, H; Yakupoglu, H
2002-01-01
Science must have a common language. For centuries, Latin carried out this job, but progress in computer technology and the Internet over the last 20 years has begun to produce a new language for the new century: the language of computing. The bodies of information that need data-language standardization are the following: digital libraries and medical education systems, consumer health informatics, World Wide Web applications, database systems, medical language processing, automatic indexing systems, image processing units, telemedicine, and the New Generation Internet (NGI).
Using NCLab-karel to improve computational thinking skill of junior high school students
NASA Astrophysics Data System (ADS)
Kusnendar, J.; Prabawa, H. W.
2018-05-01
Increasing human interaction with technology and the increasingly complex development of the digital world make computer science education an interesting theme to study. Previous studies on computer literacy and competency reveal that Indonesian teachers in general have fairly high computational skill, but their use of that skill is limited to a few applications. This results in limited, minimal computer-related learning for students. On the other hand, computer science education is often considered unrelated to real-world solutions. This paper addresses the use of NCLab-Karel in shaping students' computational thinking, which is believed to help students learn about technology. Implementation of Karel indicates that it can increase student interest in studying computational material, especially algorithms. Observations made during the learning process also indicate the growth and development of a computational mindset in students.
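The abstract above centers on Karel-style exercises as a vehicle for algorithmic thinking. The sketch below is a hypothetical, minimal robot-on-a-grid class written only to illustrate that kind of exercise; the class and method names are invented here and are not NCLab-Karel's actual API.

```python
# Hypothetical, minimal Karel-style robot on a grid, to illustrate the kind
# of algorithmic exercise the NCLab-Karel study targets. NOT NCLab's API.
class Karel:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.x, self.y = 0, 0
        self.facing = "E"                      # E, N, W, S

    def move(self):
        # Step one cell forward, staying inside the grid.
        dx, dy = {"E": (1, 0), "N": (0, 1), "W": (-1, 0), "S": (0, -1)}[self.facing]
        new_x, new_y = self.x + dx, self.y + dy
        if 0 <= new_x < self.width and 0 <= new_y < self.height:
            self.x, self.y = new_x, new_y

    def turn_left(self):
        order = ["E", "N", "W", "S"]
        self.facing = order[(order.index(self.facing) + 1) % 4]

# Exercise: walk along the bottom edge, then up the right edge.
robot = Karel(5, 5)
for _ in range(4):
    robot.move()                               # reach the east wall
robot.turn_left()
for _ in range(4):
    robot.move()                               # reach the north-east corner
print(robot.x, robot.y)                        # -> 4 4
```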
JPRS Report, Science & Technology, USSR: Computers.
1988-07-08
JPRS-UCC-88-002, 8 July 1988. Science & Technology, USSR: Computers. Reproduced by the U.S. Department of Commerce, National Technical Information Service, Springfield, VA 22161. Contents, General: "Computers: Steps to the World Level" (V. Kovalenko; SOTSIALISTICHESKAYA INDUSTRIYA, No 178, 4 Aug 87)…
ERIC Educational Resources Information Center
Ryoo, Jean Jinsun
2013-01-01
Computing occupations are among the fastest growing in the U.S. and technological innovations are central to solving world problems. Yet only our most privileged students are learning to use technology for creative purposes through rigorous computer science education opportunities. In order to increase access for diverse students and females who…
Scientific and Technological Progress: Problems for the West.
ERIC Educational Resources Information Center
de Rose, Francois
1978-01-01
Discusses the impact of science and technology on major social problems confronting the Western world. Topics include pollution and ecology, military impact, computer science, and the benefits of science and technology. (Author/MA)
Join the Center for Applied Scientific Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd; Bremer, Timo; Van Essen, Brian
The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.
NASA Astrophysics Data System (ADS)
Cataldo, Franca
The world is at the dawn of a third industrial revolution, the digital revolution, that brings great changes the world over. Today, computing devices, the Internet, and the World Wide Web are vital technology tools that affect every aspect of everyday life and success. While computing technologies offer enormous benefits, there are equally enormous safety and security risks that have been growing exponentially since they became widely available to the public in 1994. Cybercriminals are increasingly implementing sophisticated and serious hack attacks and breaches upon our nation's government, financial institutions, organizations, communities, and private citizens. There is a great need for computer scientists to carry America's innovation and economic growth forward and for cybersecurity professionals to keep our nation safe from criminal hacking. In this digital age, computer science and cybersecurity are essential foundational ingredients of technological innovation, economic growth, and cybersecurity that span all industries. Yet, America's K-12 education institutions are not teaching the computer science and cybersecurity skills required to produce a technologically-savvy 21st century workforce. Education is the key to preparing students to enter the workforce and, therefore, American K-12 STEM education must be reformed to accommodate the teachings required in the digital age. Keywords: Cybersecurity Education, Cybersecurity Education Initiatives, Computer Science Education, Computer Science Education Initiatives, 21st Century K-12 STEM Education Reform, 21st Century Digital Literacies, High-Tech Innovative Problem-Solving Skills, 21st Century Digital Workforce, Standardized Testing, Foreign Language and Culture Studies, Utica College, Professor Chris Riddell.
The scientific research potential of virtual worlds.
Bainbridge, William Sims
2007-07-27
Online virtual worlds, electronic environments where people can work and interact in a somewhat realistic manner, have great potential as sites for research in the social, behavioral, and economic sciences, as well as in human-centered computer science. This article uses Second Life and World of Warcraft as two very different examples of current virtual worlds that foreshadow future developments, introducing a number of research methodologies that scientists are now exploring, including formal experimentation, observational ethnography, and quantitative analysis of economic markets or social networks.
Are Computer Science Students Ready for the Real World?
ERIC Educational Resources Information Center
Elliot, Noreen
The typical undergraduate program in computer science includes an introduction to hardware and operating systems, file processing and database organization, data communication and networking, and programming. However, many graduates may lack the ability to integrate the concepts "learned" into a skill set and pattern of approaching problems that…
The role of physicality in rich programming environments
NASA Astrophysics Data System (ADS)
Liu, Allison S.; Schunn, Christian D.; Flot, Jesse; Shoop, Robin
2013-12-01
Computer science proficiency continues to grow in importance, while the number of students entering computer science-related fields declines. Many rich programming environments have been created to motivate student interest and expertise in computer science. In the current study, we investigated whether a recently created environment, Robot Virtual Worlds (RVWs), can be used to teach computer science principles within a robotics context by examining its use in high-school classrooms. We also investigated whether the lack of physicality in these environments impacts student learning by comparing classrooms that used either virtual or physical robots for the RVW curriculum. Results suggest that the RVW environment leads to significant gains in computer science knowledge, that virtual robots lead to faster learning, and that physical robots may have some influence on algorithmic thinking. We discuss the implications of physicality in these programming environments for learning computer science.
Translations on USSR Science and Technology Physical Sciences and Technology No. 18
1977-09-19
…and Avetik Gukasyan discuss component arrangement alternatives. … Cybernetics, Computers and Automation Technology: "Proyekt" Computer-Assisted Design System … throughout the world are struggling. The "Proyekt" system, produced at the Institute of Cybernetics, assists in automating the design and manufacture of…
NASA Technical Reports Server (NTRS)
Salmon, Ellen
1996-01-01
The data storage and retrieval demands of space and Earth sciences researchers have made the NASA Center for Computational Sciences (NCCS) Mass Data Storage and Delivery System (MDSDS) one of the world's most active Convex UniTree systems. Science researchers formed the NCCS's Computer Environments and Research Requirements Committee (CERRC) to relate their projected supercomputing and mass storage requirements through the year 2000. Using the CERRC guidelines and observations of current usage, some detailed projections of requirements for MDSDS network bandwidth and mass storage capacity and performance are presented.
2010-10-18
…August 2010 was "building the right game" – World of Warcraft has 30% women (according to womengamers.com). Conclusion – we don't really understand why… Report of the National Academies on Informal Learning: infancy to late adulthood – learn about the world and develop important skills for science… Education with rigor and vigor – excitement, interest, and motivation to learn about phenomena in the natural and physical world; generate…
ERIC Educational Resources Information Center
Farhangi, Sanaz
2012-01-01
This paper presents a review of Jane McGonigal's book, "Reality is broken" (Reality is broken: why games make us better and how they can change the world. Penguin Press, New York, 2011). As the book subtitle suggests it is a book about "why games make us better and how they can change the world", written by a specialist in computer game design. I…
Real science at the petascale.
Saksena, Radhika S; Boghosian, Bruce; Fazendeiro, Luis; Kenway, Owain A; Manos, Steven; Mazzeo, Marco D; Sadiq, S Kashif; Suter, James L; Wright, David; Coveney, Peter V
2009-06-28
We describe computational science research that uses petascale resources to achieve scientific results at unprecedented scales and resolution. The applications span a wide range of domains, from investigation of fundamental problems in turbulence through computational materials science research to biomedical applications at the forefront of HIV/AIDS research and cerebrovascular haemodynamics. This work was mainly performed on the US TeraGrid 'petascale' resource, Ranger, at Texas Advanced Computing Center, in the first half of 2008 when it was the largest computing system in the world available for open scientific research. We have sought to use this petascale supercomputer optimally across application domains and scales, exploiting the excellent parallel scaling performance found on up to at least 32 768 cores for certain of our codes in the so-called 'capability computing' category as well as high-throughput intermediate-scale jobs for ensemble simulations in the 32-512 core range. Furthermore, this activity provides evidence that conventional parallel programming with MPI should be successful at the petascale in the short to medium term. We also report on the parallel performance of some of our codes on up to 65 536 cores on the IBM Blue Gene/P system at the Argonne Leadership Computing Facility, which has recently been named the fastest supercomputer in the world for open science.
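The abstract argues that conventional MPI programming should remain viable at the petascale. As a hedged illustration of the basic pattern involved (domain decomposition plus a global reduction), the following mpi4py sketch is offered; it is not code from the paper, and the per-rank work is a made-up stand-in for a physics kernel.

```python
# Minimal data-parallel MPI pattern: each rank works on its own slice of a
# problem and the partial results are combined with a global reduction.
# Illustrative sketch only. Run with e.g.:  mpiexec -n 4 python mpi_sketch.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

n = 1_000_000                        # total work items
lo = rank * n // size                # this rank's slice of the domain
hi = (rank + 1) * n // size

# Stand-in for an expensive local kernel: partial sum of 1/(i+1).
local = sum(1.0 / (i + 1) for i in range(lo, hi))

total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print(f"harmonic sum over {n} terms on {size} ranks: {total:.6f}")
```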
Teaching Mixed-Mode: A Case Study in Remote Delivery of Computer Science in Africa
ERIC Educational Resources Information Center
Howell, Sheila; Harris, Michael; Wilkinson, Simon; Zuluaga, Catherine; Voutier, Paul
2004-01-01
In February 2003, RMIT University in Melbourne, Australia, commenced delivery of a Computer Science diploma and degree programme using mixed mode delivery to 250 university students in sub-Saharan Africa, through a World Bank funded project designed for the African Virtual University (AVU). The project is a unique experience made possible by…
Once She Makes It, She's There!: A Case Study
ERIC Educational Resources Information Center
Gal-Ezer, Judith; Vilner, Tamar; Zur, Ela
2008-01-01
Computer science is possibly one of the few remaining disciplines almost entirely dominated by men, especially among university staff and in the hi-tech industries. This phenomenon prevails throughout the western world; in Israel it starts in high school, where only 30% of students who choose to take computer science as an elective are women, and…
JPRS Report, Science & Technology, USSR: Computers
1987-09-28
…history anew, "Battle of 1917" would come to your service. If you wish to control the destinies of nations, play around with the third world war… engineering world. The time has come to return to this profession the romantic halo that once clearly and suitably shined, but that has now been almost… at bringing all of the computer technology produced in the association up to world standards within this five-year plan. The creative potential of…
Computer Supported Cooperative Work in Information Search and Retrieval.
ERIC Educational Resources Information Center
Twidale, Michael B.; Nichols, David M.
1998-01-01
Considers how research in collaborative technologies can inform research and development in library and information science. Topics include computer supported collaborative work; shared drawing; collaborative writing; MUDs; MOOs; workflow; World Wide Web; collaborative learning; computer mediated communication; ethnography; evaluation; remote…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moniz, Ernest; Carr, Alan; Bethe, Hans; Morrison, Phillip; Ramsay, Norman; Teller, Edward; Brixner, Berlyn; Archer, Bill; Agnew, Harold; Morrison, John
2018-01-16
The Trinity Test of July 16, 1945 was the first full-scale, real-world test of a nuclear weapon; with the new Trinity supercomputer, Los Alamos National Laboratory's goal is to do this virtually, in 3D. Trinity was the culmination of a fantastic effort of groundbreaking science and engineering by hundreds of men and women at Los Alamos and other Manhattan Project sites. It took them less than two years to change the world. The Laboratory is marking the 70th anniversary of the Trinity Test because it not only ushered in the Nuclear Age, but with it the origin of today's advanced supercomputing. We live in the Age of Supercomputers due in large part to nuclear weapons science here at Los Alamos. National security science, and nuclear weapons science in particular, at Los Alamos National Laboratory have provided a key motivation for the evolution of large-scale scientific computing. Beginning with the Manhattan Project there has been a constant stream of increasingly significant, complex problems in nuclear weapons science whose timely solutions demand larger and faster computers. The relationship between national security science at Los Alamos and the evolution of computing is one of interdependence.
Simplified Key Management for Digital Access Control of Information Objects
2016-07-02
…0001, Task BC-5-2283, "Architecture, Design of Services for Air Force Wide Distributed Systems," for HQ USAF SAF/CIO A6. The views, opinions… "…Challenges for Cloud Computing," Lecture Notes in Engineering and Computer Science: Proceedings of the World Congress on Engineering and Computer Science 2011… Sponsor/monitor: P. Konieczny, HQ USAF SAF/CIO A6. Distribution/availability statement: approved for public…
NASA Astrophysics Data System (ADS)
Khalili, N.; Valliappan, S.; Li, Q.; Russell, A.
2010-07-01
The use of mathematical models of natural phenomena has underpinned science and engineering for centuries, but until the advent of modern computers and computational methods, the full utility of most of these models remained outside the reach of the engineering communities. Since World War II, advances in computational methods have transformed the way engineering and science are undertaken throughout the world. Today, theories of mechanics of solids and fluids, electromagnetism, heat transfer, plasma physics, and other scientific disciplines are implemented through computational methods in engineering analysis, design, manufacturing, and in studying broad classes of physical phenomena. The discipline concerned with the application of computational methods is now a key area of research, education, and application throughout the world. In the early 1980s, the International Association for Computational Mechanics (IACM) was founded to promote activities related to computational mechanics and has made impressive progress. The most important scientific event of IACM is the World Congress on Computational Mechanics. The first was held in Austin (USA) in 1986 and then in Stuttgart (Germany) in 1990, Chiba (Japan) in 1994, Buenos Aires (Argentina) in 1998, Vienna (Austria) in 2002, Beijing (China) in 2004, Los Angeles (USA) in 2006, and Venice (Italy) in 2008. The 9th World Congress on Computational Mechanics is held in conjunction with the 4th Asian Pacific Congress on Computational Mechanics under the auspices of the Australian Association for Computational Mechanics (AACM), the Asian Pacific Association for Computational Mechanics (APACM) and the International Association for Computational Mechanics (IACM). The 1st Asian Pacific Congress was in Sydney (Australia) in 2001, then in Beijing (China) in 2004 and Kyoto (Japan) in 2007. The WCCM/APCOM 2010 publications consist of a printed book of abstracts given to delegates, along with 247 full-length peer-reviewed papers published with free access online in IOP Conference Series: Materials Science and Engineering. The editors acknowledge the help of the paper reviewers in maintaining a high standard of assessment and the co-operation of the authors in complying with the requirements of the editors and the reviewers. We also would like to take this opportunity to thank the members of the Local Organising Committee and the International Scientific Committee for helping make WCCM/APCOM 2010 a successful event. We also thank The University of New South Wales, The University of Newcastle, the Centre for Infrastructure Engineering and Safety (CIES), IACM, APACM, and AACM for their financial support, along with the United States Association for Computational Mechanics for the Travel Awards made available. N. Khalili, S. Valliappan, Q. Li, A. Russell; 19 July 2010, Sydney, Australia
Learning Science through Computer Games and Simulations
ERIC Educational Resources Information Center
Honey, Margaret A., Ed.; Hilton, Margaret, Ed.
2011-01-01
At a time when scientific and technological competence is vital to the nation's future, the weak performance of U.S. students in science reflects the uneven quality of current science education. Although young children come to school with innate curiosity and intuitive ideas about the world around them, science classes rarely tap this potential.…
ERIC Educational Resources Information Center
Donnelly, Dermot F.; Linn, Marcia C.; Ludvigsen, Sten
2014-01-01
The National Science Foundation-sponsored report "Fostering Learning in the Networked World" called for "a common, open platform to support communities of developers and learners in ways that enable both to take advantage of advances in the learning sciences." We review research on science inquiry learning environments (ILEs)…
Information Processing Research
1979-06-01
quantitative shape recovery. For the qualitative shape recovery we use a model of the Origami world (Kanade, 1978), together with edge profiles of...Workshop. Carnegie-Mellon University, Computer Science Department, Pittsburgh, PA, November, 1978. Kanade, T. A theory of origami world. Technical
ERIC Educational Resources Information Center
L'Homme, Marie-Claude
The evolution of "language utilities," a concept confined largely to the francophone world and relating to the uses of language in computer science and the use of computer science for languages, is chronicled. The language utilities are of three types: (1) tools for language development, primarily dictionary databases and related tools;…
Women in Computer Sciences in Romania: Success and Sacrifice
ERIC Educational Resources Information Center
Ward, Kelly; Dragne, Cornelia; Lucas, Angelina J.
2014-01-01
The purpose of this article is to more fully understand the professional lives of women academics in computer sciences in six Romanian universities. The work is exploratory and relies on a qualitative framework to more fully understand what it means to be a woman academic in high-tech disciplines in a second world economy. We conducted in-depth,…
ERIC Educational Resources Information Center
Ferreira, Deller James; Ambrósio, Ana Paula Laboissière; Melo, Tatiane F. N.
2018-01-01
This article describes how, because computer science is present in many activities of daily life, students need to develop problem-solving skills to improve people's lives in general. The article investigates correlations between teachers' motivational orientations, beliefs and practices with respect to the application of…
Texas Agricultural Science Teachers' Attitudes toward Information Technology
ERIC Educational Resources Information Center
Anderson, Ryan; Williams, Robert
2012-01-01
The researchers sought to determine Agricultural Science teachers' attitudes toward five information technology innovations (Computer-Aided Design, Record Books, E-Mail, Career Development Event Registration, and the World Wide Web). The population for this study consisted of all 333 secondary Agricultural Science teachers from Texas FFA Areas V and…
Astrobiology for the 21st Century
NASA Astrophysics Data System (ADS)
Oliveira, C.
2008-02-01
We live in a scientific world. Science is all around us. We take scientific principles for granted every time we use a piece of technological apparatus, such as a car, a computer, or a cellphone. In today's world, citizens frequently have to make decisions that require them to have some basic scientific knowledge. To be a contributing citizen in a modern democracy, a person needs to understand the general principles of science.
Integrating an Intelligent Tutoring System for TAOs with Second Life
2010-12-01
SL) and interacts with a number of computer-controlled objects that take on the roles of the TAO's teammates. TAOs rely on the same mechanism to… projects that utilize both game and simulation technology for training. He joined Stottler Henke in the fall of 2000 and holds a Ph.D. in computer science… including implementing tutors in multiuser worlds. He has been at Stottler Henke since 2005 and has an MS in computer science from Stanford University
ERIC Educational Resources Information Center
Sherin, Bruce
2013-01-01
A large body of research in the learning sciences has focused on students' commonsense science knowledge--the everyday knowledge of the natural world that is gained outside of formal instruction. Although researchers studying commonsense science have employed a variety of methods, 1-on-1 clinical interviews have played a unique role. The data…
NASA Astrophysics Data System (ADS)
Farhangi, Sanaz
2012-12-01
This paper presents a review of Jane McGonigal's book, "Reality is broken" (Reality is broken: why games make us better and how they can change the world. Penguin Press, New York, 2011). As the book subtitle suggests it is a book about "why games make us better and how they can change the world", written by a specialist in computer game design. I will try to show the relevance this book might have to science educators by emphasizing the points that the author offers as fixes to rebuild reality in the image of the gaming world. Using cultural-historical activity theory, I will explore how taking up a gamer mindset can challenge one to consider shortcomings in current approaches to the activity of teaching-learning science and how using this mindset can open our minds to think of new ways of engaging in the activity of doing science. I hope this review will encourage educators to explore the worldview presented in the book and use it to transform our thinking about science education.
Conversational Agents in Virtual Worlds: Bridging Disciplines
ERIC Educational Resources Information Center
Veletsianos, George; Heller, Robert; Overmyer, Scott; Procter, Mike
2010-01-01
This paper examines the effective deployment of conversational agents in virtual worlds from the perspective of researchers/practitioners in cognitive psychology, computing science, learning technologies and engineering. From a cognitive perspective, the major challenge lies in the coordination and management of the various channels of information…
COMPUTER-AIDED SCIENCE POLICY ANALYSIS AND RESEARCH (WEBCASPAR)
WebCASPAR is a database system containing information about academic science and engineering resources and is available on the World Wide Web. Included in the database is information from several of SRS's academic surveys plus information from a variety of other sources, includin...
Enabling Analytics in the Cloud for Earth Science Data
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Lynnes, Christopher; Bingham, Andrew W.; Quam, Brandi M.
2018-01-01
The purpose of this workshop was to hold interactive discussions where providers, users, and other stakeholders could explore the convergence of three main elements in the rapidly developing world of technology: Big Data, Cloud Computing, and Analytics, [for earth science data].
Cognitive science as an interface between rational and mechanistic explanation.
Chater, Nick
2014-04-01
Cognitive science views thought as computation; and computation, by its very nature, can be understood in both rational and mechanistic terms. In rational terms, a computation solves some information processing problem (e.g., mapping sensory information into a description of the external world; parsing a sentence; selecting among a set of possible actions). In mechanistic terms, a computation corresponds to a causal chain of events in a physical device (in an engineering context, a silicon chip; in a biological context, the nervous system). The discipline is thus at the interface between two very different styles of explanation--as the papers in the current special issue well illustrate, it explores the interplay of rational and mechanistic forces. Copyright © 2014 Cognitive Science Society, Inc.
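Chater's distinction can be made concrete with a toy example: the rational-level description is the information-processing problem being solved (here, Bayes' rule for inferring a cause from a cue), while the mechanistic level is one particular sequence of causal/arithmetic steps that realizes it. The Python sketch below is purely illustrative and is not drawn from the paper.

```python
# Rational level: the problem is to compute P(cause | cue) from a prior and
# two likelihoods, i.e. Bayes' rule.
# Mechanistic level: one concrete causal chain of arithmetic steps that
# realizes that computation. Toy illustration only.

def posterior(prior, p_cue_given_cause, p_cue_given_not_cause):
    """Mechanistic realization of the rational-level specification
    P(cause | cue) = P(cue | cause) * P(cause) / P(cue)."""
    evidence = (p_cue_given_cause * prior
                + p_cue_given_not_cause * (1.0 - prior))   # P(cue)
    return p_cue_given_cause * prior / evidence

# Example: a weakly expected cause (prior 0.1) that strongly predicts the cue.
print(posterior(prior=0.1, p_cue_given_cause=0.9, p_cue_given_not_cause=0.2))
# -> about 0.33: the cue raises belief in the cause but does not settle it.
```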
Design Science in Human-Computer Interaction: A Model and Three Examples
ERIC Educational Resources Information Center
Prestopnik, Nathan R.
2013-01-01
Humanity has entered an era where computing technology is virtually ubiquitous. From websites and mobile devices to computers embedded in appliances on our kitchen counters and automobiles parked in our driveways, information and communication technologies (ICTs) and IT artifacts are fundamentally changing the ways we interact with our world.…
Assessing Practical Skills in Physics Using Computer Simulations
ERIC Educational Resources Information Center
Walsh, Kevin
2018-01-01
Computer simulations have been used very effectively for many years in the teaching of science but the focus has been on cognitive development. This study, however, is an investigation into the possibility that a student's experimental skills in the real-world environment can be judged via the undertaking of a suitably chosen computer simulation…
ERIC Educational Resources Information Center
Estes, Charles R.
1994-01-01
Discusses theoretical versus applied science and the use of the scientific method for analysis of social issues. Topics addressed include the use of simulation and modeling; the growth in computer power, including nanotechnology; distributed computing; self-evolving programs; spiritual matters; human engineering, i.e., molding individuals;…
2013-12-10
Edward A. Lee, Björn Hartmann. Electrical Engineering and Computer Sciences, University of California at Berkeley, Berkeley, CA 94720. Technical Report No. UCB/EECS-2013-200. …movement. Physical Target Acquisition Study: To understand the accuracy and performance of head-orientation-based selection through our device, we carried…
2013-11-04
Edward A. Lee, Björn Hartmann. Electrical Engineering and Computer Sciences, University of California at Berkeley, Berkeley, CA 94720. Technical Report No. UCB/EECS-2013-182. …To understand the accuracy and performance of head-orientation-based selection through our device, we carried out a comparative target acquisition study, where…
Towards a cyberinfrastructure for the biological sciences: progress, visions and challenges.
Stein, Lincoln D
2008-09-01
Biology is an information-driven science. Large-scale data sets from genomics, physiology, population genetics and imaging are driving research at a dizzying rate. Simultaneously, interdisciplinary collaborations among experimental biologists, theorists, statisticians and computer scientists have become the key to making effective use of these data sets. However, too many biologists have trouble accessing and using these electronic data sets and tools effectively. A 'cyberinfrastructure' is a combination of databases, network protocols and computational services that brings people, information and computational tools together to perform science in this information-driven world. This article reviews the components of a biological cyberinfrastructure, discusses current and pending implementations, and notes the many challenges that lie ahead.
NASA Advanced Supercomputing Facility Expansion
NASA Technical Reports Server (NTRS)
Thigpen, William W.
2017-01-01
The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.
The Future of K-12 Computer Science Instruction
ERIC Educational Resources Information Center
Bottoms, Gene; Sundell, Kirsten
2016-01-01
Children born since the early 1990s have never known a world in which computer and information technologies are not essential to every aspect of their lives. However, far too many young people, especially low-income and minority youth, lack opportunities to learn about the impact of computer and information technologies on their lives and become…
An Ethernet Java Applet for a Course for Non-Majors.
ERIC Educational Resources Information Center
Holliday, Mark A.
1997-01-01
Details the topics of a new course that introduces computing and communication technology to students not majoring in computer science. Discusses the process of developing a Java applet (a program that can be invoked through a World Wide Web browser) that illustrates the protocol used by Ethernet local area networks to determine which computer can…
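The applet described above concerned the medium-access protocol of Ethernet LANs (CSMA/CD with binary exponential backoff). The original applet was written in Java and is not reproduced here; the following Python sketch is an assumed, simplified discrete-slot simulation of that backoff idea, intended only to illustrate what such an applet demonstrates.

```python
import random

def csma_cd_simulation(num_stations=4, num_frames=20, max_backoff_exp=10, seed=1):
    """Toy discrete-slot simulation of Ethernet-style CSMA/CD with binary
    exponential backoff. Illustrative only: real Ethernet also senses the
    carrier, jams on collision, and measures backoff in bit times."""
    random.seed(seed)
    attempts = [0] * num_stations        # consecutive collisions per station
    wait = [0] * num_stations            # slots each station still waits
    delivered = 0
    slot = 0
    while delivered < num_frames:
        ready = [s for s in range(num_stations) if wait[s] == 0]
        if len(ready) == 1:              # exactly one sender: success
            delivered += 1
            attempts[ready[0]] = 0
        elif len(ready) > 1:             # collision: everyone backs off
            for s in ready:
                attempts[s] += 1
                k = min(attempts[s], max_backoff_exp)
                wait[s] = random.randint(0, 2 ** k - 1)
        for s in range(num_stations):    # time advances one slot
            if wait[s] > 0:
                wait[s] -= 1
        slot += 1
    return slot

if __name__ == "__main__":
    print("slots needed to deliver 20 frames:", csma_cd_simulation())
```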
The Influence of an Educational Computer Game on Children's Cultural Identities
ERIC Educational Resources Information Center
Chen, Hsiang-Ping; Lien, Chi-Jui; Annetta, Len; Lu, Yu-Ling
2010-01-01
This study develops an educational computer game, FORmosaHope (FH), to explore the influences that an educational computer game might have on children's cultural identities. FH is a role-playing game, in which children can actively explore a mini-world to learn about science, technology, and society. One hundred and thirty sixth-graders, about…
A Journey from the Sun to the Earth
ERIC Educational Resources Information Center
Psycharis, Sarantos; Daflos, Athanasios
2005-01-01
Computer-aided modelling and investigations can bring the real world into classrooms and facilitate its exploration, in contrast to acquiring factual knowledge from textbooks. Computer modelling puts a whole new "spin" on science education, redefining and reshaping the classroom learning experience. The authors used information and…
Science and Social Science in a World Perspective.
ERIC Educational Resources Information Center
Morrissett, Irving
While notable advances in astronomy, nuclear physics, microbiology, and computer technology seem to contribute to the possibility of human betterment, each of these advances involves hazards, the most ominous being their application to warfare. While considering the wonders and hazards of scientific advance, it is necessary to consider the less…
A social implications of computing course which teaches computer ethics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pulliam, S.C.
1994-12-31
Computers are integral to today's world, forming our society as well as responding to it. In recognition of this interaction, as well as in response to requirements by the Computer Science Accrediting Board (CSAB), many schools are incorporating computer ethics and values and addressing the social implications of computing within their curriculum. The approach discussed here is through a separate course, rather than relying on the integration of specific topics throughout the curriculum.
Building Real World Domain-Specific Social Network Websites as a Capstone Project
ERIC Educational Resources Information Center
Yue, Kwok-Bun; De Silva, Dilhar; Kim, Dan; Aktepe, Mirac; Nagle, Stewart; Boerger, Chris; Jain, Anubha; Verma, Sunny
2009-01-01
This paper describes our experience of using Content Management Software (CMS), specifically Joomla, to build a real world domain-specific social network site (SNS) as a capstone project for graduate information systems and computer science students. As Web 2.0 technologies become increasingly important in driving business application development,…
Special Section: New Ways to Detect Colon Cancer 3-D virtual screening now being used
... two together," recalls Arie Kaufman, chairman of the computer science department at New York's Stony Brook University. Dr. Kaufman is one of the world's leading researchers in the high-tech medical fields of biomedical visualization, computer graphics, virtual reality, and multimedia. The year was ...
How Computer-Assisted Teaching in Physics Can Enhance Student Learning
ERIC Educational Resources Information Center
Karamustafaoglu, O.
2012-01-01
Simple harmonic motion (SHM) is an important topic for physics or science students and has wide applications all over the world. Computer simulations are applications of special interest in physics teaching because they support powerful modeling environments involving physics concepts. This article is aimed to compare the effect of…
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Science, Space and Technology.
This document presents witness testimony and supplemental materials from a Congressional hearing called to evaluate the progress of the High Performance Computing and Communications program in light of budget requests, to examine the appropriate role for the government in such a project, and to see demonstrations of the World Wide Web and related…
2000 FIRST Robotics Competition
NASA Technical Reports Server (NTRS)
Purman, Richard
2000-01-01
The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 2000 FIRST Robotics competition. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.
Hispanic women overcoming deterrents to computer science: A phenomenological study
NASA Astrophysics Data System (ADS)
Herling, Lourdes
The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines, what attracts women and minorities to computer science, and addressing the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determined whether being subjected to multiple marginalizations---female and Hispanic---played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field but also their persistence. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. The aptitudes participants commonly believed are needed for success in computer science are the Twenty-First Century skills of problem solving, creativity, and critical thinking. While not all the participants had experience with computers or programming prior to attending college, experience played a role in the self-confidence of those who did.
Quantum Steganography and Quantum Error-Correction
ERIC Educational Resources Information Center
Shaw, Bilal A.
2010-01-01
Quantum error-correcting codes have been the cornerstone of research in quantum information science (QIS) for more than a decade. Without their conception, quantum computers would be a footnote in the history of science. When researchers embraced the idea that we live in a world where the effects of a noisy environment cannot completely be…
Changing from computing grid to knowledge grid in life-science grid.
Talukdar, Veera; Konar, Amit; Datta, Ayan; Choudhury, Anamika Roy
2009-09-01
Grid computing has a great potential to become a standard cyberinfrastructure for life sciences that often require high-performance computing and large data handling, which exceeds the computing capacity of a single institution. Grid computing applies the resources of many computers in a network to a single problem at the same time. It is useful for scientific problems that require a great number of computer processing cycles or access to a large amount of data. As biologists, we are constantly discovering millions of genes and genome features, which are assembled in a library and distributed on computers around the world. This means that new, innovative methods must be developed that exploit the resources available for extensive calculations - for example, grid computing. This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput, real-world life-science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation for sharing tacit knowledge among a community. By extending the concept of grid from computing grid to knowledge grid, it is possible to make use of a grid as not only sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community.
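The survey's definition of a computing grid, applying many computers to one problem at the same time, reduces to a scatter/gather pattern. The sketch below illustrates that pattern on a single machine with Python's standard concurrent.futures; real grid middleware adds scheduling, data staging, and security on top, and the GC-content task here is a made-up stand-in for a bioinformatics workload.

```python
# Toy "computing grid" pattern: split one problem into independent work
# units, farm them out to a pool of workers, then gather the results.
# A real grid does the same across many machines, with schedulers, data
# staging and authentication layered on top.
from concurrent.futures import ProcessPoolExecutor


def score_sequence(seq):
    """Stand-in for an expensive per-sequence analysis (here, GC content)."""
    gc = sum(base in "GC" for base in seq)
    return seq, gc / len(seq)


if __name__ == "__main__":
    sequences = ["ATGCGC", "TTTTAA", "GGGCCC", "ATATAT", "CGCGAT"]
    with ProcessPoolExecutor() as pool:        # the "grid" of workers
        results = list(pool.map(score_sequence, sequences))
    for seq, gc in results:
        print(f"{seq}: GC fraction {gc:.2f}")
```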
Weighing the Balance of Science Literacy in Education and Public Policy
NASA Astrophysics Data System (ADS)
Buxner, S.; Impey, C.; Johnson, B.
2015-11-01
Science literacy is a concern of educators and policy makers in the United States and all over the world. Science literacy is defined by society and includes important knowledge for individuals that varies with culture and local knowledge systems. The technological societies of the western world have delegated the knowledge that underpins their everyday world to mechanics who know how their cars work, technicians who know how their computers work, and policy wonks who know how their individual choices and actions will affect the environment and their health. The scientific principles that frame and sculpt the technological world are invisible and mysterious to most people. A question for debate is whether or not this is a healthy situation or not, and if not, what to do about it. The panelists shared their prospects and challenges of building science literacy with individuals in the United States and with Tibetan monks. As they discussed their efforts working with these different populations, they shared lessons based on common issues and unique solutions based on local knowledge systems and communities of learners.
Game-Based Virtual Worlds as Decentralized Virtual Activity Systems
NASA Astrophysics Data System (ADS)
Scacchi, Walt
There is widespread interest in the development and use of decentralized systems and virtual world environments as possible new places for engaging in collaborative work activities. Similarly, there is widespread interest in stimulating new technological innovations that enable people to come together through social networking, file/media sharing, and networked multi-player computer game play. A decentralized virtual activity system (DVAS) is a networked computer supported work/play system whose elements and social activities can be both virtual and decentralized (Scacchi et al. 2008b). Massively multi-player online games (MMOGs) such as World of Warcraft and online virtual worlds such as Second Life are each popular examples of a DVAS. Furthermore, these systems are beginning to be used for research, development, and education activities in different science, technology, and engineering domains (Bainbridge 2007, Bohannon et al. 2009; Rieber 2005; Scacchi and Adams 2007; Shaffer 2006), which are also of interest here. This chapter explores two case studies of DVASs developed at the University of California at Irvine that employ game-based virtual worlds to support collaborative work/play activities in different settings. The settings include those that model and simulate practical or imaginative physical worlds in different domains of science, technology, or engineering through alternative virtual worlds where players/workers engage in different kinds of quests or quest-like workflows (Jakobsson 2006).
NASA Astrophysics Data System (ADS)
Pedersen, Morten Gram
2018-03-01
Methods from network theory are increasingly used in research spanning from engineering and computer science to psychology and the social sciences. In this issue, Gosak et al. [1] provide a thorough review of network science applications to biological systems ranging from the subcellular world via neuroscience to ecosystems, with special attention to the insulin-secreting beta-cells in pancreatic islets.
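As a hedged illustration of the kind of network-theoretic description the editorial refers to, the sketch below builds a small-world coupling graph (a structure often used to model beta-cell networks) with networkx and reports a few standard statistics. It is not taken from the reviewed work, and the parameters are arbitrary.

```python
# Toy network analysis of the kind reviewed by Gosak et al.: build a
# small-world style coupling graph and report basic descriptive statistics.
# Illustrative only; node counts and rewiring probability are arbitrary.
import networkx as nx

# 50 "cells", each coupled to 4 neighbours, with 10% of links rewired;
# the connected variant retries until the graph is connected.
G = nx.connected_watts_strogatz_graph(n=50, k=4, p=0.1, seed=42)

mean_degree = sum(dict(G.degree()).values()) / G.number_of_nodes()
print("mean degree:            ", mean_degree)
print("average clustering:     ", nx.average_clustering(G))
print("avg shortest path length:", nx.average_shortest_path_length(G))
```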
A Corpus Investigation on the Journal of Social Sciences of the Turkic World
ERIC Educational Resources Information Center
Yilmaz, Isa
2018-01-01
In recent years, a rapid development in computer technologies has been witnessed and feasibility of data access has been increased. In today's world, restoring documents, or data in general, and transferring them to interested parties are ordinary tasks. The amount of restored documents has also increased expeditiously and this development has…
Global Collective Resources: A Study of Monographic Bibliographic Records in WorldCat.
ERIC Educational Resources Information Center
Perrault, Anna H.
In 2001, WorldCat, the primary international bibliographic utility, contained 45 million records with over 750 million library location listings. These records span over 4,000 years of recorded knowledge in 377 languages. Under the auspices of an OCLC/ALISE (Online Computer Library Center/Association of Library and Information Science Educators)…
Pupil Science Learning in Resource-Based e-Learning Environments
ERIC Educational Resources Information Center
So, Wing-mui Winnie; Ching, Ngai-ying Fiona
2011-01-01
With the rapid expansion of broadband Internet connection and availability of high performance yet low priced computers, many countries around the world are advocating the adoption of e-learning, the use of computer technology to improve learning and teaching. The trend of e-learning has urged many teachers to incorporate online resources in their…
Integrating Computational Thinking into Technology and Engineering Education
ERIC Educational Resources Information Center
Hacker, Michael
2018-01-01
Computational Thinking (CT) is being promoted as "a fundamental skill used by everyone in the world by the middle of the 21st Century" (Wing, 2006). CT has been effectively integrated into history, ELA, mathematics, art, and science courses (Settle, et al., 2012). However, there has been no analogous effort to integrate CT into…
ERIC Educational Resources Information Center
Marty, Paul F.
1999-01-01
Examines the sociotechnological impact of introducing advanced information technology into the Spurlock Museum, a museum of world history and culture at the University of Illinois. Addresses implementation of such methodologies as computer-supported cooperative work and computer-mediated communication in the museum environment. Emphasizes the…
Creation and Development of an Integrated Model of New Technologies and ESP
ERIC Educational Resources Information Center
Garcia Laborda, Jesus
2004-01-01
It seems irrefutable that the world is progressing in concert with computer science. Educational applications and projects for first and second language acquisition have not been left behind. However, currently it seems that the reputation of completely computer-based language learning courses has taken a nosedive and, consequently, there has been…
2 Internet-Savvy Students Help Track Down the Hacker of an NCAA Web Site.
ERIC Educational Resources Information Center
Wanat, Thomas
1997-01-01
A Duke University (North Carolina) student who witnessed vandalism to the National Collegiate Athletic Association's (NCAA) World Wide Web site and a University of Massachusetts, Amherst student, both studying computer science, contributed substantially to the identification of a computer hacker who was destroying the NCAA site. The students' rapid…
Ammann, Alexander
2016-01-01
"Digitality" (as opposed to "digitalization"--the conversion from the analog domain to the digital domain) will open up a whole new world that does not originate from the analog world. Contemporary research in the field of neural concepts and neuromorphic computing systems will lead to convergences between the world of digitality and the world of neuronality, giving the theme "Knowledge and Culture" a new meaning. The simulation of virtual multidimensional and contextual spaces will transform the transfer of knowledge from a uni- and bidirectional process into an interactive experience. We will learn to learn in a ubiquitous computing environment and will abandon conventional curriculum organization principles. The adaptation of individualized ontologies will result in the emergence of a new world of knowledge in which knowledge evolves from a cultural heritage into a commodity.
Trends in life science grid: from computing grid to knowledge grid.
Konagaya, Akihiko
2006-12-18
Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large data handling that exceed the computing capacity of a single institution. This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput, real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation aimed at sharing tacit knowledge within a community. Extending the concept of the grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community.
Trends in life science grid: from computing grid to knowledge grid
Konagaya, Akihiko
2006-01-01
Background Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large data handling that exceed the computing capacity of a single institution. Results This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput, real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation aimed at sharing tacit knowledge within a community. Conclusion Extending the concept of the grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community. PMID:17254294
Referees Often Miss Obvious Errors in Computer and Electronic Publications
NASA Astrophysics Data System (ADS)
de Gloucester, Paul Colin
2013-05-01
Misconduct is extensive and damaging. So-called science is prevalent. Articles resulting from so-called science are often cited in other publications. This can have damaging consequences for society and for science. The present work includes a scientometric study of 350 articles (published by the Association for Computing Machinery; Elsevier; The Institute of Electrical and Electronics Engineers, Inc.; John Wiley; Springer; Taylor & Francis; and World Scientific Publishing Co.). A lower bound of 85.4% of the articles is found to be incongruous. Authors cite inherently self-contradictory articles more than valid articles. Incorrect informational cascades ruin the literature's signal-to-noise ratio even for uncomplicated cases.
Referees often miss obvious errors in computer and electronic publications.
de Gloucester, Paul Colin
2013-01-01
Misconduct is extensive and damaging. So-called science is prevalent. Articles resulting from so-called science are often cited in other publications. This can have damaging consequences for society and for science. The present work includes a scientometric study of 350 articles (published by the Association for Computing Machinery; Elsevier; The Institute of Electrical and Electronics Engineers, Inc.; John Wiley; Springer; Taylor & Francis; and World Scientific Publishing Co.). A lower bound of 85.4% of the articles is found to be incongruous. Authors cite inherently self-contradictory articles more than valid articles. Incorrect informational cascades ruin the literature's signal-to-noise ratio even for uncomplicated cases.
Translations on USSR Science and Technology Physical Sciences and Technology No. 7
1977-02-28
cybernetics. [Answer] Immediately after the war, when the restoration of the national economy, which had been wrecked by the enemy, was started, Soviet...cyberneticization of economics and science will be developed at accelerated rates. 8545 CSO: 1870 CYBERNETICS, COMPUTERS AND AUTOMATION TECHNOLOGY...working storage of the machine exceeds 64 thousand alpha-numeric characters. Communication with the external world is effected by means of a main
Solving Math and Science Problems in the Real World with a Computational Mind
ERIC Educational Resources Information Center
Olabe, Juan Carlos; Basogain, Xabier; Olabe, Miguel Ángel; Maíz, Inmaculada; Castaño, Carlos
2014-01-01
This article presents a new paradigm for the study of Math and Sciences curriculum during primary and secondary education. A workshop for Education undergraduates at four different campuses (n = 242) was designed to introduce participants to the new paradigm. In order to make a qualitative analysis of the current school methodologies in…
Parameter Networks: Towards a Theory of Low-level Vision,
1981-04-01
...models such as those shown in Figure 7 to reorganize origami world figures. Figure 7. To show an example in detail, Kender's technique for...Computer Science Dept, Carnegie-Mellon U., October 1979. Kanade, T., "A theory of Origami world," CMU-CS-78-144, Computer Science Dept, Carnegie
A Living Library: New Model for Global Electronic Interactivity and Networking in the Garden.
ERIC Educational Resources Information Center
Sherk, Bonnie
1995-01-01
Describes the Living Library, an idea to create a network of international cultural parks in different cities of the world using new communications technologies on-line in a garden setting, bringing the humanities, sciences, and social sciences to life through plants, visual and performed artworks, lectures, and computer and on-line satellite…
Issues in undergraduate education in computational science and high performance computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marchioro, T.L. II; Martin, D.
1994-12-31
The ever-increasing need for mathematical and computational literacy within society and among members of the work force has generated enormous pressure to revise and improve the teaching of related subjects throughout the curriculum, particularly at the undergraduate level. The Calculus Reform movement is perhaps the best known example of an organized initiative in this regard. The UCES (Undergraduate Computational Engineering and Science) project, an effort funded by the Department of Energy and administered through the Ames Laboratory, is sponsoring an informal and open discussion of the salient issues confronting efforts to improve and expand the teaching of computational science as a problem-oriented, interdisciplinary approach to scientific investigation. Although the format is open, the authors hope to consider pertinent questions such as: (1) How can faculty and research scientists obtain the recognition necessary to further excellence in teaching the mathematical and computational sciences? (2) What sort of educational resources--both hardware and software--are needed to teach computational science at the undergraduate level? Are traditional procedural languages sufficient? Are PCs enough? Are massively parallel platforms needed? (3) How can electronic educational materials be distributed in an efficient way? Can they be made interactive in nature? How should such materials be tied to the World Wide Web and the growing "Information Superhighway"?
New project to support scientific collaboration electronically
NASA Astrophysics Data System (ADS)
Clauer, C. R.; Rasmussen, C. E.; Niciejewski, R. J.; Killeen, T. L.; Kelly, J. D.; Zambre, Y.; Rosenberg, T. J.; Stauning, P.; Friis-Christensen, E.; Mende, S. B.; Weymouth, T. E.; Prakash, A.; McDaniel, S. E.; Olson, G. M.; Finholt, T. A.; Atkins, D. E.
A new multidisciplinary effort is linking research in the upper atmospheric and space, computer, and behavioral sciences to develop a prototype electronic environment for conducting team science worldwide. A real-world electronic collaboration testbed has been established to support scientific work centered around the experimental operations being conducted with instruments from the Sondrestrom Upper Atmospheric Research Facility in Kangerlussuaq, Greenland. Such group computing environments will become an important component of the National Information Infrastructure initiative, which is envisioned as the high-performance communications infrastructure to support national scientific research.
ERIC Educational Resources Information Center
Her Many Horses, Ian
2016-01-01
The world, and especially our own country, is in dire need of a larger and more diverse population of computer scientists. While many organizations have approached this problem of too few computer scientists in various ways, a promising, and I believe necessary, path is to expose elementary students to authentic practices of the discipline.…
A Long Range Science Rover For Future Mars Missions
NASA Technical Reports Server (NTRS)
Hayati, Samad
1997-01-01
This paper describes the design and implementation currently underway at the Jet Propulsion Laboratory of a long range science rover for future missions to Mars. The small rover prototype, called Rocky 7, is capable of long traverse, autonomous navigation, and science instrument control, carries three science instruments, and can be commanded from any computer platform and any location using the World Wide Web. In this paper we describe the mobility system, the sampling system, the sensor suite, navigation and control, onboard science instruments, and the ground command and control system.
New Horizons Regional Education Center 1999 FIRST Robotics Competition
NASA Technical Reports Server (NTRS)
Purman, Richard I.
1999-01-01
The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 1999 FIRST Robotics competition. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.
New Horizons Regional Education Center 2001 FIRST Robotics Competition
NASA Technical Reports Server (NTRS)
2001-01-01
The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 2001 FIRST Robotics competition. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.
FIRST 2002, 2003, 2004 Robotics Competition(s)
NASA Technical Reports Server (NTRS)
Purman, Richard
2004-01-01
The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 2002, 2003, and 2004 FIRST Robotics Competitions. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.
Translations on USSR Science and Technology Physical Sciences and Technology, Number 44
1978-08-10
COPYRIGHT: UkrNIINTI, 1978 8545 CSO: 1870 30 CYBERNETICS, COMPUTERS, AND AUTOMATION TECHNOLOGY SERIOUS PROBLEMS IN COORDINATING DEVELOPMENT OF...producer of the indispensable amino acid L-lysine. The first plant in the world for the production of a fodder concentrate of lysine was built in...Sciences Faculty of the University of Latvia. During the Great Patriotic War he was a radio operator and military correspondent for the front-line
ERIC Educational Resources Information Center
Mattmann, C. A.; Medvidovic, N.; Malek, S.; Edwards, G.; Banerjee, S.
2012-01-01
As embedded software systems have grown in number, complexity, and importance in the modern world, a corresponding need to teach computer science students how to effectively engineer such systems has arisen. Embedded software systems, such as those that control cell phones, aircraft, and medical equipment, are subject to requirements and…
ERIC Educational Resources Information Center
Cavus, Nadire
2008-01-01
Today, information and communication technologies are developing very rapidly all over the world. These new technologies have taken an important place in education, as in other sciences. For this reason, education has developed in parallel with these new technologies. Departments which cover curriculum of new…
An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center
NASA Astrophysics Data System (ADS)
Gleason, J. L.; Little, M. M.
2013-12-01
NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the Langley Science Directorate needs to be evaluated by integrating it with real-world operational needs across NASA, along with the associated maturity that would come with that. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications have been demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.
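To make the pay-per-use access pattern concrete, the sketch below is an assumption-laden illustration, not the ASDC pilot's actual interface: it uses the boto3 SDK to list and fetch objects from an S3 bucket, with the bucket name, prefix, and key all hypothetical placeholders.

```python
# Minimal sketch of pay-per-use access to shared science data on AWS,
# assuming boto3 is installed and configured with valid credentials.
# The bucket name, prefix, and key are hypothetical, not ASDC resources.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-asdc-pilot-bucket"   # hypothetical
PREFIX = "ceres/level2/"               # hypothetical

resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX, MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Download one granule for local processing (key is hypothetical).
s3.download_file(BUCKET, PREFIX + "granule_0001.nc", "granule_0001.nc")
```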
The progress on time & frequency during the past 5 decades
NASA Astrophysics Data System (ADS)
Wang, Zheng-Ming
2002-06-01
The number and variety of applications using precise timing are astounding and are increasing along with new technology in communication, computer science, space science, and other fields. The world has evolved into the information age, and precise timing is at the heart of managing the flow of that information, which in turn drives rapid progress in precise timing itself. The development of time scales, UT1 determination, frequency standards, time transfer and time dissemination over the past half century, in the world and in China, is described in this paper. Expectations in this field are also discussed.
ERIC Educational Resources Information Center
Loughary, John W.
1977-01-01
Today's world is vastly technological, and counselors need to keep abreast of advances in computer science, biofeedback, and other technical systems. Counseling and technology from a larger perspective define technology as concepts and methods as well as hardware. (Author)
The NASA Science Internet: An integrated approach to networking
NASA Technical Reports Server (NTRS)
Rounds, Fred
1991-01-01
An integrated approach to building a networking infrastructure is an absolute necessity for meeting the multidisciplinary science networking requirements of the Office of Space Science and Applications (OSSA) science community. These networking requirements include communication connectivity between computational resources, databases, and library systems, as well as to other scientists and researchers around the world. A consolidated networking approach allows strategic use of the existing science networking within the Federal government, and it provides networking capability that takes into consideration national and international trends towards multivendor and multiprotocol service. It also offers a practical vehicle for optimizing costs and maximizing performance. Finally, and perhaps most important to the development of high speed computing is that an integrated network constitutes a focus for phasing to the National Research and Education Network (NREN). The NASA Science Internet (NSI) program, established in mid 1988, is structured to provide just such an integrated network. A description of the NSI is presented.
ERIC Educational Resources Information Center
Wilson, Courtney R.; Trautmann, Nancy M.; MaKinster, James G.; Barker, Barbara J.
2010-01-01
A new online tool called "Science Pipes" allows students to conduct biodiversity investigations. With this free tool, students create and run analyses that would otherwise require access to unwieldy data sets and the ability to write computer code. Using these data, students can conduct guided inquiries or hypothesis-driven research to…
Adams, Peter; Goos, Merrilyn
2010-01-01
Modern biological sciences require practitioners to have increasing levels of knowledge, competence, and skills in mathematics and programming. A recent review of the science curriculum at the University of Queensland, a large, research-intensive institution in Australia, resulted in the development of a more quantitatively rigorous undergraduate program. Inspired by the National Research Council's BIO2010 report, a new interdisciplinary first-year course (SCIE1000) was created, incorporating mathematics and computer programming in the context of modern science. In this study, the perceptions of biological science students enrolled in SCIE1000 in 2008 and 2009 are measured. Analysis indicates that, as a result of taking SCIE1000, biological science students gained a positive appreciation of the importance of mathematics in their discipline. However, the data revealed that SCIE1000 did not contribute positively to gains in appreciation for computing and only slightly influenced students' motivation to enroll in upper-level quantitative-based courses. Further comparisons between 2008 and 2009 demonstrated the positive effect of using genuine, real-world contexts to enhance student perceptions toward the relevance of mathematics. The results support the recommendation from BIO2010 that mathematics should be introduced to biology students in first-year courses using real-world examples, while challenging the benefits of introducing programming in first-year courses. PMID:20810961
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckman, P.; Martin, D.; Drugan, C.
2010-11-23
This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core hours of science. The research conducted at their leadership-class facility touched our lives in both minute and massive ways - whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The authors remained true to their vision to act as the forefront computational center in extending science frontiers by solving pressing problems for our nation. Our success in this endeavor was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. The program awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, National Institute of Standards and Technology, and European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid - the ALCF's 557-teraflops IBM Blue Gene/P supercomputer - enabled astounding scientific solutions and discoveries. Intrepid went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts. In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, the DOE-funded initiative will explore which science application programming models work well within the cloud, as well as evaluate the challenges that come with this new paradigm. The ALCF obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012. This system will allow us to resolve ever more pressing problems, even more expeditiously, through breakthrough science in the years to come.
XXV IUPAP Conference on Computational Physics (CCP2013): Preface
NASA Astrophysics Data System (ADS)
2014-05-01
XXV IUPAP Conference on Computational Physics (CCP2013) was held from 20-24 August 2013 at the Russian Academy of Sciences in Moscow, Russia. The annual Conferences on Computational Physics (CCP) present an overview of the most recent developments and opportunities in computational physics across a broad range of topical areas. The CCP series aims to draw computational scientists from around the world and to stimulate interdisciplinary discussion and collaboration by putting together researchers interested in various fields of computational science. It is organized under the auspices of the International Union of Pure and Applied Physics and has been in existence since 1989. The CCP series alternates between Europe, America and Asia-Pacific. The conferences are traditionally supported by European Physical Society and American Physical Society. This year the Conference host was Landau Institute for Theoretical Physics. The Conference contained 142 presentations, and, in particular, 11 plenary talks with comprehensive reviews from airbursts to many-electron systems. We would like to take this opportunity to thank our sponsors: International Union of Pure and Applied Physics (IUPAP), European Physical Society (EPS), Division of Computational Physics of American Physical Society (DCOMP/APS), Russian Foundation for Basic Research, Department of Physical Sciences of Russian Academy of Sciences, RSC Group company. Further conference information and images from the conference are available in the pdf.
NASA Astrophysics Data System (ADS)
Podrasky, A.; Covitt, B. A.; Woessner, W.
2017-12-01
The availability of clean water to support human uses and ecological integrity has become an urgent interest for many scientists, decision makers and citizens. Likewise, as computational capabilities increasingly revolutionize and become integral to the practice of science, technology, engineering and math (STEM) disciplines, the STEM+ Computing (STEM+C) Partnerships program seeks to integrate the use of computational approaches in K-12 STEM teaching and learning. The Comp Hydro project, funded by a STEM+C grant from the National Science Foundation, brings together a diverse team of scientists, educators, professionals and citizens at sites in Arizona, Colorado, Maryland and Montana to foster water literacy, as well as computational science literacy, by integrating authentic, place- and data- based learning using physical, mathematical, computational and conceptual models. This multi-state project is currently engaging four teams of six teachers who work during two academic years with educators and scientists at each site. Teams work to develop instructional units specific to their region that integrate hydrologic science and computational modeling. The units, currently being piloted in high school earth and environmental science classes, provide a classroom context to investigate student understanding of how computation is used in Earth systems science. To develop effective science instruction that is rich in place- and data- based learning, effective collaborations between researchers, educators, scientists, professionals and citizens are crucial. In this poster, we focus on project implementation in Montana, where an instructional unit has been developed and is being tested through collaboration among University scientists, researchers and educators, high school teachers and agency and industry scientists and engineers. In particular, we discuss three characteristics of effective collaborative science education design for developing and implementing place- and data- based science education to support students in developing socio-scientific and computational literacy sufficient for making decisions about real world issues such as groundwater contamination. These characteristics include that science education experiences are real, responsive/accessible and rigorous.
Artificial-life researchers try to create social reality.
Flam, F
1994-08-12
Some scientists, among them cosmologist Stephen Hawking, argue that computer viruses are alive. A better case might be made for many of the self-replicating silicon-based creatures featured at the fourth Conference on Artificial Life, held on 5 to 8 July in Boston. Researchers from computer science, biology, and other disciplines presented computer programs that, among other things, evolved cooperative strategies in a selfish world and recreated themselves in ever more complex forms.
Earth Science Informatics - Overview
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.
2015-01-01
Over the last 10-15 years, significant advances have been made in information management, there are an increasing number of individuals entering the field of information management as it applies to Geoscience and Remote Sensing data, and the field of informatics has come into its own. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of science data, information, and knowledge. Informatics also includes the use of computers and computational methods to support decision making and applications. Earth Science Informatics (ESI, a.k.a. geoinformatics) is the application of informatics in the Earth science domain. ESI is a rapidly developing discipline integrating computer science, information science, and Earth science. Major national and international research and infrastructure projects in ESI have been carried out or are on-going. Notable among these are: the Global Earth Observation System of Systems (GEOSS), the European Commission's INSPIRE, the U.S. NSDI and Geospatial One-Stop, the NASA EOSDIS, and the NSF DataONE, EarthCube and Cyberinfrastructure for Geoinformatics. More than 18 departments and agencies in the U.S. federal government have been active in Earth science informatics. All major space agencies in the world have been involved in ESI research and application activities. In the United States, the Federation of Earth Science Information Partners (ESIP), whose membership includes nearly 150 organizations (government, academic and commercial) dedicated to managing, delivering and applying Earth science data, has been working on many ESI topics since 1998. The Committee on Earth Observation Satellites (CEOS)'s Working Group on Information Systems and Services (WGISS) has been actively coordinating the ESI activities among the space agencies. Keywords: Remote Sensing; Earth Science Informatics; Data Systems; Data Services; Metadata
Simulated Sustainable Societies: Students' Reflections on Creating Future Cities in Computer Games
ERIC Educational Resources Information Center
Nilsson, Elisabet M.; Jakobsson, Anders
2011-01-01
The empirical study, in this article, involved 42 students (ages 14-15), who used the urban simulation computer game SimCity 4 to create models of sustainable future cities. The aim was to explore in what ways the simulated "real" worlds provided by this game could be a potential facilitator for science learning contexts. The topic investigated is…
R&D100 Finalist: Neuromorphic Cyber Microscope
DOE Office of Scientific and Technical Information (OSTI.GOV)
Follett, David; Naegle, John; Suppona, Roger
The Neuromorphic Cyber Microscope provides security analysts with unprecedented visibility into their network, computer, and storage assets. This processor is the world's first practical application of neuromorphic technology to a major computer science mission. Working with Lewis Rhodes Labs, engineers at Sandia National Laboratories have created a device that is orders of magnitude faster at analyzing data to identify cyber-attacks.
Technology Needs for Teachers Web Development and Curriculum Adaptations
NASA Technical Reports Server (NTRS)
Carroll, Christy J.
1999-01-01
Computer-based mathematics and science curricula focusing on NASA inventions and technologies will enhance current teacher knowledge and skills. Materials and interactive software developed by educators will allow students to integrate their various courses, to work cooperatively, and to collaborate with both NASA scientists and students at other locations by using computer networks, email and the World Wide Web.
Computing exponentially faster: implementing a non-deterministic universal Turing machine using DNA
Currin, Andrew; Korovin, Konstantin; Ababi, Maria; Roper, Katherine; Kell, Douglas B.; Day, Philip J.
2017-01-01
The theory of computer science is based around universal Turing machines (UTMs): abstract machines able to execute all possible algorithms. Modern digital computers are physical embodiments of classical UTMs. For the most important class of problem in computer science, non-deterministic polynomial complete problems, non-deterministic UTMs (NUTMs) are theoretically exponentially faster than both classical UTMs and quantum mechanical UTMs (QUTMs). However, no attempt has previously been made to build an NUTM, and their construction has been regarded as impossible. Here, we demonstrate the first physical design of an NUTM. This design is based on Thue string rewriting systems, and thereby avoids the limitations of most previous DNA computing schemes: all the computation is local (simple edits to strings) so there is no need for communication, and there is no need to order operations. The design exploits DNA's ability to replicate to execute an exponential number of computational paths in P time. Each Thue rewriting step is embodied in a DNA edit implemented using a novel combination of polymerase chain reactions and site-directed mutagenesis. We demonstrate that the design works using both computational modelling and in vitro molecular biology experimentation: the design is thermodynamically favourable, microprogramming can be used to encode arbitrary Thue rules, all classes of Thue rule can be implemented, and rule implementation is non-deterministic. In an NUTM, the resource limitation is space, which contrasts with classical UTMs and QUTMs where it is time. This fundamental difference enables an NUTM to trade space for time, which is significant for both theoretical computer science and physics. It is also of practical importance, for to quote Richard Feynman ‘there's plenty of room at the bottom’. This means that a desktop DNA NUTM could potentially utilize more processors than all the electronic computers in the world combined, and thereby outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy. PMID:28250099
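To make the branching behaviour concrete, the toy sketch below (an illustration supplied here, not code from the paper) enumerates the non-deterministic rewrite paths of a tiny Thue system by breadth-first search on a conventional computer; the DNA NUTM described above would instead follow all such paths in parallel through replication. The rules, start string, and target are hypothetical.

```python
# Toy sketch of non-deterministic Thue string rewriting: at each step any
# rule may be applied at any position, so the computation branches into many
# paths. A conventional computer enumerates the branches (breadth-first);
# the proposed DNA NUTM would replicate to explore them in parallel.
# The rewrite rules, start string, and target are hypothetical examples.
from collections import deque

RULES = [("ab", "ba"), ("ba", "ab"), ("aa", "b")]

def successors(word):
    """All words reachable from `word` by one rule application anywhere."""
    out = set()
    for lhs, rhs in RULES:
        pos = word.find(lhs)
        while pos != -1:
            out.add(word[:pos] + rhs + word[pos + len(lhs):])
            pos = word.find(lhs, pos + 1)
    return out

def reachable(start, target, max_steps=6):
    """Breadth-first search over all non-deterministic rewrite paths."""
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        word, depth = frontier.popleft()
        if word == target:
            return True
        if depth < max_steps:
            for nxt in successors(word) - seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return False

print(reachable("aab", "bb"))   # True: rule "aa" -> "b" rewrites aab to bb
```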
Earth Science Informatics - Overview
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.
2017-01-01
Over the last 10-15 years, significant advances have been made in information management, there are an increasing number of individuals entering the field of information management as it applies to Geoscience and Remote Sensing data, and the field of informatics has come into its own. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of science data, information, and knowledge. Informatics also includes the use of computers and computational methods to support decision making and applications. Earth Science Informatics (ESI, a.k.a. geoinformatics) is the application of informatics in the Earth science domain. ESI is a rapidly developing discipline integrating computer science, information science, and Earth science. Major national and international research and infrastructure projects in ESI have been carried out or are on-going. Notable among these are: the Global Earth Observation System of Systems (GEOSS), the European Commission's INSPIRE, the U.S. NSDI and Geospatial One-Stop, the NASA EOSDIS, and the NSF DataONE, EarthCube and Cyberinfrastructure for Geoinformatics. More than 18 departments and agencies in the U.S. federal government have been active in Earth science informatics. All major space agencies in the world have been involved in ESI research and application activities. In the United States, the Federation of Earth Science Information Partners (ESIP), whose membership includes over 180 organizations (government, academic and commercial) dedicated to managing, delivering and applying Earth science data, has been working on many ESI topics since 1998. The Committee on Earth Observation Satellites (CEOS)'s Working Group on Information Systems and Services (WGISS) has been actively coordinating the ESI activities among the space agencies. The talk will present an overview of current efforts in ESI and the role that members of IEEE GRSS play, and will discuss recent developments in data preservation and provenance.
Visions of the Future - the Changing Role of Actors in Data-Intensive Science
NASA Astrophysics Data System (ADS)
Schäfer, L.; Klump, J. F.
2013-12-01
Around the world scientific disciplines are increasingly facing the challenge of a burgeoning volume of research data. This data avalanche consists of a stream of information generated from sensors and scientific instruments, digital recordings, social-science surveys or drawn from the World Wide Web. All areas of the scientific economy are affected by this rapid growth in data, from the logging of digs in Archaeology, telescope data with observations of distant galaxies in Astrophysics or data from polls and surveys in the Social Sciences. The challenge for science is not only to process the data through analysis, reduction and visualization, but also to set up infrastructures for provisioning and storing the data. The rise of new technologies and developments also poses new challenges for the actors in the area of research data infrastructures. Libraries, as one of the actors, enable access to digital media and support the publication of research data and its long-term archiving. Digital media and research data, however, introduce new aspects into the libraries' range of activities. How are we to imagine the library of the future? The library as an interface to the computer centers? Will library and computer center fuse into a new service unit? What role will scientific publishers play in future? Currently the traditional form of publication still carries greater weight - articles for conferences and journals. But will this still be the case in future? New forms of publication are already making their presence felt. The tasks of the computer centers may also change. Yesterday their remit was provisioning of fast hardware, whereas now everything revolves around the topic of data and services. Finally, how about the researchers themselves? Not such a long time ago, Geoscience was not necessarily seen as linked to Computer Science. Nowadays, modern Geoscience relies heavily on IT and its techniques. Thus, to what extent will the profile of the modern geoscientist change? This gives rise to the question of what tools are required to locate and pursue the correct course in a networked world. One tool from the area of innovation management is the scenario technique. This poster will outline visions of the future as possible developments of the scientific world in 2020 (or later). The scenarios presented will show possible developments - both positive and negative. It is then up to the actors themselves to define their own position in this context, to rethink it and consider steps that can achieve a positive development for the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertsch, Adam; Draeger, Erik; Richards, David
2017-01-12
With Sequoia at Lawrence Livermore National Laboratory, researchers explore grand challenge problems and are generating results at scales never before achieved. Sequoia is the first computer to have more than one million processors and is one of the fastest supercomputers in the world.
In Praise of Numerical Computation
NASA Astrophysics Data System (ADS)
Yap, Chee K.
Theoretical Computer Science has developed an almost exclusively discrete/algebraic persona. We have effectively shut ourselves off from half of the world of computing: a host of problems in Computational Science & Engineering (CS&E) are defined on the continuum, and, for them, the discrete viewpoint is inadequate. The computational techniques in such problems are well known to numerical analysis and applied mathematics, but are rarely discussed in theoretical algorithms: iteration, subdivision and approximation. By various case studies, I will indicate how our discrete/algebraic view of computing has many shortcomings in CS&E. We want to embrace the continuous/analytic view, but in a new synthesis with the discrete/algebraic view. I will suggest a pathway, by way of an exact numerical model of computation, that allows us to incorporate iteration and approximation into our algorithms’ design. Some recent results give a peek into what this view of algorithmic development might look like, and its distinctive form suggests the name “numerical computational geometry” for such activities.
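As a generic illustration of the iteration-subdivision-approximation triad named above (an example supplied here, not taken from the talk), bisection repeatedly subdivides an interval that brackets a root and stops once the approximation meets a guaranteed error bound.

```python
# Illustrative sketch of the iterate/subdivide/approximate triad: bisection
# subdivides a bracketing interval until the approximation error is bounded.
# The function and tolerance are arbitrary examples.
def bisect(f, lo, hi, tol=1e-12):
    assert f(lo) * f(hi) < 0, "endpoints must bracket a root"
    while hi - lo > tol:             # iterate until the bracket is tight enough
        mid = 0.5 * (lo + hi)        # subdivide the interval
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)           # approximation with error at most tol/2

print(bisect(lambda x: x * x - 2.0, 1.0, 2.0))   # ~1.414213..., i.e. sqrt(2)
```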
The International Symposium on Grids and Clouds
NASA Astrophysics Data System (ADS)
The International Symposium on Grids and Clouds (ISGC) 2012 will be held at Academia Sinica in Taipei from 26 February to 2 March 2012, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). 2012 is the decennium anniversary of the ISGC which over the last decade has tracked the convergence, collaboration and innovation of individual researchers across the Asia Pacific region to a coherent community. With the continuous support and dedication from the delegates, ISGC has provided the primary international distributed computing platform where distinguished researchers and collaboration partners from around the world share their knowledge and experiences. The last decade has seen the wide-scale emergence of e-Infrastructure as a critical asset for the modern e-Scientist. The emergence of large-scale research infrastructures and instruments that has produced a torrent of electronic data is forcing a generational change in the scientific process and the mechanisms used to analyse the resulting data deluge. No longer can the processing of these vast amounts of data and production of relevant scientific results be undertaken by a single scientist. Virtual Research Communities that span organisations around the world, through an integrated digital infrastructure that connects the trust and administrative domains of multiple resource providers, have become critical in supporting these analyses. Topics covered in ISGC 2012 include: High Energy Physics, Biomedicine & Life Sciences, Earth Science, Environmental Changes and Natural Disaster Mitigation, Humanities & Social Sciences, Operations & Management, Middleware & Interoperability, Security and Networking, Infrastructure Clouds & Virtualisation, Business Models & Sustainability, Data Management, Distributed Volunteer & Desktop Grid Computing, High Throughput Computing, and High Performance, Manycore & GPU Computing.
NASA Astrophysics Data System (ADS)
Moore, S. L.; Kar, A.; Gomez, R.
2015-12-01
A partnership between Fort Valley State University (FVSU), the Jackson School of Geosciences at The University of Texas (UT) at Austin, and the Texas Advanced Computing Center (TACC) is engaging computational geoscience faculty and researchers with academically talented underrepresented minority (URM) students, training them to solve grand challenges. These next-generation computational geoscientists are being trained to solve some of the world's most challenging geoscience grand challenges, which require data-intensive, large-scale modeling and simulation on high-performance computers. UT Austin's geoscience outreach program GeoFORCE, recently awarded the Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring, contributes to the collaborative best practices in engaging researchers with URM students. Collaborative efforts over the past decade are providing data demonstrating that integrative pipeline programs with mentoring and paid internship opportunities, multi-year scholarships, computational training, and communication skills development are having an impact on URMs developing middle skills for geoscience careers. Since 1997, the Cooperative Developmental Energy Program at FVSU and its collaborating universities have graduated 87 engineers, 33 geoscientists, and eight health physicists. Recruited as early as high school, students enroll for three years at FVSU majoring in mathematics, chemistry or biology, and then transfer to UT Austin or other partner institutions to complete a second STEM degree, including geosciences. A partnership with the Integrative Computational Education and Research Traineeship (ICERT), a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at TACC, provides students with a 10-week summer research experience at UT Austin. Mentored by TACC researchers, students with no previous background in computational science learn to use some of the world's most powerful high-performance computing resources to address a grand geosciences problem. Students increase their ability to understand and explain the societal impact of their research and communicate the research to multidisciplinary and lay audiences via near-peer mentoring, poster presentations, and publication opportunities.
Accelerating scientific discovery : 2007 annual report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckman, P.; Dave, P.; Drugan, C.
2008-11-14
As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. Sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer at the ALCF to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California-San Diego/SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is the secret to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance to those using the Blue Gene/L and optimizing user applications. Both the Catalyst and Applications Performance Engineering and Data Analytics (APEDA) teams support the users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed. Our expertise in high-end scientific computing enables us to provide guidance for applications that are transitioning to petascale as well as to produce software that facilitates their development, such as the MPICH library, which provides a portable and efficient implementation of the MPI standard--the prevalent programming model for large-scale scientific applications--and the PETSc toolkit, which provides a programming paradigm that eases the development of many scientific applications on high-end computers.
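The report's reference to MPICH concerns the MPI standard for message passing between processes. As an illustration of that programming model only (not ALCF application code), here is a minimal sketch using the mpi4py Python bindings; the partial-sum workload is a made-up example.

```python
# Minimal sketch of the MPI programming model referred to above, using the
# mpi4py bindings (illustrative only). Run with: mpiexec -n 4 python mpi_sum.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()          # this process's id
size = comm.Get_size()          # total number of processes

# Each rank computes a partial sum; reduce combines the results on rank 0.
local = sum(range(rank, 1000, size))
total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print("sum of 0..999 =", total)   # 499500
```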
NASA Astrophysics Data System (ADS)
Wang, Jianxiong
2014-06-01
This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing and scientific collaboration stimulated further reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF
Cybersecurity Education for Military Officers
2017-12-01
lecture showed the math behind the possible combinations of passwords of different lengths, and made the recommendation to increase your password to...2. Match the system to the real world: Use of effective metaphors and real-world language wherever possible. 3. User Control: Try to give the user...given any training on this topic outside of annual NKO courses. I was a math major for my undergraduate degree, so I have no computer science
Get immersed in the Soil Sciences: the first community of avatars in the EGU Assembly 2015!
NASA Astrophysics Data System (ADS)
Castillo, Sebastian; Alarcón, Purificación; Beato, Mamen; Emilio Guerrero, José; José Martínez, Juan; Pérez, Cristina; Ortiz, Leovigilda; Taguas, Encarnación V.
2015-04-01
Virtual reality and immersive worlds refer to artificial computer-generated environments, with which users act and interact as in a known environment through the use of figurative virtual individuals (avatars). Virtual environments will be the technology of the early twenty-first century that will most dramatically change the way we live, particularly in the areas of training and education, product development and entertainment (Schmorrow, 2009). The usefulness of immersive worlds has been proved in different fields. They reduce geographic and social barriers between different stakeholders and create virtual social spaces which can positively impact learning and discussion outcomes (Lorenzo et al. 2012). In this work we present a series of interactive meetings in a virtual building, held to celebrate the International Year of Soils and to promote the importance of soil functions and soil conservation. In a virtual room, the avatars of senior researchers will meet the avatars of young scientists to talk about: 1) what remains to be done in Soil Sciences; 2) the main current limitations and difficulties; and 3) the future hot research lines. Interactive participation does not require physically attending the EGU Assembly 2015. In addition, this virtual building inspired by the Soil Sciences can be completed with different teaching resources from different locations around the world, and it will be used to improve the learning of Soil Sciences in a multicultural context. REFERENCES: Lorenzo C.M., Sicilia, M.A., Sánchez S. 2012. Studying the effectiveness of multi-user immersive environments for collaborative evaluation tasks. Computers & Education 59 (2012) 1361-1376. Schmorrow D.D. 2009. "Why virtual?" Theoretical Issues in Ergonomics Science 10(3): 279-282.
Agile science: creating useful products for behavior change in the real world.
Hekler, Eric B; Klasnja, Predrag; Riley, William T; Buman, Matthew P; Huberty, Jennifer; Rivera, Daniel E; Martin, Cesar A
2016-06-01
Evidence-based practice is important for behavioral interventions but there is debate on how best to support real-world behavior change. The purpose of this paper is to define products and a preliminary process for efficiently and adaptively creating and curating a knowledge base for behavior change for real-world implementation. We look to evidence-based practice suggestions and draw parallels to software development. We argue for targeting three products: (1) the smallest, meaningful, self-contained, and repurposable behavior change modules of an intervention; (2) "computational models" that define the interaction between modules, individuals, and context; and (3) "personalization" algorithms, which are decision rules for intervention adaptation. The "agile science" process includes a generation phase, whereby contender operational definitions and constructs of the three products are created and assessed for feasibility, and an evaluation phase, whereby effect size estimates/causal inferences are created. The process emphasizes early-and-often sharing. If correct, agile science could enable a more robust knowledge base for behavior change.
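The third product above, "personalization" algorithms as decision rules for intervention adaptation, can be illustrated with a minimal sketch. The state variables, thresholds, and module names below are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of a "personalization" decision rule that maps observed state to a
# self-contained behavior-change module, in the spirit of the three products above.
# All variable names, thresholds, and module names are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class DailyState:
    steps_yesterday: int      # observed behavior
    reported_stress: float    # 0 (low) .. 1 (high), self-reported context

def choose_module(state: DailyState, step_goal: int = 8000) -> str:
    """Decision rule selecting which intervention module to deliver today."""
    if state.steps_yesterday >= step_goal:
        return "positive_reinforcement_message"
    if state.reported_stress > 0.7:
        return "stress_management_module"
    return "goal_setting_prompt"

print(choose_module(DailyState(steps_yesterday=4200, reported_stress=0.3)))
```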
NASA Astrophysics Data System (ADS)
Mezzacappa, Anthony
2005-01-01
On 26-30 June 2005 at the Grand Hyatt on Union Square in San Francisco several hundred computational scientists from around the world came together for what can certainly be described as a celebration of computational science. Scientists from the SciDAC Program and scientists from other agencies and nations were joined by applied mathematicians and computer scientists to highlight the many successes in the past year where computation has led to scientific discovery in a variety of fields: lattice quantum chromodynamics, accelerator modeling, chemistry, biology, materials science, Earth and climate science, astrophysics, and combustion and fusion energy science. Also highlighted were the advances in numerical methods and computer science, and the multidisciplinary collaboration cutting across science, mathematics, and computer science that enabled these discoveries. The SciDAC Program was conceived and funded by the US Department of Energy Office of Science. It is the Office of Science's premier computational science program founded on what is arguably the perfect formula: the priority and focus is science and scientific discovery, with the understanding that the full arsenal of `enabling technologies' in applied mathematics and computer science must be brought to bear if we are to have any hope of attacking and ultimately solving today's computational Grand Challenge problems. The SciDAC Program has been in existence for four years, and many of the computational scientists funded by this program will tell you that the program has given them the hope of addressing their scientific problems in full realism for the very first time. Many of these scientists will also tell you that SciDAC has also fundamentally changed the way they do computational science. We begin this volume with one of DOE's great traditions, and core missions: energy research. As we will see, computation has been seminal to the critical advances that have been made in this arena. Of course, to understand our world, whether it is to understand its very nature or to understand it so as to control it for practical application, will require explorations on all of its scales. Computational science has been no less an important tool in this arena than it has been in the arena of energy research. From explorations of quantum chromodynamics, the fundamental theory that describes how quarks make up the protons and neutrons of which we are composed, to explorations of the complex biomolecules that are the building blocks of life, to explorations of some of the most violent phenomena in our universe and of the Universe itself, computation has provided not only significant insight, but often the only means by which we have been able to explore these complex, multicomponent systems and by which we have been able to achieve scientific discovery and understanding. While our ultimate target remains scientific discovery, it certainly can be said that at a fundamental level the world is mathematical. Equations ultimately govern the evolution of the systems of interest to us, be they physical, chemical, or biological systems. The development and choice of discretizations of these underlying equations is often a critical deciding factor in whether or not one is able to model such systems stably, faithfully, and practically, and in turn, the algorithms to solve the resultant discrete equations are the complementary, critical ingredient in the recipe to model the natural world. 
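As a concrete illustration of the preceding point about discretizations and solution algorithms, the following minimal sketch, not drawn from any SciDAC code, discretizes a simple ordinary differential equation with the forward Euler method; the equation, step sizes, and interval are chosen purely for illustration.

```python
# Minimal sketch: forward Euler discretization of dy/dt = -2*y, y(0) = 1.
# The exact solution is y(t) = exp(-2*t); varying the step size dt illustrates how
# the choice of discretization controls whether the scheme is stable and faithful.

import math

def euler(dt: float, t_end: float = 2.0) -> float:
    y, t = 1.0, 0.0
    while t < t_end:
        y += dt * (-2.0 * y)   # discrete update replacing the continuous derivative
        t += dt
    return y

exact = math.exp(-2.0 * 2.0)
for dt in (0.5, 0.1, 0.01):
    approx = euler(dt)
    print(f"dt={dt}: euler={approx:.5f}, exact={exact:.5f}, error={abs(approx - exact):.2e}")
```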
The use of parallel computing platforms, especially at the TeraScale, and the trend toward even larger numbers of processors, continue to present significant challenges in the development and implementation of these algorithms. Computational scientists often speak of their `workflows'. A workflow, as the name suggests, is the sum total of all complex and interlocking tasks, from simulation setup, execution, and I/O, to visualization and scientific discovery, through which the advancement in our understanding of the natural world is realized. For the computational scientist, enabling such workflows presents myriad significant challenges, and it is computer scientists that are called upon at such times to address these challenges. Simulations are currently generating data at the staggering rate of tens of TeraBytes per simulation, over the course of days. In the next few years, these data generation rates are expected to climb exponentially to hundreds of TeraBytes per simulation, performed over the course of months. The output, management, movement, analysis, and visualization of these data will be our key to unlocking the scientific discoveries buried within the data. And there is no hope of generating such data to begin with, or of scientific discovery, without stable computing platforms and a sufficiently high and sustained performance of scientific applications codes on them. Thus, scientific discovery in the realm of computational science at the TeraScale and beyond will occur at the intersection of science, applied mathematics, and computer science. The SciDAC Program was constructed to mirror this reality, and the pages that follow are a testament to the efficacy of such an approach. We would like to acknowledge the individuals on whose talents and efforts the success of SciDAC 2005 was based. Special thanks go to Betsy Riley for her work on the SciDAC 2005 Web site and meeting agenda, for lining up our corporate sponsors, for coordinating all media communications, and for her efforts in processing the proceedings contributions, to Sherry Hempfling for coordinating the overall SciDAC 2005 meeting planning, for handling a significant share of its associated communications, and for coordinating with the ORNL Conference Center and Grand Hyatt, to Angela Harris for producing many of the documents and records on which our meeting planning was based and for her efforts in coordinating with ORNL Graphics Services, to Angie Beach of the ORNL Conference Center for her efforts in procurement and setting up and executing the contracts with the hotel, and to John Bui and John Smith for their superb wireless networking and A/V setup and support. We are grateful for the relentless efforts of all of these individuals, their remarkable talents, and for the joy of working with them during this past year. They were the cornerstones of SciDAC 2005. Thanks also go to Kymba A'Hearn and Patty Boyd for on-site registration, Brittany Hagen for administrative support, Bruce Johnston for netcast support, Tim Jones for help with the proceedings and Web site, Sherry Lamb for housing and registration, Cindy Lathum for Web site design, Carolyn Peters for on-site registration, and Dami Rich for graphic design. And we would like to express our appreciation to the Oak Ridge National Laboratory, especially Jeff Nichols, the Argonne National Laboratory, the Lawrence Berkeley National Laboratory, and to our corporate sponsors, Cray, IBM, Intel, and SGI, for their support.
We would like to extend special thanks also to our plenary speakers, technical speakers, poster presenters, and panelists for all of their efforts on behalf of SciDAC 2005 and for their remarkable achievements and contributions. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas and Margaret Smith of Institute of Physics Publishing, who worked tirelessly in order to provide us with this finished volume within two months, which is nothing short of miraculous. Finally, we wish to express our heartfelt thanks to Michael Strayer, SciDAC Director, whose vision it was to focus SciDAC 2005 on scientific discovery, around which all of the excitement we experienced revolved, and to our DOE SciDAC program managers, especially Fred Johnson, for their support, input, and help throughout.
An Extensible NetLogo Model for Visualizing Message Routing Protocols
2017-08-01
the hard sciences to the social sciences to computer-generated art. NetLogo represents the world as a set of ... describe the model is shown here; for the supporting methods, refer to the source code. Approved for public release; distribution is unlimited. ... if ticks - last-inject > time-to-inject [inject] if run# > #runs [stop] end ... Next, we present some basic statistics collected for the
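The NetLogo fragment quoted above suggests a tick-driven loop that periodically injects messages and stops after a fixed number of runs. The following Python sketch mirrors that control logic only; it is not the model's actual source code, and all names and parameter values are assumptions.

```python
# Sketch of the control flow implied by the quoted NetLogo fragment: inject a
# message whenever enough ticks have elapsed, and stop after a fixed number of runs.
# This mirrors the loop structure only; it is not the actual NetLogo model.

TIME_TO_INJECT = 10   # ticks between message injections (assumed value)
MAX_RUNS = 3          # corresponds to #runs in the fragment (assumed value)

def simulate(ticks_per_run: int = 50) -> None:
    run_number, injected, last_inject = 1, 0, 0
    for tick in range(1, MAX_RUNS * ticks_per_run + 1):
        if tick - last_inject > TIME_TO_INJECT:
            injected += 1            # "inject" a message into the simulated network
            last_inject = tick
        if tick % ticks_per_run == 0:
            run_number += 1
        if run_number > MAX_RUNS:    # analogue of "if run# > #runs [stop]"
            break
    print(f"injected {injected} messages over {run_number - 1} runs")

simulate()
```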
Computer and Internet use by home care and hospice agencies.
Long, C O; Greenberg, E A; Ismeurt, R L; Smith, G
2000-01-01
Nurses in home healthcare and hospice are embracing the advances in computer science and technology to provide an edge in administration and clinical practice. Of concern to nurse managers is the extent to which personal computers and the Internet have been used in home healthcare and hospice, and what information, opportunities, and needs related to education are on the horizon. This article discusses the results of a national survey conducted exclusively on the World Wide Web to answer these questions.
The study of early human embryos using interactive 3-dimensional computer reconstructions.
Scarborough, J; Aiton, J F; McLachlan, J C; Smart, S D; Whiten, S C
1997-07-01
Tracings of serial histological sections from 4 human embryos at different Carnegie stages were used to create 3-dimensional (3D) computer models of the developing heart. The models were constructed using commercially available software developed for graphic design and the production of computer generated virtual reality environments. They are available as interactive objects which can be downloaded via the World Wide Web. This simple method of 3D reconstruction offers significant advantages for understanding important events in morphological sciences.
ERIC Educational Resources Information Center
Thomson, Norman; Chapman, Seri
2004-01-01
The Virtual Gorilla Modeling Project--a professional development project--is a collaboration of middle and high school inservice teachers, Zoo Atlanta primatologists, science and computer educators, and students. During a 10-day professional development summer workshop, middle and high school teachers explore the world of the gorilla through…
Fields, Chris
2015-12-01
Does perception hide the truth? Information theory, computer science, and quantum theory all suggest that the answer is "yes." They suggest, indeed, that useful perception is only feasible because the truth can be hidden.
Maintaining Privacy in Pervasive Computing - Enabling Acceptance of Sensor-based Services
NASA Astrophysics Data System (ADS)
Soppera, A.; Burbridge, T.
During the 1980s, Mark Weiser [1] predicted a world in which computing was so pervasive that devices embedded in the environment could sense their relationship to us and to each other. These tiny ubiquitous devices would continually feed information from the physical world into the information world. Twenty years ago, this vision was the exclusive territory of academic computer scientists and science fiction writers. Today this subject has become of interest to business, government, and society. Governmental authorities exercise their power through the networked environment. Credit card databases maintain our credit history and decide whether we are allowed to rent a house or obtain a loan. Mobile telephones can locate us in real time so that we do not miss calls. Within another 10 years, all sorts of devices will be connected through the network. Our fridge, our food, together with our health information, may all be networked for the purpose of maintaining diet and well-being. The Internet will move from being an infrastructure to connect computers, to being an infrastructure to connect everything [2, 3].
Computational Physics in a Nutshell
NASA Astrophysics Data System (ADS)
Schillaci, Michael
2001-11-01
Too often students of science are expected to ``pick up'' what they need to know about the Art of Science. A description of the two-semester Computational Physics course being taught by the author offers a remedy to this situation. The course teaches students the three pillars of modern scientific research: Problem Solving, Programming, and Presentation. Using FORTRAN, LaTeX2e, Maple V, HTML, and Java, students learn the fundamentals of algorithm development, how to implement classes and packages written by others, how to produce publication-quality graphics and documents, and how to publish them on the World Wide Web. The course content is outlined and project examples are offered.
Critical thinking traits of top-tier experts and implications for computer science education
NASA Astrophysics Data System (ADS)
Bushey, Dean E.
A documented shortage of technical leadership and top-tier performers in computer science jeopardizes the technological edge, security, and economic well-being of the nation. The 2005 President's Information Technology Advisory Committee (PITAC) Report on competitiveness in computational sciences highlights the major impact of science, technology, and innovation in keeping America competitive in the global marketplace. It stresses the fact that the supply of science, technology, and engineering experts is at the core of America's technological edge, national competitiveness and security. However, recent data shows that both undergraduate and postgraduate production of computer scientists is falling. The decline is "a quiet crisis building in the United States," a crisis that, if allowed to continue unchecked, could endanger America's well-being and preeminence among the world's nations. Past research on expert performance has shown that the cognitive traits of critical thinking, creativity, and problem solving possessed by top-tier performers can be identified, observed and measured. The studies show that the identified attributes are applicable across many domains and disciplines. Companies have begun to realize that cognitive skills are important for high-level performance and are reevaluating the traditional academic standards they have used to predict success for their top-tier performers in computer science. Previous research in the computer science field has focused either on the programming skills of its experts or on predicting the academic success of students at the undergraduate level. This study, on the other hand, examines the critical-thinking skills found among experts in the computer science field in order to explore the questions, "What cognitive skills do outstanding performers possess that make them successful?" and "How do currently used measures of academic performance correlate to critical-thinking skills among students?" The results of this study suggest a need to examine how critical-thinking abilities are learned in the undergraduate computer science curriculum and the need to foster these abilities in order to produce the high-level, critical-thinking professionals necessary to fill the growing need for these experts. Because current measures of academic performance do not adequately depict students' cognitive abilities, assessment of these skills must be incorporated into existing curricula.
Leveraging e-Science infrastructure for electrochemical research.
Peachey, Tom; Mashkina, Elena; Lee, Chong-Yong; Enticott, Colin; Abramson, David; Bond, Alan M; Elton, Darrell; Gavaghan, David J; Stevenson, Gareth P; Kennedy, Gareth F
2011-08-28
As in many scientific disciplines, modern chemistry involves a mix of experimentation and computer-supported theory. Historically, these skills have been provided by different groups, and range from traditional 'wet' laboratory science to advanced numerical simulation. Increasingly, progress is made by global collaborations, in which new theory may be developed in one part of the world and applied and tested in the laboratory elsewhere. e-Science, or cyber-infrastructure, underpins such collaborations by providing a unified platform for accessing scientific instruments, computers and data archives, and collaboration tools. In this paper we discuss the application of advanced e-Science software tools to electrochemistry research performed in three different laboratories--two at Monash University in Australia and one at the University of Oxford in the UK. We show that software tools that were originally developed for a range of application domains can be applied to electrochemical problems, in particular Fourier voltammetry. Moreover, we show that, by replacing ad-hoc manual processes with e-Science tools, we obtain more accurate solutions automatically.
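The paper's Fourier voltammetry workflow is not reproduced here, but the basic idea, transforming a measured current signal into the frequency domain to separate its harmonic components, can be sketched with standard tools. The synthetic signal and its parameters below are assumptions for illustration, not data from the cited experiments.

```python
# Hedged sketch of the Fourier step in Fourier voltammetry: take a measured (here,
# synthetic) current trace and inspect its harmonic content with an FFT.
# Waveform parameters are illustrative only, not taken from the cited laboratories.

import numpy as np

fs = 1000.0                                # sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)            # 2 seconds of data
f0 = 9.0                                   # fundamental AC perturbation frequency, Hz (assumed)
# Synthetic current: fundamental plus a weak second harmonic and a little noise.
current = (1.0 * np.sin(2 * np.pi * f0 * t)
           + 0.1 * np.sin(2 * np.pi * 2 * f0 * t)
           + 0.02 * np.random.default_rng(0).normal(size=t.size))

spectrum = np.fft.rfft(current)
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

for harmonic in (1, 2):
    idx = np.argmin(np.abs(freqs - harmonic * f0))
    print(f"harmonic {harmonic} ({freqs[idx]:.1f} Hz): |amplitude| = {np.abs(spectrum[idx]):.1f}")
```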
A Visit to the Computer Science Department,
1983-01-11
very small, its capabilities are not. Let's take the F8 micro-computer and compare it to the world's first computer, "Eniac", for a minute. Eniac was ... than 30 tons, and completely filled a room of 170 square meters. The F8 micro-computer, on the other hand, has a volume 1/30,000th of Eniac's, weighs ... less than half a kilo, has a power uptake of only 2.5 watts, but is 20 times as fast as Eniac and more than 10,000 times as reliable. "From this we can
Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility
NASA Astrophysics Data System (ADS)
Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.
2014-12-01
The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009); and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current generation Petascale capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.
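ECMWF's approach uses Fortran2008 coarrays; as a language-agnostic illustration of the underlying idea, overlapping communication with computation, here is a small Python threading sketch. It is not the IFS or coarray implementation, and the timings, names, and workloads are placeholders.

```python
# Illustration of overlapping communication with computation, the idea behind the
# coarray work described above. This is not the IFS/coarray implementation itself;
# the "halo exchange" and "interior computation" below are stand-ins.

import threading
import time

def exchange_halo(buffer: list) -> None:
    """Stand-in for a neighbour data exchange (e.g. a halo/boundary swap)."""
    time.sleep(0.2)              # pretend network transfer time
    buffer.append("halo data")

def compute_interior() -> int:
    """Stand-in for work on interior points that needs no remote data."""
    return sum(i * i for i in range(1_000_000))

halo: list = []
comm = threading.Thread(target=exchange_halo, args=(halo,))
comm.start()                          # start the communication ...
interior_result = compute_interior()  # ... and compute the interior meanwhile
comm.join()                           # wait for the halo before touching boundary points

print(f"interior sum = {interior_result}, halo received: {bool(halo)}")
```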
Teaching Hypertext and Hypermedia through the Web.
ERIC Educational Resources Information Center
de Bra, Paul M. E.
This paper describes a World Wide Web-based introductory course titled "Hypermedia Structures and Systems," offered as an optional part of the curriculum in computing science at the Eindhoven University of Technology (Netherlands). The technical environment for the current (1996) edition of the course is presented, which features…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crabtree, George; Glotzer, Sharon; McCurdy, Bill
This report is based on an SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness.
The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard; Hack, James; Riley, Katherine
The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today's world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR's mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today's tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.
Working Towards New Transformative Geoscience Analytics Enabled by Petascale Computing
NASA Astrophysics Data System (ADS)
Woodcock, R.; Wyborn, L.
2012-04-01
Currently the top 10 supercomputers in the world are petascale and already exascale computers are being planned. Cloud computing facilities are becoming mainstream either as private or commercial investments. These computational developments will provide abundant opportunities for the earth science community to tackle the data deluge which has resulted from new instrumentation enabling data to be gathered at a greater rate and at higher resolution. Combined, the new computational environments should enable the earth sciences to be transformed. However, experience in Australia and elsewhere has shown that it is not easy to scale existing earth science methods, software and analytics to take advantage of the increased computational capacity that is now available. It is not simply a matter of 'transferring' current work practices to the new facilities: they have to be extensively 'transformed'. In particular, new geoscientific methods will need to be developed using advanced data mining, assimilation, machine learning and integration algorithms. Software will have to be capable of operating in highly parallelised environments, and will also need to be able to scale as the compute systems grow. Data access will have to improve, and the earth science community needs to move from the file discovery, display and then locally download paradigm to self-describing data cubes and data arrays that are available as online resources from either major data repositories or in the cloud. In the new transformed world, rather than analysing satellite data scene by scene, sensor-agnostic data cubes of calibrated earth observation data will enable researchers to move across data from multiple sensors at varying spatial data resolutions. In using geophysics to characterise basement and cover, rather than analysing individual gridded airborne geophysical data sets, and then combining the results, petascale computing will enable analysis of multiple data types, collected at varying resolutions, with integration and validation across data type boundaries. Increased capacity of storage and compute will mean that uncertainty and reliability of individual observations will consistently be taken into account and propagated throughout the processing chain. If these data access difficulties can be overcome, the increased compute capacity will also mean that larger scale, more complex models can be run at higher resolution and, instead of single-pass modelling runs, ensembles of models will be able to be run to simultaneously test multiple hypotheses. Petascale computing and high performance data offer more than "bigger, faster": it is an opportunity for a transformative change in the way in which geoscience research is routinely conducted.
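As an illustration of the "self-describing data cube" access pattern described above, in contrast to scene-by-scene file download, here is a small sketch using xarray; the dimensions, coordinates, and values are invented for illustration and do not come from any particular repository.

```python
# Sketch of the "data cube" access pattern: a self-describing array indexed by
# time, latitude, and longitude that can be sliced directly, instead of being
# reassembled scene by scene from downloaded files. All values are synthetic.

import numpy as np
import xarray as xr

times = np.arange("2012-01", "2012-04", dtype="datetime64[M]")
lats = np.linspace(-40.0, -10.0, 4)
lons = np.linspace(110.0, 155.0, 5)

cube = xr.DataArray(
    np.random.default_rng(0).random((times.size, lats.size, lons.size)),
    coords={"time": times, "lat": lats, "lon": lons},
    dims=("time", "lat", "lon"),
    name="reflectance",
)

# Analyses operate on labelled slices of the cube rather than on individual scenes.
subset = cube.sel(lat=slice(-35, -20)).mean(dim="time")
print(subset)
```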
PREFACE: High Performance Computing Symposium 2011
NASA Astrophysics Data System (ADS)
Talon, Suzanne; Mousseau, Normand; Peslherbe, Gilles; Bertrand, François; Gauthier, Pierre; Kadem, Lyes; Moitessier, Nicolas; Rouleau, Guy; Wittig, Rod
2012-02-01
HPCS (High Performance Computing Symposium) is a multidisciplinary conference that focuses on research involving High Performance Computing and its application. Attended by Canadian and international experts and renowned researchers in the sciences, all areas of engineering, the applied sciences, medicine and life sciences, mathematics, the humanities and social sciences, it is Canada's pre-eminent forum for HPC. The 25th edition was held in Montréal, at the Université du Québec à Montréal, from 15-17 June and focused on HPC in Medical Science. The conference was preceded by tutorials held at Concordia University, where 56 participants learned about HPC best practices, GPU computing, parallel computing, debugging and a number of high-level languages. 274 participants from six countries attended the main conference, which involved 11 invited and 37 contributed oral presentations, 33 posters, and an exhibit hall with 16 booths from our sponsors. The work that follows is a collection of papers presented at the conference covering HPC topics ranging from computer science to bioinformatics. They are divided here into four sections: HPC in Engineering, Physics and Materials Science, HPC in Medical Science, HPC Enabling to Explore our World and New Algorithms for HPC. We would once more like to thank the participants and invited speakers, the members of the Scientific Committee, the referees who spent time reviewing the papers and our invaluable sponsors. To hear the invited talks and learn about 25 years of HPC development in Canada visit the Symposium website: http://2011.hpcs.ca/lang/en/conference/keynote-speakers/ Enjoy the excellent papers that follow, and we look forward to seeing you in Vancouver for HPCS 2012! Gilles Peslherbe Chair of the Scientific Committee Normand Mousseau Co-Chair of HPCS 2011 Suzanne Talon Chair of the Organizing Committee UQAM Sponsors The PDF also contains photographs from the conference banquet.
A Wittgenstein Approach to the Learning of OO-Modeling
ERIC Educational Resources Information Center
Holmboe, Christian
2004-01-01
The paper uses Ludwig Wittgenstein's theories about the relationship between thought, language, and objects of the world to explore the assumption that OO-thinking resembles natural thinking. The paper imports from research in linguistic philosophy to computer science education research. I show how UML class diagrams (i.e., an artificial…
Air Force Laboratory’s 2005 Technology Milestones
2006-01-01
Computational materials science methods can benefit the design and property prediction of complex real-world materials. With these models, scientists and ... High-Frequency Acoustic System Payoff: Scientists created the High-Frequency Acoustic Suppression Technology (HiFAST) airflow control
This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...
CyberArts: Exploring Art and Technology.
ERIC Educational Resources Information Center
Jacobson, Linda, Ed.
This book takes the position that CyberArts(TM) is the new frontier in creativity, where the worlds of science and art meet. Computer technologies, visual design, music and sound, education and entertainment merge to form the new artistic territory of interactive multimedia. This diverse collection of essays, articles, and commentaries…
Mobile Technology Integrated Pedagogical Model
ERIC Educational Resources Information Center
Khan, Arshia
2014-01-01
Integrated curricula and experiential learning are the main ingredients to the recipe to improve student learning in higher education. In the academic computer science world it is mostly assumed that this experiential learning takes place at a business as an internship experience. The intent of this paper is to schism the traditional understanding…
Teaching and Learning in the Mixed-Reality Science Classroom
ERIC Educational Resources Information Center
Tolentino, Lisa; Birchfield, David; Megowan-Romanowicz, Colleen; Johnson-Glenberg, Mina C.; Kelliher, Aisling; Martinez, Christopher
2009-01-01
As emerging technologies become increasingly inexpensive and robust, there is an exciting opportunity to move beyond general purpose computing platforms to realize a new generation of K-12 technology-based learning environments. Mixed-reality technologies integrate real world components with interactive digital media to offer new potential to…
career at NREL in 1995 by conducting scanning tunneling microscope (STM) studies of the atomic structure ... revealed a new strain-induced step structure and contributed to the development of world-record-efficiency ... NREL's Computational Materials Science team, probing the atomic structure of dislocations in III-V
Semantics vs. World Knowledge in Prefrontal Cortex
ERIC Educational Resources Information Center
Pylkkanen, Liina; Oliveri, Bridget; Smart, Andrew J.
2009-01-01
Humans have knowledge about the properties of their native language at various levels of representation; sound, structure, and meaning computation constitute the core components of any linguistic theory. Although the brain sciences have engaged with representational theories of sound and syntactic structure, the study of the neural bases of…
Asking Research Questions: Theoretical Presuppositions
ERIC Educational Resources Information Center
Tenenberg, Josh
2014-01-01
Asking significant research questions is a crucial aspect of building a research foundation in computer science (CS) education. In this article, I argue that the questions that we ask are shaped by internalized theoretical presuppositions about how the social and behavioral worlds operate. And although such presuppositions are essential in making…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Ann E; Barker, Ashley D; Bland, Arthur S Buddy
Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources. Of these, we report the 300 in this review that are consistent with guidance provided. Scientific achievements by OLCF users cut across all range scales from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used to do billion-cell CFD calculations to develop shock wave compression turbo machinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbo machinery to learn what efficiencies the traditional steady flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation billions of gallons of fuel.
Grids for Dummies: Featuring Earth Science Data Mining Application
NASA Technical Reports Server (NTRS)
Hinke, Thomas H.
2002-01-01
This viewgraph presentation discusses the concept and advantages of linking computers together into data grids, an emerging technology for managing information across institutions, and potential users of data grids. The logistics of access to a grid, including the use of the World Wide Web to access grids, and security concerns are also discussed. The potential usefulness of data grids to the earth science community is also discussed, as well as the Global Grid Forum, and other efforts to establish standards for data grids.
NASA Technical Reports Server (NTRS)
1996-01-01
This paper presents the summaries of the MCTP Summer Research Internship Program. Technological areas discussed include: mathematical curriculum development for real-world problems; rain effects on air-water gas exchange; multi-ring impact basins on Mars; developing an interactive multimedia educational CD-ROM on remote sensing; a pilot of an activity for the GLOBE Program; fossils in Maryland; developing children's programming for the American Horticultural Society at River Farm; children's learning, educational programs of the National Park Service; a study of climate and student satisfaction in two summer programs for disadvantaged students interested in careers in mathematics and science; the Maryland Governor's Academy, integrating technology into the classroom; stream sampling with the Maryland Biological Stream Survey (MBSS); the imaging system inspection software technology, the preparation and detection of nominal and faulted steel ingots; event-based science, the development of real-world science units; correlation between anxiety and past experiences; environmental education through summer nature camp; enhancing learning opportunities at the Salisbury Zoo; plant growth experiment, a module for the middle school classroom; the effects of peroxisome proliferators in Japanese medaka embryos; development of a chapter on birth control and contraceptive methodologies as part of an interactive computer-based education module on HIV and AIDS; excretion of gentamicin in toadfish and goldfish; the Renaissance summer program; and "Are field trips important to the regional math science center?"
ChemPreview: an augmented reality-based molecular interface.
Zheng, Min; Waller, Mark P
2017-05-01
Human-computer interfaces make computational science more comprehensible and impactful. Complex 3D structures such as proteins or DNA are magnified by digital representations and displayed on two-dimensional monitors. Augmented reality has recently opened another door to access the virtual three-dimensional world. Herein, we present an augmented reality application called ChemPreview with the potential to manipulate bio-molecular structures at an atomistic level. ChemPreview is available at https://github.com/wallerlab/chem-preview/releases, and is built on top of the Meta 1 platform https://www.metavision.com/. ChemPreview can be used to interact with a protein in an intuitive way using natural hand gestures, thereby making it appealing to computational chemists or structural biologists. The ability to manipulate atoms in the real world could eventually provide new and more efficient ways of extracting structural knowledge, or designing new molecules in silico. Copyright © 2017 Elsevier Inc. All rights reserved.
Safdari, Reza; Shahmoradi, Leila; Hosseini-Beheshti, Molouk-Sadat; Nejad, Ahmadreza Farzaneh; Hosseiniravandi, Mohammad
2015-10-01
Encyclopedias have become prevalent as a valid cultural medium throughout the world. The rapid development of the computer industry and the expansion of various sciences have made the compilation of specialized electronic encyclopedias, especially web-based ones, indispensable. This is an applied-developmental study conducted in 2014. First, the main terms in the field of medical informatics were gathered using MeSH Online 2014 and the supplementary terms of each were determined, and then the tree diagram of the terms was drawn based on their relationships in MeSH. Based on the studies done by the researchers, the tree diagram of the encyclopedia was drawn with respect to the existing areas in this field, and the terms gathered were placed in related domains. In MeSH, 75 preferred terms together with 249 supplementary ones were indexed. One of the sub-branches of informatics is biomedical and health informatics, which itself consists of three subdivisions: bioinformatics, clinical informatics, and health informatics. Medical informatics, which is a subdivision of clinical informatics, has developed from the three fields of medical sciences, management and social sciences, and computational sciences and mathematics. Medical informatics arises from the confluence, fusion, and application of three major scientific branches: health and biological sciences, social and management sciences, and computing and mathematical sciences; measured against that structure, MeSH is weak as a basis for the future development of an encyclopedia of medical informatics.
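The tree-diagram construction described above can be illustrated with a small sketch that nests terms by their MeSH-style tree numbers; the example terms and tree numbers below are illustrative assumptions rather than the study's actual data.

```python
# Sketch: build a simple term tree from MeSH-style tree numbers by nesting on the
# dot-separated hierarchy. The terms and tree numbers here are illustrative only.

terms = {
    "L01": "Information Science",
    "L01.224": "Computing Methodologies",
    "L01.313": "Informatics",
    "L01.313.500": "Medical Informatics",
    "L01.313.500.750": "Medical Informatics Applications",
}

def build_tree(term_map: dict) -> dict:
    tree: dict = {}
    for number in sorted(term_map):
        node = tree
        parts = number.split(".")
        for depth in range(1, len(parts) + 1):
            key = ".".join(parts[:depth])
            node = node.setdefault(key, {})   # create intermediate levels as needed
    return tree

def print_tree(node: dict, term_map: dict, indent: int = 0) -> None:
    for key, child in node.items():
        print("  " * indent + f"{key}  {term_map.get(key, '')}")
        print_tree(child, term_map, indent + 1)

print_tree(build_tree(terms), terms)
```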
NASA Technical Reports Server (NTRS)
Jacob, Joseph; Katz, Daniel; Prince, Thomas; Berriman, Graham; Good, John; Laity, Anastasia
2006-01-01
The final version (3.0) of the Montage software has been released. To recapitulate from previous NASA Tech Briefs articles about Montage: This software generates custom, science-grade mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. This software can be executed on single-processor computers, multi-processor computers, and such networks of geographically dispersed computers as the National Science Foundation's TeraGrid or NASA's Information Power Grid. The primary advantage of running Montage in a grid environment is that computations can be done on a remote supercomputer for efficiency. Multiple computers at different sites can be used for different parts of a computation, a significant advantage in cases of computations for large mosaics that demand more processor time than is available at any one site. Version 3.0 incorporates several improvements over prior versions. The most significant improvement is that this version is accessible to scientists located anywhere, through operational Web services that provide access to data from several large astronomical surveys and construct mosaics on either local workstations or remote computational grids as needed.
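Montage itself is not shown here, but the FITS/WCS inputs it consumes can be inspected with standard astronomy tooling. The following sketch uses astropy, an assumption rather than part of Montage, and a placeholder file name.

```python
# Hedged sketch: inspect the image and WCS projection metadata of the kind of FITS
# input Montage mosaics. Uses astropy (not part of Montage); the file path is a
# placeholder and must point at a real FITS image with a primary image HDU.

from astropy.io import fits
from astropy.wcs import WCS

def describe_fits(path: str) -> None:
    with fits.open(path) as hdul:
        hdu = hdul[0]
        print(f"image shape: {hdu.data.shape}")
        # Projection type and reference coordinates come from standard WCS keywords.
        print(f"projection: {hdu.header.get('CTYPE1')}, {hdu.header.get('CTYPE2')}")
        wcs = WCS(hdu.header)
        print(f"reference value (deg): {wcs.wcs.crval}")

describe_fits("example_image.fits")  # placeholder path
```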
ISMB 2016 offers outstanding science, networking, and celebration
Fogg, Christiana
2016-01-01
The annual international conference on Intelligent Systems for Molecular Biology (ISMB) is the major meeting of the International Society for Computational Biology (ISCB). Over the past 23 years the ISMB conference has grown to become the world's largest bioinformatics/computational biology conference. ISMB 2016 will be the year's most important computational biology event globally. The conferences provide a multidisciplinary forum for disseminating the latest developments in bioinformatics/computational biology. ISMB brings together scientists from computer science, molecular biology, mathematics, statistics and related fields. Its principal focus is on the development and application of advanced computational methods for biological problems. ISMB 2016 offers the strongest scientific program and the broadest scope of any international bioinformatics/computational biology conference. Building on past successes, the conference is designed to cater to a variety of disciplines within the bioinformatics/computational biology community. ISMB 2016 takes place July 8-12 at the Swan and Dolphin Hotel in Orlando, Florida, United States. For two days preceding the conference, additional opportunities including Satellite Meetings, Student Council Symposium, and a selection of Special Interest Group Meetings and Applied Knowledge Exchange Sessions (AKES) are all offered to enable registered participants to learn more on the latest methods and tools within specialty research areas. PMID:27347392
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bland, Arthur S Buddy; Hack, James J; Baker, Ann E
Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools and resources for next-generation systems.
ERIC Educational Resources Information Center
Hmeljak, Dimitrij
2010-01-01
Virtual worlds provide useful platforms for social behavioral research, but impose stringent limitations on the rules of engagement, responsiveness, and data collection, along with other resource restrictions. The major challenge from a computer science standpoint in developing group behavior applications for such environments is accommodating the…
Technology assessment of advanced automation for space missions
NASA Technical Reports Server (NTRS)
1982-01-01
Six general classes of technology requirements derived during the mission definition phase of the study were identified as having maximum importance and urgency, including autonomous world model based information systems, learning and hypothesis formation, natural language and other man-machine communication, space manufacturing, teleoperators and robot systems, and computer science and technology.
NCSTRL: Design and Deployment of a Globally Distributed Digital Library.
ERIC Educational Resources Information Center
Davies, James R.; Lagoze, Carl
2000-01-01
Discusses the development of a digital library architecture that allows the creation of digital libraries within the World Wide Web. Describes a digital library, NCSTRL (Networked Computer Science Technical Research Library), within which the work has taken place and explains Dienst, a protocol and architecture for distributed digital libraries.…
Interdisciplinary Project Experiences: Collaboration between Majors and Non-Majors
ERIC Educational Resources Information Center
Smarkusky, Debra L.; Toman, Sharon A.
2014-01-01
Students in computer science and information technology should be engaged in solving real-world problems received from government and industry as well as those that expose them to various areas of application. In this paper, we discuss interdisciplinary project experiences between majors and non-majors that offered a creative and innovative…
Fermilab | Science | Questions for the Universe | The Particle World | Why
effects observed so far are insufficient to explain this predominance. The current program of experiments suggests significant effects in the bound state with the strange quark, Bs. Physicists at the Tevatron made ... Lattice computational facilities offer great promise for the calculation of the effects of the strong
ERIC Educational Resources Information Center
Vellom, Paul; Fetters, Marcia; Beeth, Michael
Since technology use in schools has been increasing, teachers want to maximize its use in their classrooms to increase student learning. Therefore, accreditation requirements for colleges include integrating computer and information technology into teacher education and professional development programs. This paper describes different models for…
Earth Science Learning in SMALLab: A Design Experiment for Mixed Reality
ERIC Educational Resources Information Center
Birchfield, David; Megowan-Romanowicz, Colleen
2009-01-01
Conversational technologies such as email, chat rooms, and blogs have made the transition from novel communication technologies to powerful tools for learning. Currently virtual worlds are undergoing the same transition. We argue that the next wave of innovation is at the level of the computer interface, and that mixed-reality environments offer…
NASA Astrophysics Data System (ADS)
Shimazu, Nobuko
In an increasingly globalized world, demand for engineers well versed in English remains strong. As a professor of English in the Faculty of Computer Science and Systems Engineering at the Kyushu Institute of Technology, I have sought, with the aid of two associate professors, to improve the English program for our engineering students to help meet that very demand. In order to assist other English teachers in similar situations to improve their own English programs, I would like to report on the ideas and methods presently used in our undergraduate English program, specifically the first-year compulsory and common course, with its emphasis on paragraph writing, which students from each of the five departments within the Faculty of Computer Science and Systems Engineering are required to take. In addition, I would like to report my ideas and teaching methods for a graduate research paper writing course. The objective of this course is to teach graduate students how to write presentations for conferences and papers for journals at the international level.
Distance education through the Internet: the GNA-VSNS biocomputing course.
de la Vega, F M; Giegerich, R; Fuellen, G
1996-01-01
A prototype course on biocomputing was delivered via international computer networks in early summer 1995. The course lasted 11 weeks, and was offered free of charge. It was organized by the BioComputing Division of the Virtual School of Natural Sciences, which is a member school of the Globewide Network Academy. It brought together 34 students and 7 instructors from all over the world, and covered the basics of sequence analysis. Five authors from Germany and USA prepared a hypertext book which was discussed in weekly study sessions that took place in a virtual classroom at the BioMOO electronic conferencing system. The course aimed at students with backgrounds in molecular biology, biomedicine or computer science, complementing and extending their skills with an interdisciplinary curriculum. Special emphasis was placed on the use of Internet resources, and the development of new teaching tools. The hypertext book includes direct links to sequence analysis and databank search services on the Internet. A tool for the interactive visualization of unit-cost pairwise sequence alignment was developed for the course. All course material will stay accessible at the World Wide Web address (Uniform Resource Locator) http://www.techfak.uni-bielefeld.de/bcd/welcome.html. This paper describes the aims and organization of the course, and gives a preliminary account of this novel experience in distance education.
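For readers unfamiliar with the alignment model mentioned above, the following is a minimal Python sketch of unit-cost pairwise alignment (Levenshtein-style dynamic programming). It is offered only as an illustration of the computation that the course's visualization tool dealt with; the function name and example sequences are invented, not taken from the course material.

def unit_cost_alignment(a: str, b: str) -> int:
    """Return the unit-cost (edit) distance between sequences a and b.

    Every mismatch, insertion, or deletion costs 1; matches cost 0.
    Illustrative only -- not the course's actual visualization tool.
    """
    m, n = len(a), len(b)
    # dp[i][j] = minimal cost of aligning a[:i] with b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = i                              # i deletions
    for j in range(1, n + 1):
        dp[0][j] = j                              # j insertions
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,      # delete a[i-1]
                           dp[i][j - 1] + 1,      # insert b[j-1]
                           dp[i - 1][j - 1] + cost)  # match/mismatch
    return dp[m][n]

if __name__ == "__main__":
    print(unit_cost_alignment("GATTACA", "GCATGCU"))  # prints 4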
A History of High-Performance Computing
NASA Technical Reports Server (NTRS)
2006-01-01
Faster than most speedy computers. More powerful than its NASA data-processing predecessors. Able to leap large, mission-related computational problems in a single bound. Clearly, it's neither a bird nor a plane, nor does it need to don a red cape, because it's super in its own way. It's Columbia, NASA's newest supercomputer and one of the world's most powerful production/processing units. Named Columbia to honor the STS-107 Space Shuttle Columbia crewmembers, the new supercomputer is making it possible for NASA to achieve breakthroughs in science and engineering, fulfilling the Agency's missions, and, ultimately, the Vision for Space Exploration. Shortly after being built in 2004, Columbia achieved a benchmark rating of 51.9 teraflop/s on 10,240 processors, making it the world's fastest operational computer at the time of completion. Putting this speed into perspective, 20 years ago, the most powerful computer at NASA's Ames Research Center, home of the NASA Advanced Supercomputing Division (NAS), ran at a speed of about 1 gigaflop (one billion calculations per second). The Columbia supercomputer is 50,000 times faster than this computer and offers a tenfold increase in capacity over the prior system housed at Ames. What's more, Columbia is considered the world's largest Linux-based, shared-memory system. The system is offering immeasurable benefits to society and is the zenith of years of NASA/private industry collaboration that has spawned new generations of commercial, high-speed computing systems.
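As a quick editorial check of the comparison quoted above (not a figure from the original article):

\[ \frac{51.9\ \text{Tflop/s}}{1\ \text{Gflop/s}} = \frac{51.9 \times 10^{12}}{10^{9}} \approx 5.2 \times 10^{4}, \]

which is consistent with the roughly "50,000 times faster" statement.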
Closing the race and gender gaps in computer science education
NASA Astrophysics Data System (ADS)
Robinson, John Henry
Life in a technological society brings new paradigms and pressures to bear on education. These pressures are magnified for underrepresented students and must be addressed if they are to play a vital part in society. Educational pipelines need to be established to provide at-risk students with the means and opportunity to succeed in science, technology, engineering, and mathematics (STEM) majors. STEM educational pipelines are programs consisting of components that seek to facilitate students' completion of a college degree by providing access to higher education, intervention, mentoring, support infrastructure, and programs that encourage academic success. Success in the STEM professions means that more educators, scientists, engineers, and researchers will be available to add diversity to the professions and to provide role models for future generations. The issues that the educational pipelines must address are improving at-risk groups' perceptions and awareness of the math, science, and engineering professions. Additionally, the educational pipelines must provide intervention in math preparation, overcome gender and race socialization, and provide mentors and counseling to help students achieve better self-perceptions and to provide positive role models. This study was designed to explore the underrepresentation of minorities and women in the computer science major at Rowan University through a multilayered action research methodology. The purpose of this research study was to define and understand the needs of underrepresented students in computer science; to examine current policies and enrollment data for Rowan University; to develop a historical profile of the Computer Science program from the standpoint of ethnicity and gender enrollment to ascertain trends in students' choice of computer science as a major; and to determine whether raising awareness about computer science for incoming freshmen and providing an alternate route into the computer science major will entice more women and minorities to pursue a degree in computer science at Rowan University. Finally, this study examined my espoused leadership theories and my leadership theories in use through reflective practices as I progressed through the cycles of this project. The outcomes of this study indicated a large downward trend in the enrollment of women in computer science and a relatively flat trend in minority enrollment. The enrollment data at Rowan University were found to follow a nationwide trend for underrepresented students' enrollment in STEM majors. The study also indicated that students' mental models are based upon their race and gender socialization and their understanding of the world and society. The mental models were shown to play a large role in the students' choice of major. Finally, a computer science pipeline was designed and piloted as part of this study in an attempt to entice more students into the major and facilitate their success. Additionally, the mental models of the participants were challenged through interactions to make them aware of what possibilities are available with a degree in computer science. The entire study was wrapped in my leadership, which was practiced and studied over the course of this work.
Power monitoring and control for large scale projects: SKA, a case study
NASA Astrophysics Data System (ADS)
Barbosa, Domingos; Barraca, João. Paulo; Maia, Dalmiro; Carvalho, Bruno; Vieira, Jorge; Swart, Paul; Le Roux, Gerhard; Natarajan, Swaminathan; van Ardenne, Arnold; Seca, Luis
2016-07-01
Large sensor-based science infrastructures for radio astronomy like the SKA will be among the most intensive data-driven projects in the world, facing extremely demanding computation, storage, management and, above all, power requirements. The geographically wide distribution of the SKA and its associated processing requirements, in the form of tailored High Performance Computing (HPC) facilities, require a greener approach to the Information and Communications Technologies (ICT) adopted for data processing to enable operational compliance with potentially strict power budgets. Reducing electricity costs, improving system power monitoring, and managing the generation of electricity at the system level are paramount to avoiding future inefficiencies and higher costs and to enabling fulfillment of the Key Science Cases. Here we outline major characteristics and innovation approaches to address power efficiency and long-term power sustainability for radio astronomy projects, focusing on green ICT for science and smart power monitoring and control.
Visual analytics as a translational cognitive science.
Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard
2011-07-01
Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.
The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath
Ellisman, M.; Hutton, T.; Kirkland, A.; Lin, A.; Lin, C.; Molina, T.; Peltier, S.; Singh, R.; Tang, K.; Trefethen, A.E.; Wallom, D.C.H.; Xiong, X.
2009-01-01
The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients. PMID:19487201
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dress, W.B.
Rosen's modeling relation is embedded in Popper's three worlds to provide a heuristic tool for model building and a guide for thinking about complex systems. The utility of this construct is demonstrated by suggesting a solution to the problem of pseudoscience and a resolution of the famous Bohr-Einstein debates. A theory of bizarre systems is presented by analogy with the entangled particles of quantum mechanics. This theory underscores the poverty of present-day computational systems (e.g., computers) for creating complex and bizarre entities by distinguishing between mechanism and organism.
Meta!Blast computer game: a pipeline from science to 3D art to education
NASA Astrophysics Data System (ADS)
Schneller, William; Campbell, P. J.; Bassham, Diane; Wurtele, Eve Syrkin
2012-03-01
Meta!Blast (http://www.metablast.org) is designed to address the challenges students often encounter in understanding cell and metabolic biology. Developed by faculty and students in biology, biochemistry, computer science, game design, pedagogy, art and story, Meta!Blast is being created using Maya (http://usa.autodesk.com/maya/) and the Unity 3D (http://unity3d.com/) game engine, for Macs and PCs in classrooms; it has also been exhibited in an immersive environment. Here, we describe the pipeline from protein structural data and holographic information to art to the three-dimensional (3D) environment to the game engine, by which we provide a publicly-available interactive 3D cellular world that mimics a photosynthetic plant cell.
The assessment of virtual reality for human anatomy instruction
NASA Technical Reports Server (NTRS)
Benn, Karen P.
1994-01-01
This research project seeks to meet the objective of science training by developing, assessing, and validating virtual reality as a human anatomy training medium. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment the traditional methods of instruction. At many institutions, lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three-dimensional, unlike the two-dimensional depictions found in textbooks and on conventional computer displays. Virtual reality is a breakthrough technology that allows one to step through the computer screen into a three-dimensional world. This technology offers many opportunities to enhance science education. Therefore, a virtual testing environment of the abdominopelvic region of a human cadaver was created to study the placement of body parts within the nine anatomical divisions of the abdominopelvic region and the four abdominal quadrants.
NASA Astrophysics Data System (ADS)
Lawton, B.; Hemenway, M. K.; Mendez, B.; Odenwald, S.
2013-04-01
Among NASA's major education goals is the training of students in the Science, Technology, Engineering, and Math (STEM) disciplines. The use of real data, from some of the most sophisticated observatories in the world, provides formal educators the opportunity to teach their students real-world applications of the STEM subjects. Combining real space science data with lessons aimed at meeting state and national education standards provides a memorable educational experience that students can build upon throughout their academic careers. Many of our colleagues have adopted the use of real data in their education and public outreach (EPO) programs. There are challenges in creating resources using real data for classroom use that include, but are not limited to, accessibility to computers/Internet and proper instruction. Understanding and sharing these difficulties and best practices with the larger EPO community is critical to the development of future resources. In this session, we highlight three examples of how NASA data is being utilized in the classroom: the Galaxies and Cosmos Explorer Tool (GCET) that utilizes real Hubble Space Telescope data; the computer image-analysis resources utilized by the NASA WISE infrared mission; and the space science derived math applications from SpaceMath@NASA featuring the Chandra and Kepler space telescopes. Challenges and successes are highlighted for these projects. We also facilitate small-group discussions that focus on additional benefits and challenges of using real data in the formal education environment. The report-outs from those discussions are given here.
NASA Technical Reports Server (NTRS)
Biswas, Rupak
2018-01-01
Quantum computing promises an unprecedented ability to solve intractable problems by harnessing quantum mechanical effects such as tunneling, superposition, and entanglement. The Quantum Artificial Intelligence Laboratory (QuAIL) at NASA Ames Research Center is the space agency's primary facility for conducting research and development in quantum information sciences. QuAIL conducts fundamental research in quantum physics but also explores how best to exploit and apply this disruptive technology to enable NASA missions in aeronautics, Earth and space sciences, and space exploration. At the same time, machine learning has become a major focus in computer science and captured the imagination of the public as a panacea to myriad big data problems. In this talk, we will discuss how classical machine learning can take advantage of quantum computing to significantly improve its effectiveness. Although we illustrate this concept on a quantum annealer, other quantum platforms could be used as well. If explored fully and implemented efficiently, quantum machine learning could greatly accelerate a wide range of tasks leading to new technologies and discoveries that will significantly change the way we solve real-world problems.
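As background for the quantum-annealing approach mentioned above (a generic formulation offered as an editorial illustration, not a description of NASA's specific implementation), an annealer minimizes an Ising-type objective into which the learning task must be encoded:

\[ E(\mathbf{s}) = \sum_{i} h_{i}\, s_{i} + \sum_{i<j} J_{ij}\, s_{i} s_{j}, \qquad s_{i} \in \{-1, +1\}, \]

so machine learning on such hardware amounts to casting training or inference as a search for low-energy spin configurations, with the biases \(h_i\) and couplings \(J_{ij}\) derived from the data.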
Earth System Grid II, Turning Climate Datasets into Community Resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Middleton, Don
2006-08-01
The Earth System Grid (ESG) II project, funded by the Department of Energy’s Scientific Discovery through Advanced Computing program, has transformed climate data into community resources. ESG II has accomplished this goal by creating a virtual collaborative environment that links climate centers and users around the world to models and data via a computing Grid, which is based on the Department of Energy’s supercomputing resources and the Internet. Our project’s success stems from partnerships between climate researchers and computer scientists to advance basic and applied research in the terrestrial, atmospheric, and oceanic sciences. By interfacing with other climate science projects, we have learned that commonly used methods to manage and remotely distribute data among related groups lack infrastructure and under-utilize existing technologies. Knowledge and expertise gained from ESG II have helped the climate community plan strategies to manage a rapidly growing data environment more effectively. Moreover, approaches and technologies developed under the ESG project have impacted data-simulation integration in other disciplines, such as astrophysics, molecular biology and materials science.
Learning Science in Grades 3-8 Using Probeware and Computers: Findings from the TEEMSS II Project
NASA Astrophysics Data System (ADS)
Zucker, Andrew A.; Tinker, Robert; Staudt, Carolyn; Mansfield, Amie; Metcalf, Shari
2008-02-01
The Technology Enhanced Elementary and Middle School Science II project (TEEMSS), funded by the National Science Foundation, produced 15 inquiry-based instructional science units for teaching in grades 3-8. Each unit uses computers and probeware to support students' investigations of real-world phenomena using probes (e.g., for temperature or pressure) or, in one case, virtual environments based on mathematical models. TEEMSS units were used in more than 100 classrooms by over 60 teachers and thousands of students. This paper reports on cases in which groups of teachers taught science topics without TEEMSS materials in school year 2004-2005 and then the same teachers taught those topics using TEEMSS materials in 2005-2006. There are eight TEEMSS units for which such comparison data are available. Students showed significant learning gains for all eight. In four cases (sound and electricity, both for grades 3-4; temperature, grades 5-6; and motion, grades 7-8) there were significant differences in science learning favoring the students who used the TEEMSS materials. The effect sizes are 0.58, 0.94, 1.54, and 0.49, respectively. For the other four units there were no significant differences in science learning between TEEMSS and non-TEEMSS students. We discuss the implications of these results for science education.
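The effect sizes reported above are presumably standardized mean differences; assuming the familiar Cohen's d convention (an editorial assumption, since the abstract does not name the exact statistic), they are computed as

\[ d = \frac{\bar{x}_{\text{TEEMSS}} - \bar{x}_{\text{comparison}}}{s_{\text{pooled}}}, \]

so the values of 0.49 to 1.54 correspond to gains of roughly half to one and a half pooled standard deviations.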
Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hebner, Gregory A.
Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks, and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world-changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE / Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE’s science and energy mission and to identify the potential impact of these technologies.
The quantum computer game: citizen science
NASA Astrophysics Data System (ADS)
Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob
2013-05-01
Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context, with a global high score that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models such as Gross-Pitaevskii and Bose-Hubbard are currently under development. The game has also been incorporated into science education at high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research, CODER.
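For reference, the time-dependent Schrödinger equation that the game's interface wraps is, in generic form (the specific optical-lattice and tweezer Hamiltonian is not spelled out in the abstract),

\[ i\hbar\, \frac{\partial}{\partial t}\, \psi(\mathbf{r}, t) = \hat{H}(t)\, \psi(\mathbf{r}, t), \]

where the players' inputs enter through the time dependence of \(\hat{H}(t)\), for example the position and depth of the optical tweezer.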
Extreme Science (LBNL Science at the Theater)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ajo-Franklin, Caroline; Klein, Spencer; Minor, Andrew
On Feb. 27, 2012 at the Berkeley Repertory Theatre, four Berkeley Lab scientists presented talks related to extreme science - and what it means to you. Topics include: Neutrino hunting in Antarctica. Learn why Spencer Klein goes to the ends of the Earth to search for these ghostly particles. From Chernobyl to Central Asia, Tamas Torok travels the globe to study microbial diversity in extreme environments. Andrew Minor uses the world's most advanced electron microscopes to explore materials at ultrahigh stresses and in harsh environments. And microbes that talk to computers? Caroline Ajo-Franklin is pioneering cellular-electrical connections that could help transform sunlight into fuel.
Using Computers in Introductory Astronomy Courses
NASA Astrophysics Data System (ADS)
Deming, Grace L.
1995-12-01
Computer literacy is fast becoming a focal point in undergraduate education. Scientific literacy has been a continuing goal of undergraduate programs across the nation and a course in introductory astronomy is often used to satisfy such science requirements. At U. MD an introduction to computer skills is being integrated into our astronomy curriculum for non-science majors. The campus is adequately equipped with computer labs, yet many students enter college without basic computer skills. In Astronomy 101 (General Astronomy) students are introduced to electronic mail, a Listserver, and the world wide web. Students in this course are required to register for a free campus computer account. Their first assignment is to use e-mail to subscribe to the class Listserver, Milkyway. Through Milkyway, students have access to weekly lecture summaries, questions to review for exams, and copies of previous exams. Using e-mail students may pose questions, provide comments, or exchange opinions using Milkyway, or they may e-mail the instructor directly. Studies indicate that using e-mail is less intimidating to a student than asking a question in a class of 200 students. Monitoring e-mail for student questions has not been a problem. Student reaction has been favorable to using e-mail, since instructor office hours are not always convenient, especially to commuting or working students. Through required assignments, students receive an introduction to accessing information on the world wide web using Netscape. Astronomy has great resources available on the Internet which can be used to supplement and reinforce introductory material. Assignments are structured so that students will gain the techniques necessary to access available information. It is hoped that students will successfully apply the computer skills they learn in astronomy class to their own fields and as life-long learners. We have found that students comfortable with computers are willing to share their knowledge with others. The computer activities have been structured to promote cooperation between students. These skills are also necessary for success.
ERIC Educational Resources Information Center
Arnold, Savittree Rochanasmita; Padilla, Michael J.; Tunhikorn, Bupphachart
2009-01-01
In the rapidly developing digital world, technology is and will be a force in workplaces, communities, and everyday lives in the 21st century. Information and Communication Technology (ICT) including computer hardware/software, networking and other technologies such as audio, video, and other multimedia tools became learning tools for students in…
ERIC Educational Resources Information Center
Pododimenko, Inna
2014-01-01
This paper considers the most urgent problem of training competitive specialists in higher educational institutions under the socio-economic transformation of Ukraine and its entry into the world community. On the basis of an analysis of professional requirements, a number of contradictions and disparities among the specialists in…
ERIC Educational Resources Information Center
Pan, Edward A.
2013-01-01
Science, technology, engineering, and mathematics (STEM) education is a national focus. Engineering education, as part of STEM education, needs to adapt to meet the needs of the nation in a rapidly changing world. Using computer-based visualization tools and corresponding 3D printed physical objects may help nontraditional students succeed in…
Language Maintenance on the Internet
ERIC Educational Resources Information Center
Ward, Judit Hajnal; Agocs, Laszlo
2004-01-01
Due to the expanding use of computer networks in Hungary, the Hungarian language has become a grown-up member of the World Wide Web and the Internet. In the past few years, the number of web pages written in Hungarian has significantly increased, since all areas of business, science, education, culture, etc., are eager to make use of the evolving…
Influencing the Self-Efficacy of Middle Eastern Women through the Use of a Bulletin Board
ERIC Educational Resources Information Center
Alkhalifa, Eshaa
2008-01-01
Gender studies across the world have produced a wealth of information generated by studies that seek to investigate whether a distinction exists between genders in mathematics-based courses, such as Computer Science courses. However, the Middle Eastern region has remained largely unexplored throughout this effort due to gender segregation during…
Large Scale GW Calculations on the Cori System
NASA Astrophysics Data System (ADS)
Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven
The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.
Educating Physicists for the 21st Century Industrial Arena
NASA Astrophysics Data System (ADS)
Levine, Alaina G.
2001-03-01
At the University of Arizona, a new Professional Master's Degree in Applied and Industrial Physics has been initiated to meet the demands of a new industrial era. A 1995 report by the National Academy of Sciences, et al, concluded, "A world of work that has become more interdisciplinary, collaborative, and global requires that we produce young people who are adaptable and flexible, as well as technically proficient." To better prepare students for this new "world of work", a new degree was launched in 2000 sponsored by the Sloan Foundation as part of a national initiative. The Professional Master's Degree in Applied and Industrial Physics educates students to 1) work in interdisciplinary teams on complex problems involving rapidly changing science and technology, 2) gain proficiency in computational techniques, 3) effectively communicate their scientific mission at all levels, and 4) understand business and legal issues associated with their scientific projects. I will discuss these goals, the roles of our industrial partners, and Arizona's parallel programs in Applied Biosciences and Mathematical Sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alchorn, A L
Thank you for your interest in the activities of the Lawrence Livermore National Laboratory Computation Directorate. This collection of articles from the Laboratory's Science & Technology Review highlights the most significant computational projects, achievements, and contributions during 2002. In 2002, LLNL marked the 50th anniversary of its founding. Scientific advancement in support of our national security mission has always been the core of the Laboratory. So that researchers could better understand and predict complex physical phenomena, the Laboratory has pushed the limits of the largest, fastest, most powerful computers in the world. In the late 1950s, Edward Teller--one of the LLNL founders--proposed that the Laboratory commission a Livermore Advanced Research Computer (LARC) built to Livermore's specifications. He tells the story of being in Washington, DC, when John von Neumann asked to talk about the LARC. He thought Teller wanted too much memory in the machine. (The specifications called for 20-30,000 words.) Teller was too smart to argue with him. Later Teller invited von Neumann to the Laboratory and showed him one of the design codes being prepared for the LARC. He asked von Neumann for suggestions on fitting the code into 10,000 words of memory, and flattered him about "Labbies" not being smart enough to figure it out. Von Neumann dropped his objections, and the LARC arrived with 30,000 words of memory. Memory, and how close memory is to the processor, is still of interest to us today. Livermore's first supercomputer was the Remington-Rand Univac-1. It had 5,600 vacuum tubes and was 2 meters wide by 4 meters long. This machine was commonly referred to as a 1 KFlop machine [E+3]. Skip ahead 50 years. The ASCI White machine at the Laboratory today, produced by IBM, is rated at a peak performance of 12.3 TFlops or E+13. We've improved computer processing power by 10 orders of magnitude in 50 years, and I do not believe there's any reason to think we won't improve another 10 orders of magnitude in the next 50 years. For years I have heard talk of hitting the physical limits of Moore's Law, but new technologies such as 3-D chips, molecular computing, and quantum computing will take us into the next phase of computer processing power. Big computers are icons or symbols of the culture and larger infrastructure that exists at LLNL to guide scientific discovery and engineering development. We have dealt with balance issues for 50 years and will continue to do so in our quest for a digital proxy of the properties of matter at extremely high temperatures and pressures. I believe that the next big computational win will be the merger of high-performance computing with information management. We already create terabytes--soon to be petabytes--of data. Efficiently storing, finding, visualizing, and extracting data, and turning that data into knowledge that aids decision-making and scientific discovery, is an exciting challenge. In the meantime, please enjoy this retrospective on computational physics, computer science, advanced software technologies, and applied mathematics performed by programs and researchers at LLNL during 2002. It offers a glimpse into the stimulating world of computational science in support of the national missions and homeland defense.
Safdari, Reza; Shahmoradi, Leila; Hosseini-beheshti, Molouk-sadat; Nejad, Ahmadreza Farzaneh; Hosseiniravandi, Mohammad
2015-01-01
Introduction: Encyclopedias have become a prevalent and authoritative cultural medium throughout the world. The rapid development of the computer industry and the expansion of the various sciences have made the compilation of specialized electronic encyclopedias, especially web-based ones, indispensable. Materials and Methods: This is an applied-developmental study conducted in 2014. First, the main terms in the field of medical informatics were gathered using MeSH Online 2014 and the supplementary terms of each were determined; then the tree diagram of the terms was drawn based on their relationships in MeSH. Based on the studies done by the researchers, the tree diagram of the encyclopedia was drawn with respect to the existing areas in this field, and the terms gathered were placed in the related domains. Findings: In MeSH, 75 preferred terms together with 249 supplementary ones were indexed. One of the sub-branches of informatics is biomedical and health informatics, which itself consists of three subdivisions: bioinformatics, clinical informatics, and health informatics. Medical informatics, a subdivision of clinical informatics, has developed from three fields: the medical sciences, the management and social sciences, and the computational and mathematical sciences. Results and Discussion: Medical informatics arises from the confluence, fusion, and application of three major scientific branches - the health and biological sciences, the social and management sciences, and the computing and mathematical sciences - and, accordingly, the current MeSH structure is a weak foundation for the future development of an Encyclopedia of Medical Informatics. PMID:26635440
NASA Astrophysics Data System (ADS)
Masson, Steve; Vázquez-Abad, Jesús
2006-10-01
This paper proposes a new way to integrate history of science in science education to promote conceptual change by introducing the notion of historical microworld, which is a computer-based interactive learning environment respecting historic conceptions. In this definition, "interactive" means that the user can act upon the virtual environment by changing some parameters to see what ensues. "Environment respecting historic conceptions" means that the "world" has been programmed to respect the conceptions of past scientists or philosophers. Three historical microworlds in the field of mechanics are presented in this article: an Aristotelian microworld respecting Aristotle's conceptions about movement, a Buridanian microworld respecting the theory of impetus and, finally, a Newtonian microworld respecting Galileo's conceptions and Newton's laws of movement.
The quickening of science communication.
Lucky, R
2000-07-14
In this month's essay, Robert Lucky examines the central sociological impacts that communications technologies have had on the way science is done as well as the critical influences science has had in the evolution of communications technology. He traces the evolution of today's infrastructure for research and collaboration in science via the Internet and the World Wide Web back to the invention of the telegraph, which first freed the flow of information from its reliance on the physical means of transportation and allowed communication to occur in real time. According to Lucky, the remaining technical hurdles in providing unlimited bandwidth are relatively simple to overcome compared with the sociotechnical engineering required to improve the three dimensions of communications--human to information, human to human, and human to computer.
Science at the Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
White, Nicholas E.
2012-01-01
The Sciences and Exploration Directorate of the NASA Goddard Space Flight Center (GSFC) is the largest Earth and space science research organization in the world. Its scientists advance understanding of the Earth and its life-sustaining environment, the Sun, the solar system, and the wider universe beyond. Researchers in the Sciences and Exploration Directorate work with engineers, computer programmers, technologists, and other team members to develop the cutting-edge technology needed for space-based research. Instruments are also deployed on aircraft, balloons, and Earth's surface. I will give an overview of the current research activities and programs at GSFC, including the James Webb Space Telescope (JWST), future Earth Observing programs, and experiments that are exploring our solar system and studying the interaction of the Sun with the Earth's magnetosphere.
Characterizing Mobility and Contact Networks in Virtual Worlds
NASA Astrophysics Data System (ADS)
Machado, Felipe; Santos, Matheus; Almeida, Virgílio; Guedes, Dorgival
Virtual worlds have recently gained wide recognition as an important field of study in Computer Science. In this work we present an analysis of the mobility and interactions among characters in World of Warcraft (WoW) and Second Life based on the contact opportunities extracted from actual user data in each of those domains. We analyze character contacts in terms of their spatial and temporal characteristics, as well as the social network derived from such contacts. Our results show that the contacts observed may be more influenced by the nature of the interactions and goals of the users in each situation than by the intrinsic structure of such worlds. In particular, observations from a city in WoW are closer to those of Second Life than to other areas in WoW itself.
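To make the kind of analysis described above concrete, here is a small, self-contained Python sketch that builds a contact graph from co-location events and computes simple degree statistics. It is an editorial illustration only; the event format, zone names, and character names are invented and are not the authors' actual data or pipeline.

from collections import defaultdict
from itertools import combinations

# Hypothetical co-location observations: (timestamp, zone, characters present).
events = [
    (0, "zone_a", {"alice", "bob", "carol"}),
    (1, "zone_a", {"alice", "bob"}),
    (2, "zone_b", {"carol", "dave"}),
]

# Undirected contact graph: an edge means two characters were observed
# in the same zone at the same time at least once.
contacts = defaultdict(set)
for _time, _zone, chars in events:
    for u, v in combinations(sorted(chars), 2):
        contacts[u].add(v)
        contacts[v].add(u)

# Simple summaries of the social network derived from the contacts.
degrees = {c: len(neighbors) for c, neighbors in contacts.items()}
print("degree per character:", degrees)
print("mean degree:", sum(degrees.values()) / len(degrees))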
[The Durkheim Test. Remarks on Susan Leigh Star's Boundary Objects].
Gießmann, Sebastian
2015-09-01
The article reconstructs Susan Leigh Star's conceptual work on the notion of 'boundary objects'. It traces the emergence of the concept, beginning with her PhD thesis and its publication as Regions of the Mind in 1989. 'Boundary objects' attempt to represent the distributed, multifold nature of scientific work and its mediations between different 'social worlds'. Being addressed to several 'communities of practice', the term responded to questions from Distributed Artificial Intelligence in Computer Science, Workplace Studies and Computer Supported Cooperative Work (CSCW), and microhistorical approaches inside the growing Science and Technology Studies. Yet Star's invention, despite its interdisciplinary character and interpretive flexibility, has rarely been noticed as a conceptual tool for media theory. I therefore propose to reconsider Star's 'Durkheim test' for sociotechnical media practices.
NASA Astrophysics Data System (ADS)
Gintautas, Vadas; Hubler, Alfred
2006-03-01
As worldwide computer resources increase in power and decrease in cost, real-time simulations of physical systems are becoming increasingly prevalent, from laboratory models to stock market projections and entire ``virtual worlds'' in computer games. Often, these systems are meticulously designed to match real-world systems as closely as possible. We study the limiting behavior of a virtual horizontally driven pendulum coupled to its real-world counterpart, where the interaction occurs on a time scale that is much shorter than the time scale of the dynamical system. We find that if the physical parameters of the virtual system match those of the real system within a certain tolerance, there is a qualitative change in the behavior of the two-pendulum system as the strength of the coupling is increased. Applications include a new method to measure the physical parameters of a real system and the use of resonance spectroscopy to refine a computer model. As virtual systems better approximate real ones, even very weak interactions may produce unexpected and dramatic behavior. The research is supported by the National Science Foundation Grant No. NSF PHY 01-40179, NSF DMS 03-25939 ITR, and NSF DGE 03-38215.
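For concreteness, a horizontally driven pendulum of the kind studied here is commonly modeled (a generic textbook form, not necessarily the authors' exact equations) by

\[ \ddot{\theta} + \gamma\,\dot{\theta} + \frac{g}{\ell}\,\sin\theta = \frac{A\,\omega^{2}}{\ell}\,\cos(\omega t)\,\cos\theta, \]

where \(A\) and \(\omega\) are the amplitude and angular frequency of the horizontal drive, \(\gamma\) is a damping coefficient, and \(\ell\) is the pendulum length; matching parameters such as \(g/\ell\), \(\gamma\), \(A\), and \(\omega\) between the real and virtual pendulums is what the tolerance condition above refers to.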
Crutchfield, James P; Ditto, William L; Sinha, Sudeshna
2010-09-01
How dynamical systems store and process information is a fundamental question that touches a remarkably wide set of contemporary issues: from the breakdown of Moore's scaling laws--that predicted the inexorable improvement in digital circuitry--to basic philosophical problems of pattern in the natural world. It is a question that also returns one to the earliest days of the foundations of dynamical systems theory, probability theory, mathematical logic, communication theory, and theoretical computer science. We introduce the broad and rather eclectic set of articles in this Focus Issue that highlights a range of current challenges in computing and dynamical systems.
NASA Astrophysics Data System (ADS)
Showstack, Randy
In 17 months, the ball drops in New York's Times Square to usher in a new millennium and new year ending in the digits 00. However, internal clocks in computers around the world may recognize the date as 1900 rather than 2000 if governments and businesses drop the ball in dealing with a simple computer design flaw that has ballooned into a complex management issue of correcting billions of lines of computer code worldwide. In a speech at the National Academy of Sciences in Washington, D.C., in July, U.S. President Bill Clinton proposed new legislation to make it easier for the private sector to collaborate in solving this problem.
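The "design flaw" at issue is the long-standing practice of storing years as two digits. A minimal, hypothetical Python illustration (not drawn from any particular system) of the bug and of the pivot-window remediation that was widely applied:

def legacy_year(yy: int) -> int:
    """Interpret a two-digit year the way many legacy systems did."""
    return 1900 + yy            # 99 -> 1999, but 00 -> 1900 instead of 2000

def windowed_year(yy: int, pivot: int = 50) -> int:
    """A common remediation: pivot-year windowing (here 00-49 map to the 2000s)."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(legacy_year(0))    # 1900 -- the Y2K misinterpretation
print(windowed_year(0))  # 2000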
How Data Becomes Physics: Inside the RACF
Ernst, Michael; Rind, Ofer; Rajagopalan, Srini; Lauret, Jerome; Pinkenburg, Chris
2018-06-22
The RHIC & ATLAS Computing Facility (RACF) at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory sits at the center of a global computing network. It connects more than 2,500 researchers around the world with the data generated by millions of particle collisions taking place each second at Brookhaven Lab's Relativistic Heavy Ion Collider (RHIC, a DOE Office of Science User Facility for nuclear physics research), and the ATLAS experiment at the Large Hadron Collider in Europe. Watch this video to learn how the people and computing resources of the RACF serve these scientists to turn petabytes of raw data into physics discoveries.
Trends in micro- and nanoComputed Tomography 2008-2010
NASA Astrophysics Data System (ADS)
Stock, S. R.
2010-09-01
Trends in micro- and nanoComputed Tomography (CT) from January 2008 through July 2010 are the subject of this brief report, which takes up where a previous report in Developments in X-ray Tomography VI (2008) concluded. First, the number of systems operating worldwide is estimated. The main focus is on what searches of three citation indices (Web of Science, Compendex and PubMed) reveal about the field of micro- and nanoCT. Given the research-field-dependent and disparate terminology used by investigators, searches were on "microtomography", "microCT" and "synchrotron tomography".
Careers in Applied Mathematics and Computational Sciences.
ERIC Educational Resources Information Center
Society for Industrial and Applied Mathematics, Philadelphia, PA.
This booklet provides some answers to questions on how mathematics is used in the world of work, what kinds of problems it solves, and why it is the key to so many careers, particularly to the jobs of the 21st century. Part of that preparation is mathematical knowledge, tools such as derivatives, probability, and matrices as well as central themes…
Unplugged Cybersecurity: An Approach for Bringing Computer Science into the Classroom
ERIC Educational Resources Information Center
Fees, Rachel E.; da Rosa, Jennifer A.; Durkin, Sarah S.; Murray, Mark M.; Moran, Angela L.
2018-01-01
The United States Naval Academy (USNA) STEM Center for Education and Outreach addresses an urgent Navy and national need for more young people to pursue careers in STEM fields through world-wide outreach to 17,000 students and 900 teachers per year. To achieve this mission, the STEM Center has developed a hands-on and inquiry-based methodology to…
ERIC Educational Resources Information Center
Zhang, Xiaohua
2013-01-01
With the development of science and technology, especially the popularity of the Internet, mobile phones, and computers, college students' lifestyles are changing, which has led to the "Zhai" phenomenon: more and more students stay in the dorm or at home and reduce their communication with the outside world. This phenomenon not only reduces the…
The Phenomenal World of Physics. The Science Club. Ages 10-14. [CD-ROM].
ERIC Educational Resources Information Center
1999
This CD-ROM allows students to learn about physics principles and the scientists who discovered them through genius or luck. The simplicity of these physical laws and how the discovery of these laws has improved the daily lives of humans is discussed. The computer program explores the physics behind the earth's rotation, Archimedes' Principles,…
Web-Based Instruction in Physics Courses
NASA Astrophysics Data System (ADS)
Wijekumar, V.
1998-05-01
The World Wide Web will be utilized to deliver instructional materials in physics courses in two cases. In one case, a set of physics courses for high school science and mathematics teachers in the physics certification program will be taught entirely over the WWW. In the other case, the WWW will be used to enhance the linkage between the laboratory courses in medical physics, human physiology, and clinical nursing courses for nursing students. This project links three departments in two colleges to enhance a project known as Integrated Computer System across the Health Science Curriculum. Partial support for this work was provided by the National Science Foundation's Division of Undergraduate Education through grant DUE # 9650793.
International Space Station (ISS)
2001-02-01
The Payload Operations Center (POC) is the science command post for the International Space Station (ISS). Located at NASA's Marshall Space Flight Center in Huntsville, Alabama, it is the focal point for American and international science activities aboard the ISS. The POC's unique capabilities allow science experts and researchers around the world to perform cutting-edge science in the unique microgravity environment of space. The POC is staffed around the clock by shifts of payload flight controllers. At any given time, 8 to 10 flight controllers are on consoles operating, planning for, and controlling various systems and payloads. This photograph shows a Payload Rack Officer (PRO) at a work station. The PRO is linked by a computer to all payload racks aboard the ISS. The PRO monitors and configures the resources and environment for science experiments including EXPRESS Racks, multiple-payload racks designed for commercial payloads.
NASA Astrophysics Data System (ADS)
The CHAIN-REDS Project is organising a workshop on "e-Infrastructures for e-Sciences" focusing on Cloud Computing and Data Repositories under the aegis of the European Commission and in co-location with the International Conference on e-Science 2013 (IEEE2013) that will be held in Beijing, P.R. of China on October 17-22, 2013. The core objective of the CHAIN-REDS project is to promote, coordinate and support the effort of a critical mass of non-European e-Infrastructures for Research and Education to collaborate with Europe addressing interoperability and interoperation of Grids and other Distributed Computing Infrastructures (DCI). From this perspective, CHAIN-REDS will optimise the interoperation of European infrastructures with those present in 6 other regions of the world, both from a development and use point of view, and catering to different communities. Overall, CHAIN-REDS will provide input for future strategies and decision-making regarding collaboration with other regions on e-Infrastructure deployment and availability of related data; it will raise the visibility of e-Infrastructures towards intercontinental audiences, covering most of the world and will provide support to establish globally connected and interoperable infrastructures, in particular between the EU and the developing regions. Organised by IHEP, INFN and Sigma Orionis with the support of all project partners, this workshop will aim at: - Presenting the state of the art of Cloud computing in Europe and in China and discussing the opportunities offered by having interoperable and federated e-Infrastructures; - Exploring the existing initiatives of Data Infrastructures in Europe and China, and highlighting the Data Repositories of interest for the Virtual Research Communities in several domains such as Health, Agriculture, Climate, etc.
Trelease, Robert B
2016-11-01
Until the late twentieth century, primary anatomical sciences education was relatively unenhanced by advanced technology and dependent on the mainstays of printed textbooks, chalkboard- and photographic projection-based classroom lectures, and cadaver dissection laboratories. But over the past three decades, diffusion of innovations in computer technology transformed the practices of anatomical education and research, along with other aspects of work and daily life. Increasing adoption of first-generation personal computers (PCs) in the 1980s paved the way for the first practical educational applications, and visionary anatomists foresaw the usefulness of computers for teaching. While early computers lacked high-resolution graphics capabilities and interactive user interfaces, applications with video discs demonstrated the practicality of programming digital multimedia linking descriptive text with anatomical imaging. Desktop publishing established that computers could be used for producing enhanced lecture notes, and commercial presentation software made it possible to give lectures using anatomical and medical imaging, as well as animations. Concurrently, computer processing supported the deployment of medical imaging modalities, including computed tomography, magnetic resonance imaging, and ultrasound, that were subsequently integrated into anatomy instruction. Following its public birth in the mid-1990s, the World Wide Web became the ubiquitous multimedia networking technology underlying the conduct of contemporary education and research. Digital video, structural simulations, and mobile devices have been more recently applied to education. Progressive implementation of computer-based learning methods interacted with waves of ongoing curricular change, and such technologies have been deemed crucial for continuing medical education reforms, providing new challenges and opportunities for anatomical sciences educators. Anat Sci Educ 9: 583-602. © 2016 American Association of Anatomists.
Virtual Labs and Virtual Worlds
NASA Astrophysics Data System (ADS)
Boehler, Ted
2006-12-01
Virtual Labs and Virtual Worlds Coastline Community College has under development several virtual lab simulations and activities that range from biology, to language labs, to virtual discussion environments. Imagine a virtual world that students enter online, by logging onto their computer from home or anywhere they have web access. Upon entering this world they select a personalized identity represented by a digitized character (avatar) that can freely move about, interact with the environment, and communicate with other characters. In these virtual worlds, buildings, gathering places, conference rooms, labs, science rooms, and a variety of other “real world” elements are evident. When characters move about and encounter other people (players) they may freely communicate. They can examine things, manipulate objects, read signs, watch video clips, hear sounds, and jump to other locations. Goals of critical thinking, social interaction, peer collaboration, group support, and enhanced learning can be achieved in surprising new ways with this innovative approach to peer-to-peer communication in a virtual discussion world. In this presentation, short demos will be given of several online learning environments including a virtual biology lab, a marine science module, a Spanish lab, and a virtual discussion world. Coastline College has been a leader in the development of distance learning and media-based education for nearly 30 years and currently offers courses through PDA, Internet, DVD, CD-ROM, TV, and Videoconferencing technologies. Its distance learning program serves over 20,000 students every year. sponsor Jerry Meisner
Extending Landauer's bound from bit erasure to arbitrary computation
NASA Astrophysics Data System (ADS)
Wolpert, David
The minimal thermodynamic work required to erase a bit, known as Landauer's bound, has been extensively investigated both theoretically and experimentally. However, when viewed as a computation that maps inputs to outputs, bit erasure has a very special property: the output does not depend on the input. Existing analyses of the thermodynamics of bit erasure implicitly exploit this property, and thus cannot be directly extended to analyze the computation of arbitrary input-output maps. Here we show how to extend these earlier analyses of bit erasure to analyze the thermodynamics of arbitrary computations. Doing this establishes a formal connection between the thermodynamics of computers and much of theoretical computer science. We use this extension to analyze the thermodynamics of the canonical "general purpose computer" considered in computer science theory: a universal Turing machine (UTM). We consider a UTM which maps input programs to output strings, where inputs are drawn from an ensemble of random binary sequences, and prove: i) The minimal work needed by a UTM to run some particular input program X and produce output Y is the Kolmogorov complexity of Y minus the log of the "algorithmic probability" of Y. This minimal amount of thermodynamic work has a finite upper bound, which is independent of the output Y, depending only on the details of the UTM. ii) The expected work needed by a UTM to compute some given output Y is infinite. As a corollary, the overall expected work to run a UTM is infinite. iii) The expected work needed by an arbitrary Turing machine T (not necessarily universal) to compute some given output Y can either be infinite or finite, depending on Y and the details of T. To derive these results we must combine ideas from nonequilibrium statistical physics with fundamental results from computer science, such as Levin's coding theorem and other theorems about universal computation. I would like to acknowledge the Santa Fe Institute, Grant No. TWCF0079/AB47 from the Templeton World Charity Foundation, Grant No. FQXi-RHl3-1349 from the FQXi foundation, and Grant No. CHE-1648973 from the U.S. National Science Foundation.
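Result (i) above can be written compactly as follows (the \(k_{B} T \ln 2\) prefactor is an editorial assumption about units; the abstract states the relation only up to such constants):

\[ W_{\min}(X \to Y) = k_{B} T \ln 2 \,\bigl[\, K_{U}(Y) - \log_{2} P_{U}(Y) \,\bigr], \]

where \(K_{U}(Y)\) is the Kolmogorov complexity of the output \(Y\) relative to the universal Turing machine \(U\), and \(P_{U}(Y)\) is the algorithmic probability that \(U\) produces \(Y\) from a random input program.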
Science at the Theatre - Extreme Science - Promo Video
Klein, Spencer
2017-12-12
On Feb. 27 at 7 pm at the Berkeley Repertory Theatre, join four Berkeley Lab scientists as they discuss extreme science -- and what it means to you. Topics include: Neutrino hunting in Antarctica. Learn why Spencer Klein goes to the ends of the Earth to search for these ghostly particles. From Chernobyl to Central Asia, Tamas Torok travels the globe to study microbial diversity in extreme environments. Andrew Minor uses the world's most advanced electron microscopes to explore materials at ultrahigh stresses and in harsh environments. And microbes that talk to computers? Caroline Ajo-Franklin is pioneering cellular-electrical connections that could help transform sunlight into fuel. Go here for more information and to view videos of previous Science at the Theater events: http://www.lbl.gov/LBL-PID/fobl/
Final Report. Institute for Ultrascale Visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Kwan-Liu; Galli, Giulia; Gygi, Francois
The SciDAC Institute for Ultrascale Visualization brought together leading experts from visualization, high-performance computing, and science application areas to make advanced visualization solutions for SciDAC scientists and the broader community. Over the five-year project, the Institute introduced many new enabling visualization techniques, which have significantly enhanced scientists’ ability to validate their simulations, interpret their data, and communicate with others about their work and findings. This Institute project involved a large number of junior and student researchers, who received the opportunities to work on some of the most challenging science applications and gain access to the most powerful high-performance computing facilities in the world. They were readily trained and prepared for facing the greater challenges presented by extreme-scale computing. The Institute’s outreach efforts, through publications, workshops and tutorials, successfully disseminated the new knowledge and technologies to the SciDAC and the broader scientific communities. The scientific findings and experience of the Institute team helped plan the SciDAC3 program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Windus, Theresa; Banda, Michael; Devereaux, Thomas
Computers have revolutionized every aspect of our lives. Yet in science, the most tantalizing applications of computing lie just beyond our reach. The current quest to build an exascale computer with one thousand times the capability of today’s fastest machines (and more than a million times that of a laptop) will take researchers over the next horizon. The field of materials, chemical reactions, and compounds is inherently complex. Imagine millions of new materials with new functionalities waiting to be discovered — while researchers also seek to extend those materials that are known to a dizzying number of new forms. We could translate massive amounts of data from high precision experiments into new understanding through data mining and analysis. We could have at our disposal the ability to predict the properties of these materials, to follow their transformations during reactions on an atom-by-atom basis, and to discover completely new chemical pathways or physical states of matter. Extending these predictions from the nanoscale to the mesoscale, from the ultrafast world of reactions to long-time simulations to predict the lifetime performance of materials, and to the discovery of new materials and processes will have a profound impact on energy technology. In addition, discovery of new materials is vital to move computing beyond Moore’s law. To realize this vision, more than hardware is needed. New algorithms to take advantage of the increase in computing power, new programming paradigms, and new ways of mining massive data sets are needed as well. This report summarizes the opportunities and the requisite computing ecosystem needed to realize the potential before us. In addition to pursuing new and more complete physical models and theoretical frameworks, this review found that the following broadly grouped areas relevant to the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR) would directly affect the Basic Energy Sciences (BES) mission need. Simulation, visualization, and data analysis are crucial for advances in energy science and technology. Revolutionary mathematical, software, and algorithm developments are required in all areas of BES science to take advantage of exascale computing architectures and to meet data analysis, management, and workflow needs. In partnership with ASCR, BES has an emerging and pressing need to develop new and disruptive capabilities in data science. More capable and larger high-performance computing (HPC) and data ecosystems are required to support priority research in BES. Continued success in BES research requires developing the next-generation workforce through education and training and by providing sustained career opportunities.
Cellular intelligence: Microphenomenology and the realities of being.
Ford, Brian J
2017-12-01
Traditions of Eastern thought conceptualised life in a holistic sense, emphasising the processes of maintaining health and conquering sickness as manifestations of an essentially spiritual principle that was of overriding importance in the conduct of living. Western science, which drove the overriding and partial eclipse of Eastern traditions, became founded on a reductionist quest for ultimate realities which, in the modern scientific world, has embraced the notion that every living process can be successfully modelled by a digital computer system. It is argued here that the essential processes of cognition, response and decision-making inherent in living cells transcend conventional modelling, and microscopic studies of organisms like the shell-building amoebae and the rhodophyte alga Antithamnion reveal a level of cellular intelligence that is unrecognized by science and is not amenable to computer analysis. Copyright © 2017. Published by Elsevier Ltd.
GW Calculations of Materials on the Intel Xeon-Phi Architecture
NASA Astrophysics Data System (ADS)
Deslippe, Jack; da Jornada, Felipe H.; Vigil-Fowler, Derek; Biller, Ariel; Chelikowsky, James R.; Louie, Steven G.
Intel Xeon-Phi processors are expected to power a large number of High-Performance Computing (HPC) systems around the United States and the world in the near future. We evaluate the ability of GW and prerequisite Density Functional Theory (DFT) calculations for materials to utilize the Xeon-Phi architecture. We describe the optimization process and performance improvements achieved. We find that the GW method, like other higher-level many-body methods beyond standard local/semilocal approximations to Kohn-Sham DFT, is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism over plane-waves, band-pairs and frequencies. Support provided by the SciDAC program, Department of Energy, Office of Science, Advanced Scientific Computing Research and Basic Energy Sciences. Grant Numbers DE-SC0008877 (Austin) and DE-AC02-05CH11231 (LBNL).
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Juergen
2002-02-01
In 2004, the European COLUMBUS Module is to be attached to the International Space Station. On the way to the successful planning, deployment and operation of the module, computer generated and animated models are being used to optimize performance. Under contract of the German Space Agency DLR, it has become IRF's task to provide a Projective Virtual Reality System: a virtual world built after the planned layout of the COLUMBUS module that lets astronauts and experimenters practice operational procedures and the handling of experiments. The key features of the system currently being realized comprise the possibility for distributed multi-user access to the virtual lab and the visualization of real-world experiment data. Through the capabilities to share the virtual world, cooperative operations can be practiced easily, and trainers and trainees can also work together more effectively by sharing the virtual environment. The capability to visualize real-world data will be used to introduce measured experiment data into the virtual world online in order to interact realistically with the science reference model hardware: the user's actions in the virtual world are translated into corresponding changes of the inputs of the science reference model hardware; the measured data is then in turn fed back into the virtual world. During the operation of COLUMBUS, the capabilities for distributed access and for visualizing measured data through the use of metaphors and augmentations of the virtual world may be used to provide virtual access to the COLUMBUS module, e.g. via the Internet. Currently, finishing touches are being put to the system. In November 2001 the virtual world shall be operational, so that besides the design and the key ideas, first experimental results can be presented.
Auroras and Space Weather Celebrating the International Heliophysics Year in Classroom
NASA Astrophysics Data System (ADS)
Craig, N.; Peticolas, L. M.; Angelopoulos, V.; Thompson, B.
2007-05-01
2007 celebrates the International Heliophysics Year, and its outreach has a primary objective: to "demonstrate the beauty, relevance and significance of Space and Earth Science to the world." NASA's first five-satellite mission, THEMIS (Time History of Events and Macroscale Interactions during Substorms), was launched on February 17, 2007 to investigate a key mystery surrounding the dynamics of the auroras: when, where, and how are they triggered? When the five probes align perfectly over the North American continent, every four days, 20 ground stations in northern Canada and Alaska with automated all-sky cameras will document the auroras from Earth. To monitor the large-scale local effects of the currents in space, the THEMIS Education and Outreach program has installed 10 ground magnetometers, instruments that measure Earth's magnetic field, in competitively selected rural schools around the country, which receive the data. The THEMIS Education and Outreach Program shares the IHY objective by bringing this live local space weather data into classrooms and engaging teachers and students in authentic research. The data are displayed on the school computer monitors as well as on the THEMIS E/PO website, providing the local data to the science mission as well as to schools. Teachers use the data to teach about the aurora not only in math and science, but also in Earth science, history and art. These students and their teachers are our ambassadors to rural America and share the excitement of learning and teaching with their regional teachers. We will share how authentic space science data related to Earth's magnetic field and auroras can be understood, researched, predicted and shared via the internet with any school around the globe that wishes to be part of tracking solar storms. Complementing IHY, World Space Week will take place from October 4-10 this year. World Space Week is "an international celebration of science and technology, and their contribution to the betterment of the human condition." THEMIS will take part in World Space Week as a featured science mission, with its education program contributing materials to the project so that students around the world can learn more about Earth's magnetic field, magnetic storms and substorms, and beautiful auroras. To facilitate the use of some of our magnetism materials around the world, we will provide some of our activities in German and Spanish on the web.
ERIC Educational Resources Information Center
Thomas, Ally
2016-01-01
With the advent of the newly developed Common Core State Standards and the Next Generation Science Standards, innovative assessments, including technology-enhanced items and tasks, will be needed to meet the challenges of developing valid and reliable assessments in a world of computer-based testing. In a recent critique of the next generation…
Gregarious Convection and Radiative Feedbacks in Idealized Worlds
2016-08-29
...exist neither on the globe nor within the cloud model. Since mesoscales impose great computational costs on atmosphere models, as well as inconven... Atmospheric Science, University of Miami, Miami, Florida, USA. Abstract: What role does convection play in cloud feedbacks? What role does convective... cloud fields depends systematically on global temperature, then convective organization could be a climate system feedback. How reconcilable and how...
International Symposium on Grids and Clouds (ISGC) 2014
NASA Astrophysics Data System (ADS)
The International Symposium on Grids and Clouds (ISGC) 2014 will be held at Academia Sinica in Taipei, Taiwan from 23-28 March 2014, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). “Bringing the data scientist to global e-Infrastructures” is the theme of ISGC 2014. The last decade has seen phenomenal growth in the production of data in all forms by all research communities, producing a deluge of data from which information and knowledge need to be extracted. Key to this success will be the data scientist - educated to use advanced algorithms, applications and infrastructures - collaborating internationally to tackle society’s challenges. ISGC 2014 will bring together researchers working in all aspects of data science from different disciplines around the world to collaborate and educate themselves in the latest achievements and techniques being used to tackle the data deluge. In addition to the regular workshops, technical presentations and plenary keynotes, ISGC this year will focus on how to grow the data science community by considering the educational foundation needed for tomorrow’s data scientist. Topics of discussion include Physics (including HEP) and Engineering Applications, Biomedicine & Life Sciences Applications, Earth & Environmental Sciences & Biodiversity Applications, Humanities & Social Sciences Applications, Virtual Research Environment (including Middleware, tools, services, workflow, etc.), Data Management, Big Data, Infrastructure & Operations Management, Infrastructure Clouds and Virtualisation, Interoperability, Business Models & Sustainability, Highly Distributed Computing Systems, and High Performance & Technical Computing (HPTC).
Joined-up Planetary Information, in the Cloud and on Devices.
NASA Astrophysics Data System (ADS)
Smith, M. J.; Emmott, S.; Purves, D. W.; Joppa, L. N.; Lyutsarev, V.
2014-12-01
In scientific research and development, emphasis is placed on research over development. A significant cost is that the two-way interaction between scientific insights and societal needs does not function effectively to lead to impacts in the wider world. We simply must embrace new software and hardware approaches if we are to provide timely predictive information to address global problems, support businesses and inform governments and citizens. The Microsoft Research Computational Science Lab has been pioneering research into software and methodologies to provide useful and usable new environmental information. Our approach has been very joined-up: from accelerating data acquisition from the field with remote sensor technology, targeted data collection and citizen science, to enabling process-based modelling using multiple heterogeneous data sets in the cloud and enabling the resulting planetary information to be accessed from any device. This talk will demonstrate some of the specific research and development we are doing to accelerate the pace at which important science has impact on the wider world and will emphasise the important insights gained from advancing the research and development together.
Hafner, Jürgen
2010-09-29
During the last 20 years computer simulations based on a quantum-mechanical description of the interactions between electrons and atomic nuclei have had an increasingly important impact on materials science, not only in promoting a deeper understanding of the fundamental physical phenomena, but also in enabling the computer-assisted design of materials for future technologies. The backbone of atomic-scale computational materials science is density-functional theory (DFT), which allows us to cast the intractable complexity of electron-electron interactions into the form of an effective single-particle equation determined by the exchange-correlation functional. Progress in DFT-based calculations of the properties of materials and of simulations of processes in materials depends on: (1) the development of improved exchange-correlation functionals and advanced post-DFT methods and their implementation in highly efficient computer codes, (2) the development of methods allowing us to bridge the gaps in the temperature, pressure, time and length scales between the ab initio calculations and real-world experiments and (3) the extension of the functionality of these codes, permitting us to treat additional properties and new processes. In this paper we discuss the current status of techniques for performing quantum-based simulations on materials and present some illustrative examples of applications to complex quasiperiodic alloys, cluster-support interactions in microporous acid catalysts and magnetic nanostructures.
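As a concrete reminder of what "cast into the form of an effective single-particle equation" means, the standard Kohn-Sham equation can be written as below. This is a textbook form, not notation taken from the paper itself.

```latex
\Bigl[ -\tfrac{\hbar^{2}}{2m}\nabla^{2} + v_{\mathrm{ext}}(\mathbf{r})
      + v_{\mathrm{H}}[n](\mathbf{r}) + v_{\mathrm{xc}}[n](\mathbf{r}) \Bigr]\,
\psi_{i}(\mathbf{r}) = \varepsilon_{i}\,\psi_{i}(\mathbf{r}),
\qquad
n(\mathbf{r}) = \sum_{i}^{\mathrm{occ}} |\psi_{i}(\mathbf{r})|^{2},
```

where all many-body effects are folded into the exchange-correlation potential v_xc[n], the functional whose improvement the paper identifies as the first prerequisite for progress.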
The emergence of mind and brain: an evolutionary, computational, and philosophical approach.
Mainzer, Klaus
2008-01-01
Modern philosophy of mind cannot be understood without recent developments in computer science, artificial intelligence (AI), robotics, neuroscience, biology, linguistics, and psychology. Classical philosophy of formal languages as well as symbolic AI assume that all kinds of knowledge must explicitly be represented by formal or programming languages. This assumption is limited by recent insights into the biology of evolution and developmental psychology of the human organism. Most of our knowledge is implicit and unconscious. It is not formally represented, but embodied knowledge, which is learnt by doing and understood by bodily interacting with changing environments. That is true not only for low-level skills, but even for high-level domains of categorization, language, and abstract thinking. The embodied mind is considered an emergent capacity of the brain as a self-organizing complex system. Actually, self-organization has been a successful strategy of evolution to handle the increasing complexity of the world. Genetic programs are not sufficient and cannot prepare the organism for all kinds of complex situations in the future. Self-organization and emergence are fundamental concepts in the theory of complex dynamical systems. They are also applied in organic computing as a recent research field of computer science. Therefore, cognitive science, AI, and robotics try to model the embodied mind in an artificial evolution. The paper analyzes these approaches in the interdisciplinary framework of complex dynamical systems and discusses their philosophical impact.
Parallel Tensor Compression for Large-Scale Scientific Data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolda, Tamara G.; Ballard, Grey; Austin, Woody Nathan
As parallel computing trends towards the exascale, scientific data produced by high-fidelity simulations are growing increasingly massive. For instance, a simulation on a three-dimensional spatial grid with 512 points per dimension that tracks 64 variables per grid point for 128 time steps yields 8 TB of data. By viewing the data as a dense five-way tensor, we can compute a Tucker decomposition to find inherent low-dimensional multilinear structure, achieving compression ratios of up to 10000 on real-world data sets with negligible loss in accuracy. So that we can operate on such massive data, we present the first-ever distributed-memory parallel implementation for the Tucker decomposition, whose key computations correspond to parallel linear algebra operations, albeit with nonstandard data layouts. Our approach specifies a data distribution for tensors that avoids any tensor data redistribution, either locally or in parallel. We provide accompanying analysis of the computation and communication costs of the algorithms. To demonstrate the compression and accuracy of the method, we apply our approach to real-world data sets from combustion science simulations. We also provide detailed performance results, including parallel performance in both weak and strong scaling experiments.
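To make the storage arithmetic concrete, the short Python sketch below (our illustration; the grid and variable counts come from the abstract, but the Tucker ranks are invented purely for the example) compares the dense five-way tensor with a Tucker approximation, i.e. a small core tensor plus one factor matrix per mode.

```python
# Sketch: raw size of a dense 5-way tensor vs. the storage of a Tucker
# approximation with given multilinear ranks (ranks here are illustrative).
def dense_bytes(dims, bytes_per_value=8):
    n = 1
    for d in dims:
        n *= d
    return n * bytes_per_value

def tucker_bytes(dims, ranks, bytes_per_value=8):
    core = 1
    for r in ranks:
        core *= r                                # core tensor entries
    factors = sum(d * r for d, r in zip(dims, ranks))  # one factor matrix per mode
    return (core + factors) * bytes_per_value

dims = (512, 512, 512, 64, 128)   # grid x grid x grid x variables x time steps
raw = dense_bytes(dims)           # ~8.8e12 bytes, i.e. roughly the 8 TB in the abstract
approx = tucker_bytes(dims, (50, 50, 50, 20, 20))  # assumed ranks, not from the paper
print(raw / 1e12, raw / approx)   # raw terabytes and the resulting compression ratio
```

Running this prints roughly 8.8 (terabytes of raw data) and a compression ratio on the order of 10^4 for the assumed ranks, in the same ballpark as the ratios the abstract reports.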
NASA Astrophysics Data System (ADS)
Helbing, D.; Bishop, S.; Conte, R.; Lukowicz, P.; McCarthy, J. B.
2012-11-01
We have built particle accelerators to understand the forces that make up our physical world. Yet, we do not understand the principles underlying our strongly connected, techno-socio-economic systems. We have enabled ubiquitous Internet connectivity and instant, global information access. Yet we do not understand how it impacts our behavior and the evolution of society. To fill the knowledge gaps and keep up with the fast pace at which our world is changing, a Knowledge Accelerator must urgently be created. The financial crisis, international wars, global terror, the spreading of diseases and cyber-crime as well as demographic, technological and environmental change demonstrate that humanity is facing serious challenges. These problems cannot be solved within the traditional paradigms. Moving our attention from a component-oriented view of the world to an interaction-oriented view will allow us to understand the complex systems we have created and the emergent collective phenomena characterising them. This paradigm shift will enable new solutions to long-standing problems, very much as the shift from a geocentric to a heliocentric worldview has facilitated modern physics and the ability to launch satellites. The FuturICT flagship project will develop new science and technology to manage our future in a complex, strongly connected world. For this, it will combine the power of information and communication technology (ICT) with knowledge from the social and complexity sciences. ICT will provide the data to boost the social sciences into a new era. Complexity science will shed new light on the emergent phenomena in socially interactive systems, and the social sciences will provide a better understanding of the opportunities and risks of strongly networked systems, in particular future ICT systems. Hence, the envisaged FuturICT flagship will create new methods and instruments to tackle the challenges of the 21st century. FuturICT could indeed become one of the most important scientific endeavours ever, by revealing the principles that make socially interactive systems work well, by inspiring the creation of new platforms to explore our possible futures, and by initiating an era of social and socio-inspired innovations.
Empowering the impaired through the appropriate use of Information Technology and Internet.
Sanyal, Ishita
2006-01-01
Developments in the fields of science and technology have revolutionized human life at the material level. But in actuality, this progress is only superficial: underneath, modern men and women are living in conditions of great mental and emotional stress, even in developed and affluent countries. People all over the world, irrespective of culture and economic background, suffer from mental illness, and though a great deal of research has been carried out worldwide, to date it has not been possible to resolve the problem. In today's world stress is increasing every day. The individualistic approach towards life and the nuclear family system have increased the burden even further. Without an adequate support system of friends and relatives, people are falling prey to mental illness. The insecurities and inferiority feelings of these persons lead to a disruption of communication between the sufferer and family members and friends. The sufferers prefer to confine themselves within the four walls of their home and remain withdrawn from the whole world. They prefer to stay in touch with their world of fantasy, far away from the world of reality. Disability caused by some mental illnesses often remains invisible to society, leading to a lack of support systems and facilities for the sufferers. These unfortunate disabled persons need not only medication and counseling but a thorough rehabilitation programme to bring them back to the mainstream of life, and the task is not an easy one. According to research, these persons need some work and income to improve their quality of life. In this scenario, where society is adverse towards them, where stigma towards mental illness prevails, and where help from friends and community is not available, training them in computer use and forming groups through computers was thought to be an ideal option: a solution to the problems of modern life through modern technology. * It was seen that these insecure, disabled persons feel free to experiment with a machine more easily than with society and people. * Computers provide them the education and information needed for their further development. * Computers provide them facilities to interact with others and form self-help groups. * Computers also enable them to earn their livelihood. Thus this modern gadget, which is sometimes believed to make a person a loner, has actually been acting as the bridge between persons suffering from mental illness and society in general. The disabled persons also gain confidence and courage as they gain control over the machine; gaining control over the machine helps them to gain control over their lives. The products of science and technology have thus revolutionized human life not only at the material level but also at the personal level, helping the disabled to gain control over their lives.
ISCR Annual Report: Fiscal Year 2004
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGraw, J R
2005-03-03
Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that "high performance computing is the backbone of the nation's science and technology enterprise". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "feet and hands" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
NASA Astrophysics Data System (ADS)
de Groot, R.
2008-12-01
The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.
The Backyard Worlds: Planet 9 Citizen Science Project
NASA Astrophysics Data System (ADS)
Faherty, Jacqueline K.; Kuchner, Marc; Schneider, Adam; Meisner, Aaron; Gagné, Jonathan; Filippazzo, Joseph; Trouille, Laura; Backyard Worlds: Planet 9 Collaboration
2018-01-01
In February of 2017 our team launched a new citizen science project entitled Backyard Worlds: Planet 9 to scan the cosmos for fast-moving stars, brown dwarfs, and even planets. This Zooniverse website, BackyardWorlds.org, invites anyone with a computer or smartphone to flip through WISE images taken over a several-year baseline and mark any point source that appears to move. This “blinking technique” is the same one Clyde Tombaugh used to discover Pluto over 80 years ago. In the first few days of our program we recruited over 30,000 volunteers. After three-quarters of a year with the program we have completed 30% of the sky, and our participants have identified several hundred candidate movers. These include (1) over 20 candidate Y-type brown dwarfs, (2) a handful of new co-moving systems containing a previously unidentified low mass object and a known nearby star, (3) over 100 previously missed M dwarfs, and (4) more than 200 candidate L and T brown dwarfs, many of which occupy outlier positions on reduced proper motion diagrams. Our first publication credited four citizen scientists as co-authors. The Backyard Worlds: Planet 9 project is both scientifically fruitful and empowering for any mind across the globe that has ever wanted to participate in a discovery-driven astronomy research project.
NASA Technical Reports Server (NTRS)
Memarsadeghi, Nargess
2015-01-01
Scientists and engineers constantly face new challenges, despite myriad advances in computing. More data are collected today from Earth and sky than there are time or resources available to analyze them carefully. Some problems either don't have fast algorithms to solve them or have solutions that must be found among millions of options, a situation akin to finding a needle in a haystack. But all hope is not lost: advances in technology and the Internet have empowered the general public to participate in the scientific process via individual computational resources and human cognition, which isn't matched by any machine. Citizen scientists are volunteers who perform scientific work by making observations, collecting and disseminating data, making measurements, and analyzing or interpreting data without necessarily having any scientific training. In so doing, individuals from all over the world can contribute to science in ways that wouldn't otherwise have been possible.
NASA Astrophysics Data System (ADS)
Fraser, Gordon
2006-04-01
Introduction Gordon Fraser; Part I. Matter and the Universe: 1. Cosmology Wendy Freedman and Rocky Kolb; 2. Gravity Ronald Adler; 3. Astrophysics Arnon Dar; 4. Particles and the standard model Chris Quigg; 5. Superstrings Michael Green; Part II. Quantum Matter: 6. Atoms and photons Claude Cohen-Tannoudji and Jean Dalibard; 7. The quantum world of ultra-cold atoms Christopher Foot and William Phillips; 8. Superfluidity Henry Hall; 9. Quantum phase transitions Subir Sachdev; Part III. Quanta in Action: 10. Quantum entanglement Anton Zeilinger; 11. Quanta, ciphers and computers Artur Ekert; 12. Small-scale structure and nanoscience Yoseph Imry; Part IV. Calculation and Computation: 13. Nonlinearity Henry Abarbanel; 14. Complexity Antonio Politi; 15. Collaborative physics, e-science and the grid Tony Hey and Anne Trefethen; Part V. Science in Action: 16. Biophysics Cyrus Safinya; 17. Medical physics Nicolaj Pavel; 18. Physics and materials Robert Cahn; 19. Physics and society Ugo Amaldi.
NASA Astrophysics Data System (ADS)
Fraser, Gordon
2009-08-01
Introduction Gordon Fraser; Part I. Matter and the Universe: 1. Cosmology Wendy Freedman and Rocky Kolb; 2. Gravity Ronald Adler; 3. Astrophysics Arnon Dar; 4. Particles and the standard model Chris Quigg; 5. Superstrings Michael Green; Part II. Quantum Matter: 6. Atoms and photons Claude Cohen-Tannoudji and Jean Dalibard; 7. The quantum world of ultra-cold atoms Christopher Foot and William Phillips; 8. Superfluidity Henry Hall; 9. Quantum phase transitions Subir Sachdev; Part III. Quanta in Action: 10. Quantum entanglement Anton Zeilinger; 11. Quanta, ciphers and computers Artur Ekert; 12. Small-scale structure and nanoscience Yoseph Imry; Part IV. Calculation and Computation: 13. Nonlinearity Henry Abarbanel; 14. Complexity Antonio Politi; 15. Collaborative physics, e-science and the grid Tony Hey and Anne Trefethen; Part V. Science in Action: 16. Biophysics Cyrus Safinya; 17. Medical physics Nicolaj Pavel; 18. Physics and materials Robert Cahn; 19. Physics and society Ugo Amaldi.
ASCR Cybersecurity for Scientific Computing Integrity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piesert, Sean
The Department of Energy (DOE) has the responsibility to address the energy, environmental, and nuclear security challenges that face our nation. Much of DOE’s enterprise involves distributed, collaborative teams; a significant fraction involves “open science,” which depends on multi-institutional, often international collaborations that must access or share significant amounts of information between institutions and over networks around the world. The mission of the Office of Science is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security of the United States. The ability of DOE to execute its responsibilities depends critically on its ability to assure the integrity and availability of scientific facilities and computer systems, and of the scientific, engineering, and operational software and data that support its mission.
Politics, the media and science in HIV/AIDS: the peril of pseudoscience.
Makgoba, Malegapuru W
2002-05-06
The microchip, the computer and the DNA revolution have brought the questions of ethics, counselling and equitable research to the fore. The new world order is a world of: equity; human rights; human dignity; the alleviation of poverty; closing the gap between the "haves and have nots". The social and economic impact and implications of these have opened a new dialogue between the professions and the laypersons in order to address matters of rights, ethics and power relationships in health research that is unprecedented in history. The yearning need for science to be understood by the public; the need for scientists to communicate better; the need for the public to make choices about what science has to offer in their daily life; the need for the public to participate and shape the scientific process; the need for science to integrate the wealth of information that is already existent has never been greater than today. Perhaps no examples illustrate these challenges better than the revolution in biology (the Human Genome Project and embryo stem cell research/therapy) and the human immunodeficiency virus (HIV)/AIDS epidemic that is sweeping sub-Saharan Africa (1). The way we teach, learn and practice science will no longer be the same. It will no longer be business as usual. It is unfortunately also within this context that pseudoscience is likely to flourish (2).
Impact of Multimedia and Network Services on an Introductory Level Course
NASA Technical Reports Server (NTRS)
Russ, John C.
1996-01-01
We will demonstrate and describe the impact of our use of multimedia and network connectivity on a sophomore-level introductory course in materials science. This class serves all engineering students, resulting in large class sections (more than 150 students) with no hands-on laboratory. In 1990 we began to develop computer graphics that might substitute for some laboratory or real-world experiences, and demonstrate relationships hard to show with static textbook images or chalkboard drawings. We created a comprehensive series of modules that cover the entire course content. Called VIMS (Visualizations in Materials Science), these are available in the form of a CD-ROM and also via the internet.
Computational Earth Science: Big Data Transformed Into Insight
NASA Astrophysics Data System (ADS)
Sellars, Scott; Nguyen, Phu; Chu, Wei; Gao, Xiaogang; Hsu, Kuo-lin; Sorooshian, Soroosh
2013-08-01
More than ever in the history of science, researchers have at their fingertips an unprecedented wealth of data from continuously orbiting satellites, weather monitoring instruments, ecological observatories, seismic stations, moored buoys, floats, and even model simulations and forecasts. With just an internet connection, scientists and engineers can access atmospheric and oceanic gridded data and time series observations, seismographs from around the world, minute-by-minute conditions of the near-Earth space environment, and other data streams that provide information on events across local, regional, and global scales. These data sets have become essential for monitoring and understanding the associated impacts of geological and environmental phenomena on society.
NASA Technical Reports Server (NTRS)
Davis, Bruce E.; Elliot, Gregory
1989-01-01
Jackson State University recently established the Center for Spatial Data Research and Applications, a Geographical Information System (GIS) and remote sensing laboratory. Taking advantage of new technologies and new directions in the spatial (geographic) sciences, JSU is building a Center of Excellence in Spatial Data Management. New opportunities for research, applications, and employment are emerging. GIS requires fundamental shifts and new demands in traditional computer science and geographic training. The Center is not merely another computer lab but one that is setting the pace in a new applied frontier. GIS and its associated technologies are discussed. The Center's facilities are described. An ARC/INFO GIS runs on a VAX mainframe, with numerous workstations. Image processing packages include ELAS, LIPS, VICAR, and ERDAS. A host of hardware and software peripherals are used in support. Numerous projects are underway, such as the construction of a Gulf of Mexico environmental database, development of AI in image processing, a land use dynamics study of metropolitan Jackson, and others. A new academic interdisciplinary program in Spatial Data Management is under development, combining courses in Geography and Computer Science. The broad range of JSU's GIS and remote sensing activities is addressed. The impacts on changing paradigms in the university and in the professional world conclude the discussion.
NASA Astrophysics Data System (ADS)
Liu, Wei; Ming, Meng; Lu, Ye; Jin, Wei
2016-04-01
The world's mountains host some of the most complex, dynamic, and diverse ecosystems and are also hotspots for natural disasters, such as earthquakes, landslides and floods. One factor that limits mountain communities' ability to recover from disasters and pursue sustainable development is the lack of locally relevant scientific knowledge, which is hard to gain from global and regional scale observations and models. The rapid advances in ICT, computing, and communication technologies and the emergence of citizen science are changing the situation. Here we report a case from the Sichuan Giant Panda Sanctuary World Natural Heritage site in China on the application of citizen science in a community reconstruction project. Dahe, a mountainous community (ca. 8000 ha in size), covers part of the World Heritage site's core and buffer zones, with an elevation range of 1000-3000 meters. The community suffered from two major earthquakes of 7.9 and 6.9 Mw in 2008 and 2013, respectively. Landslides and flooding threaten the community and significantly limit their livelihood options. We integrated participatory disaster risk mapping (e.g., community vulnerability and capacity assessment) and mobile-assisted natural hazards and natural resources mapping (e.g., using the free app GeoODK) into more conventional community reconstruction and livelihood building activities. We showed that better decisions are made based on results from these activities and that local residents have a high level of buy-in to this new knowledge. We suggest that initiatives like this, if successfully scaled up, can also help generate much needed data and knowledge in similarly less-developed and data-deficient regions of the world.
NASA Astrophysics Data System (ADS)
Klopfer, Eric; Scheintaub, Hal; Huang, Wendy; Wendel, Daniel
Computational approaches to science are radically altering the nature of scientific investigation. Yet these computer programs and simulations are sparsely used in science education, and when they are used, they are typically “canned” simulations which are black boxes to students. StarLogo The Next Generation (TNG) was developed to make programming of simulations more accessible for students and teachers. StarLogo TNG builds on the StarLogo tradition of agent-based modeling for students and teachers, with the added features of a graphical programming environment and a three-dimensional (3D) world. The graphical programming environment reduces the learning curve of programming, especially syntax. The 3D graphics make for a more immersive and engaging experience for students, including making it easy to design and program their own video games. Another change to StarLogo TNG is a fundamental restructuring of the virtual machine to make it more transparent. As a result of these changes, classroom use of TNG is expanding to new areas. The chapter concludes with a description of field tests conducted in middle and high school science classes.
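StarLogo TNG itself is a graphical, blocks-based environment, but the underlying agent-based idea can be sketched in a few lines of conventional code. The Python fragment below is only an illustrative analogue (the class, rules, and parameters are ours, not StarLogo's): each agent carries a little state and an update rule, and the world simply asks every agent to step.

```python
import random

# Illustrative agent-based model in the spirit of StarLogo-style "turtles":
# each agent wanders randomly and "infects" nearby agents, a classic
# classroom epidemic model (names and rules are our own, not StarLogo's).
class Turtle:
    def __init__(self, infected=False):
        self.x, self.y = random.uniform(0, 50), random.uniform(0, 50)
        self.infected = infected

    def step(self, others):
        # Random walk.
        self.x += random.uniform(-1, 1)
        self.y += random.uniform(-1, 1)
        # Spread infection to any agent within unit distance on both axes.
        if self.infected:
            for other in others:
                if abs(other.x - self.x) < 1 and abs(other.y - self.y) < 1:
                    other.infected = True

turtles = [Turtle(infected=(i == 0)) for i in range(200)]
for tick in range(100):            # run the world for 100 ticks
    for t in turtles:
        t.step(turtles)
print(sum(t.infected for t in turtles), "of", len(turtles), "agents infected")
```

The global pattern (how fast the infection spreads) emerges from purely local agent rules, which is exactly the style of reasoning agent-based tools like StarLogo are meant to make visible to students.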
Advances in Cross-Cutting Ideas for Computational Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ng, Esmond; Evans, Katherine J.; Caldwell, Peter
This report presents results from the DOE-sponsored workshop titled "Advancing X-Cutting Ideas for Computational Climate Science Workshop," known as AXICCS, held on September 12-13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Emerging from discussions at the workshop were several research opportunities that the group felt could advance climate science significantly. These include (1) process-resolving models to provide insight into important processes and features of interest and inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity yet complexity, and (4) treating Earth system models as one interconnected organism without numerical or data based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for enabling breakthrough climate simulation advancements also need the "glue" of outreach and learning across the scientific domains to be successful. The workshop identified several strategies to allow productive, continuous engagement across those who have a broad knowledge of the various angles of the problem. Specific ideas to foster education and tools to make material progress were discussed. Examples include follow-on cross-cutting meetings that enable unstructured discussions of the types this workshop fostered. A concerted effort to recruit undergraduate and graduate students from all relevant domains and provide them experience, training, and networking across their immediate expertise is needed. This will broaden and expand their exposure to the future needs and solutions, and provide a pipeline of scientists with a diversity of knowledge and know-how. Providing real-world experience with subject matter experts from multiple angles may also motivate the students to attack these problems and even come up with the missing solutions.
The Fate of the World is in your hands: computer gaming for multi-faceted climate change education
NASA Astrophysics Data System (ADS)
Bedford, D. P.
2015-12-01
Climate change is a multi-faceted (or 'wicked') problem. True climate literacy therefore requires understanding not only the workings of the climate system, but also the current and potential future impacts of climate change and sea level rise on individuals, communities and countries around the world, as noted in the US Global Change Research Program's (2009) Climate Literacy: The Essential Principles of Climate Sciences. The asymmetric nature of climate change impacts, whereby the world's poorest countries have done the least to cause the problem but will suffer disproportionate consequences, has also been widely noted. Education in climate literacy therefore requires an element of ethics in addition to physical and social sciences. As if addressing these multiple aspects of climate change were not challenging enough, polling data has repeatedly shown that many members of the public tend to see climate change as a far away problem affecting people remote from them at a point in the future, but not themselves. This perspective is likely shared by many students. Computer gaming provides a possible solution to the combined problems of, on the one hand, addressing the multi-faceted nature of climate change, and, on the other hand, making the issue real to students. Fate of the World, a game produced by the company Red Redemption, has been used on several occasions in a small (20-30 students) introductory level general education course on global warming at Weber State University. Players are required to balance difficult decisions about energy investment while managing regional political disputes and attempting to maintain minimum levels of development in the world's poorer countries. By providing a realistic "total immersion" experience, the game has the potential to make climate change issues more immediate to players, and presents them with the ethical dilemmas inherent in climate change. This presentation reports on the use of Fate of the World in an educational setting, highlighting student experiences and lessons learned from two attempts to use the game as a tool for teaching the multi-faceted nature of climate change.
How robotics programs influence young women's career choices: a grounded theory model
NASA Astrophysics Data System (ADS)
Craig, Cecilia Dosh-Bluhm
The fields of engineering, computer science, and physics have a paucity of women despite decades of intervention by universities and organizations. Women's graduation rates in these fields continue to stagnate, posing a critical problem for society. This qualitative grounded theory (GT) study sought to understand how robotics programs influenced young women's career decisions and the program's effect on engineering, physics, and computer science career interests. To test this, a study was mounted to explore how the FIRST (For Inspiration and Recognition of Science and Technology) Robotics Competition (FRC) program influenced young women's college major and career choices. Career theories suggested that experiential programs coupled with supportive relationships strongly influence career decisions, especially for science, technology, engineering, and mathematics careers. The study explored how and when young women made career decisions and how the experiential program and its mentors and role models influenced career choice. Online focus groups and interviews (online and face-to-face) with 10 female FRC alumnae and GT processes (inductive analysis, open coding, categorizations using mind maps and content clouds) were used to generate a general systems theory style model of the career decision process for these young women. The study identified gender stereotypes and other career obstacles for women. The study's conclusions include recommendations to foster connections to real-world challenges, to develop training programs for mentors, and to nurture social cohesion, a mostly untapped area. Implementing these recommendations could help grow a critical mass of women in engineering, physics, and computer science careers, a social change worth pursuing.
Designing, programming, and optimizing a (small) quantum computer
NASA Astrophysics Data System (ADS)
Svore, Krysta
In 1982, Richard Feynman proposed to use a computer founded on the laws of quantum physics to simulate physical systems. In the more than thirty years since, quantum computers have shown promise to solve problems in number theory, chemistry, and materials science that would otherwise take longer than the lifetime of the universe to solve on an exascale classical machine. The practical realization of a quantum computer requires understanding and manipulating subtle quantum states while experimentally controlling quantum interference. It also requires an end-to-end software architecture for programming, optimizing, and implementing a quantum algorithm on the quantum device hardware. In this talk, we will introduce recent advances in connecting abstract theory to present-day real-world applications through software. We will highlight recent advancement of quantum algorithms and the challenges in ultimately performing a scalable solution on a quantum device.
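As a flavor of what programming a quantum algorithm looks like at the lowest level, the sketch below simulates a two-qubit circuit (a Hadamard followed by a CNOT, producing a Bell state) directly on the state vector with NumPy. This is our own toy illustration, not Microsoft's software stack or any particular quantum programming language.

```python
import numpy as np

# Toy state-vector simulation of a 2-qubit circuit: H on qubit 0, then CNOT.
# Purely illustrative; real toolchains compile such circuits to hardware operations.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # control = qubit 0, target = qubit 1

state = np.zeros(4)
state[0] = 1.0                                  # start in |00>
state = np.kron(H, I) @ state                   # apply H to the first qubit
state = CNOT @ state                            # entangle the two qubits

probs = np.abs(state) ** 2                      # measurement probabilities
for basis, p in zip(["00", "01", "10", "11"], probs):
    print(f"P(|{basis}>) = {p:.2f}")            # ~0.50 for |00> and |11>
```

The point of an end-to-end architecture of the kind the talk describes is that a programmer writes at roughly this level of abstraction (gates acting on qubits) while the software stack handles optimization and mapping onto the physical device.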
Archiving and access systems for remote sensing: Chapter 6
Faundeen, John L.; Percivall, George; Baros, Shirley; Baumann, Peter; Becker, Peter H.; Behnke, J.; Benedict, Karl; Colaiacomo, Lucio; Di, Liping; Doescher, Chris; Dominguez, J.; Edberg, Roger; Ferguson, Mark; Foreman, Stephen; Giaretta, David; Hutchison, Vivian; Ip, Alex; James, N.L.; Khalsa, Siri Jodha S.; Lazorchak, B.; Lewis, Adam; Li, Fuqin; Lymburner, Leo; Lynnes, C.S.; Martens, Matt; Melrose, Rachel; Morris, Steve; Mueller, Norman; Navale, Vivek; Navulur, Kumar; Newman, D.J.; Oliver, Simon; Purss, Matthew; Ramapriyan, H.K.; Rew, Russ; Rosen, Michael; Savickas, John; Sixsmith, Joshua; Sohre, Tom; Thau, David; Uhlir, Paul; Wang, Lan-Wei; Young, Jeff
2016-01-01
The chapter focuses on major developments inaugurated by the Committee on Earth Observation Satellites, the Group on Earth Observations System of Systems, and the International Council for Science World Data System at the global level; on initiatives at national levels to create data centers (e.g., the National Aeronautics and Space Administration (NASA) Distributed Active Archive Centers and other international space agency counterparts); and on non-government systems (e.g., the Center for International Earth Science Information Network). Other major elements cover emerging tool sets, requirements for metadata, data storage and refresh methods, the rise of cloud computing, and questions about what and how much data should be saved. The sub-sections of the chapter address topics relevant to the science, engineering, and standards used for state-of-the-art operational and experimental systems.
Advances in natural language processing.
Hirschberg, Julia; Manning, Christopher D
2015-07-17
Natural language processing employs computational techniques for the purpose of learning, understanding, and producing human language content. Early computational approaches to language research focused on automating the analysis of the linguistic structure of language and developing basic technologies such as machine translation, speech recognition, and speech synthesis. Today's researchers refine and make use of such tools in real-world applications, creating spoken dialogue systems and speech-to-speech translation engines, mining social media for information about health or finance, and identifying sentiment and emotion toward products and services. We describe successes and challenges in this rapidly advancing area. Copyright © 2015, American Association for the Advancement of Science.
Clock Agreement Among Parallel Supercomputer Nodes
Jones, Terry R.; Koenig, Gregory A.
2014-04-30
This dataset presents measurements that quantify the clock synchronization (time-agreement) characteristics among several high performance computers, including the world's most powerful machine for open science, the U.S. Department of Energy's Titan machine sited at Oak Ridge National Laboratory. These ultra-fast machines derive much of their computational capability from extreme node counts (over 18,000 nodes in the case of the Titan machine). Time-agreement is commonly relied upon by parallel programming applications and tools, distributed programming applications and tools, and system software. Our time-agreement measurements detail the degree of time variance between nodes and how that variance changes over time. The dataset includes the empirical measurements and the accompanying spreadsheets.
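As a rough sketch of how such node-to-node clock offsets are commonly estimated, and not the dataset's actual collection code, the snippet below has rank 0 ping each peer, read back the peer's clock, and assume a symmetric network delay; mpi4py is used only for convenience, and every name and parameter is an assumption.

from mpi4py import MPI
import time

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    for peer in range(1, comm.Get_size()):
        t0 = time.time()                     # local clock at ping
        comm.send(None, dest=peer)           # ping the peer
        remote = comm.recv(source=peer)      # peer's clock reading
        t1 = time.time()                     # local clock at reply
        offset = remote - (t0 + t1) / 2.0    # assumes symmetric network delay
        print("node %d: offset %.1f us, round trip %.1f us"
              % (peer, offset * 1e6, (t1 - t0) * 1e6))
else:
    comm.recv(source=0)                      # wait for the ping
    comm.send(time.time(), dest=0)           # reply with the local clock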
A Study of Computer Center Management
1988-06-01
... the United States and the rest of the western world and do not take into consideration the various economic and cultural factors in developing countries. This thesis seeks to present a number of new techniques in ...
Developing a World View for Science Education: A Message from the NSTA President
ERIC Educational Resources Information Center
Padilla, Michael
2005-01-01
This article features the message from the president of National Science Teachers Association. With the theme, "Developing a World View for Science Education," the president calls for science teachers to join in developing a world view for science education and nurturing NSTA members into thinking not just with a local, regional, or national…
Fbis report. Science and technology: China, October 18, 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-10-18
Partial Contents: Nanomaterials Fabrication, Applications Research Advances Noted; CAST Announces World's First Space-Grown Large-Diameter GaAs Monocrystal; Assay of Antiviral Activity of Antisense Phosphorothioate Oligodeoxynucleotide Against Dengue Virus; Expression and Antigenicity of Chimeric Proteins of Cholera Toxin B Subunit With Hepatitis C Virus; CNCOFIEC Signs Agreement With IBM for New Intelligent Building; Latest Reports on Optical Computing, Memory; BIDC To Introduce S3 Company's Multimedia Accelerator Chipset; Virtual Private PCN Ring Network Based on ATM VP Cross-Connection; Beijing Gets Nation's First Frame Relay Network; Situation of Power Industry Development and International Cooperation; Diagrams of China's Nuclear Waste Containment Vessels; Chinese-Developed Containment Vessel Material Reaches World Standards; Second Fuel Elements for Qinshan Plant Passes Inspection; and Geothermal Deep-Well Electric Pump Technology Developed.
CANFAR + Skytree: Mining Massive Datasets as an Essential Part of the Future of Astronomy
NASA Astrophysics Data System (ADS)
Ball, Nicholas M.
2013-01-01
The future study of large astronomical datasets, consisting of hundreds of millions to billions of objects, will be dominated by large computing resources, and by analysis tools of the necessary scalability and sophistication to extract useful information. Significant effort will be required to fulfil their potential as a provider of the next generation of science results. To-date, computing systems have allowed either sophisticated analysis of small datasets, e.g., most astronomy software, or simple analysis of large datasets, e.g., database queries. At the Canadian Astronomy Data Centre, we have combined our cloud computing system, the Canadian Advanced Network for Astronomical Research (CANFAR), with the world's most advanced machine learning software, Skytree, to create the world's first cloud computing system for data mining in astronomy. This allows the full sophistication of the huge fields of data mining and machine learning to be applied to the hundreds of millions of objects that make up current large datasets. CANFAR works by utilizing virtual machines, which appear to the user as equivalent to a desktop. Each machine is replicated as desired to perform large-scale parallel processing. Such an arrangement carries far more flexibility than other cloud systems, because it enables the user to immediately install and run the same code that they already utilize for science on their desktop. We demonstrate the utility of the CANFAR + Skytree system by showing science results obtained, including assigning photometric redshifts with full probability density functions (PDFs) to a catalog of approximately 133 million galaxies from the MegaPipe reductions of the Canada-France-Hawaii Telescope Legacy Wide and Deep surveys. Each PDF is produced nonparametrically from 100 instances of the photometric parameters for each galaxy, generated by perturbing within the errors on the measurements. Hence, we produce, store, and assign redshifts to, a catalog of over 13 billion object instances. This catalog is comparable in size to those expected from next-generation surveys, such as Large Synoptic Survey Telescope. The CANFAR+Skytree system is open for use by any interested member of the astronomical community.
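A minimal sketch of the perturbation approach described above, and not the CANFAR + Skytree pipeline itself: each galaxy's photometric measurements are resampled within their quoted errors, a predictor estimates a redshift for every instance, and a histogram of those estimates serves as that galaxy's redshift PDF. The stand-in predictor, band values, and binning below are placeholders.

import numpy as np

def redshift_pdf(mags, mag_errs, predict, n_instances=100, bins=np.linspace(0.0, 2.0, 41)):
    """Monte Carlo redshift PDF for one galaxy: perturb, predict, histogram."""
    rng = np.random.default_rng()
    # Perturb the photometric parameters within their Gaussian errors.
    instances = rng.normal(loc=mags, scale=mag_errs, size=(n_instances, len(mags)))
    z_samples = np.array([predict(x) for x in instances])
    pdf, _ = np.histogram(z_samples, bins=bins, density=True)
    return pdf

# Stand-in predictor; a real pipeline would use a trained machine-learning model.
fake_predict = lambda m: float(np.sum(m)) * 0.01 % 2.0
mags = np.array([24.1, 23.5, 22.9, 22.7, 22.6])   # hypothetical u, g, r, i, z magnitudes
errs = np.array([0.30, 0.10, 0.08, 0.08, 0.10])
print(redshift_pdf(mags, errs, fake_predict))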
NASA Astrophysics Data System (ADS)
Stone, S.; Parker, M. S.; Howe, B.; Lazowska, E.
2015-12-01
Rapid advances in technology are transforming nearly every field from "data-poor" to "data-rich." The ability to extract knowledge from this abundance of data is the cornerstone of 21st century discovery. At the University of Washington eScience Institute, our mission is to engage researchers across disciplines in developing and applying advanced computational methods and tools to real world problems in data-intensive discovery. Our research team consists of individuals with diverse backgrounds in domain sciences such as astronomy, oceanography and geology, with complementary expertise in advanced statistical and computational techniques such as data management, visualization, and machine learning. Two key elements are necessary to foster careers in data science: individuals with cross-disciplinary training in both method and domain sciences, and career paths emphasizing alternative metrics for advancement. We see persistent and deep-rooted challenges for the career paths of people whose skills, activities and work patterns don't fit neatly into the traditional roles and success metrics of academia. To address these challenges the eScience Institute has developed training programs and established new career opportunities for data-intensive research in academia. Our graduate students and post-docs have mentors in both a methodology and an application field. They also participate in coursework and tutorials to advance technical skill and foster community. Professional Data Scientist positions were created to support research independence while encouraging the development and adoption of domain-specific tools and techniques. The eScience Institute also supports the appointment of faculty who are innovators in developing and applying data science methodologies to advance their field of discovery. Our ultimate goal is to create a supportive environment for data science in academia and to establish global recognition for data-intensive discovery across all fields.
Innovation in Science Education - World-Wide.
ERIC Educational Resources Information Center
Baez, Albert V.
The purpose of this book is to promote improvements in science education, world-wide, but particularly in developing countries. It is addressed to those in positions to make effective contributions to the improvement of science education. The world-wide role of science education, the goals of innovative activities, past experience in efforts to…
ERIC Educational Resources Information Center
Gossard, Paula Rae
2009-01-01
Authors of recent science reform documents promote the goal of scientific literacy for all Americans (American Association for the Advancement of Science, 1989, 1993). Some students, however, feel apprehensive about learning science due to perceptions that science is antagonistic to their world views (Alters, 2005; Esbenshade, 1993). This study…
Gait biomechanics in the era of data science.
Ferber, Reed; Osis, Sean T; Hicks, Jennifer L; Delp, Scott L
2016-12-08
Data science has transformed fields such as computer vision and economics. The ability of modern data science methods to extract insights from large, complex, heterogeneous, and noisy datasets is beginning to provide a powerful complement to the traditional approaches of experimental motion capture and biomechanical modeling. The purpose of this article is to provide a perspective on how data science methods can be incorporated into our field to advance our understanding of gait biomechanics and improve treatment planning procedures. We provide examples of how data science approaches have been applied to biomechanical data. We then discuss the challenges that remain for effectively using data science approaches in clinical gait analysis and gait biomechanics research, including the need for new tools, better infrastructure and incentives for sharing data, and education across the disciplines of biomechanics and data science. By addressing these challenges, we can revolutionize treatment planning and biomechanics research by capitalizing on the wealth of knowledge gained by gait researchers over the past decades and the vast, but often siloed, data that are collected in clinical and research laboratories around the world. Copyright © 2016 Elsevier Ltd. All rights reserved.
A New Approach to A Science Magnet School - Classroom and Museum Integration
NASA Astrophysics Data System (ADS)
Franklin, Samuel
2009-03-01
The Pittsburgh Science & Technology Academy is a place where any student with an interest in science, technology, engineering or math can develop skills for a career in life sciences, environmental sciences, computing, or engineering. The Academy isn't just a new school. It's a new way to think about school. The curriculum is tailored to students who have a passion for science, technology, engineering or math. The environment is one of extraordinary support for students, parents, and faculty. And the Academy exists to provide opportunities, every day, for students to Dream. Discover. Design. That is, Academy students set goals and generate ideas, research and discover answers, and design real solutions for the kinds of real-world problems that they'll face after graduation. The Academy prepares students for their future, whether they go on to higher education or immediate employment. This talk will explain the unique features of the Pittsburgh Science & Technology Academy, lessons learned from its two-year design process, and the role that the Carnegie Museums have played and will continue to play as the school grows.
NASA Astrophysics Data System (ADS)
Chen, Jean Chi-Jen
Physics is fundamental for science, engineering, medicine, and for understanding many phenomena encountered in people's daily lives. The purpose of this study was to investigate the relationships between student success in college-level introductory physics courses and various educational and background characteristics. The primary variables of this study were gender, high school mathematics and science preparation, preference and perceptions of learning physics, and performance in introductory physics courses. Demographic characteristics considered were age, student grade level, parents' occupation and level of education, high school senior grade point average, and educational goals. A Survey of Learning Preference and Perceptions was developed to collect the information for this study. A total of 267 subjects enrolled in six introductory physics courses, four algebra-based and two calculus-based, participated in the study conducted during Spring Semester 2002. The findings from the algebra-based physics courses indicated that participants' educational goals, high school senior GPA, father's educational level, mother's educational level, and mother's occupation in the area of science, engineering, or computer technology were positively related to performance, while participant age was negatively related. Biology preparation, mathematics preparation, and additional mathematics and science preparation in high school were also positively related to performance. The relationships between the primary variables and performance in calculus-based physics courses were limited to high school senior year GPA and high school physics preparation. Findings from all six courses indicated that participants' educational goals, high school senior GPA, father's educational level, mother's occupation in the area of science, engineering, or computer technology, high school preparation in mathematics and biology, and the completion of additional mathematics and science courses were positively related to performance. No significant performance differences were found between male and female students. However, there were significant gender differences in physics learning perceptions. Female participants tended to try to understand physics materials and relate the physics problems to real-world situations, while their male counterparts tended to rely on rote learning and equation application. This study found that participants performed better when they tried to understand the physics material and relate physics problems to real-world situations. Participants who relied on rote learning did not perform well.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schlicher, Bob G; Kulesz, James J; Abercrombie, Robert K
A principal tenet of the scientific method is that experiments must be repeatable, relying on ceteris paribus (i.e., all other things being equal). As a scientific community involved in data sciences, we must investigate ways to establish an environment where experiments can be repeated. We can no longer merely allude to where the data comes from; we must add rigor to the data collection and management process from which our analysis is conducted. This paper describes a computing environment to support repeatable scientific big data experimentation on world-wide scientific literature, and recommends a system housed at Oak Ridge National Laboratory in order to provide value to investigators from government agencies, academic institutions, and industry entities. The described computing environment also adheres to the recently instituted digital data management plan mandated by multiple US government agencies, which involves all stages of the digital data life cycle including capture, analysis, sharing, and preservation. It particularly focuses on the sharing and preservation of digital research data. The details of this computing environment are explained within the context of cloud services by the three-layer classification of Software as a Service, Platform as a Service, and Infrastructure as a Service.
Toward a Data Scalable Solution for Facilitating Discovery of Science Resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Jesse R.; Castellana, Vito G.; Morari, Alessandro
Science is increasingly motivated by the need to process larger quantities of data. It is facing severe challenges in data collection, management, and processing, so much so that the computational demands of “data scaling” are competing with, and in many fields surpassing, the traditional objective of decreasing processing time. Example domains with large datasets include astronomy, biology, genomics, climate/weather, and material sciences. This paper presents a real-world use case in which we wish to answer queries provided by domain scientists in order to facilitate discovery of relevant science resources. The problem is that the metadata for these science resources is very large and is growing quickly, rapidly increasing the need for a data scaling solution. We propose a system – SGEM – designed for answering graph-based queries over large datasets on cluster architectures, and we report performance results for queries on the current RDESC dataset of nearly 1.4 billion triples, and on the well-known BSBM SPARQL query benchmark.
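For readers unfamiliar with the query model, the toy below shows the kind of graph-based query the paper targets: resource metadata held as RDF triples and searched with SPARQL. It uses rdflib on a three-triple in-memory graph purely for illustration; SGEM answers such queries at billion-triple scale on cluster hardware, and the namespace and property names here are invented.

from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")        # hypothetical vocabulary for the sketch
g = Graph()
g.add((EX.dataset1, EX.theme, Literal("climate")))
g.add((EX.dataset1, EX.mediaType, Literal("NetCDF")))
g.add((EX.dataset2, EX.theme, Literal("genomics")))

# "Find every resource whose theme is climate", expressed in SPARQL.
results = g.query("""
    PREFIX ex: <http://example.org/>
    SELECT ?resource WHERE { ?resource ex:theme "climate" . }
""")
for row in results:
    print(row.resource)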
ACToR - Aggregated Computational Toxicology Resource
ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food & Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high throughput environmental chemical screening and prioritization program called ToxCast(TM).
Know Your Discipline: Teaching the Philosophy of Computer Science
ERIC Educational Resources Information Center
Tedre, Matti
2007-01-01
The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…
Third World Science: Development Education through Science Teaching.
ERIC Educational Resources Information Center
Williams, Iolo Wyn
Third World Science (TWS) materials were developed to add a multicultural element to the existing science curriculum of 11-16-year-old students. TWS attempts to develop an appreciation of the: (1) boundless fascination of the natural world; (2) knowledge, skills, and expertise possessed by ordinary men and women everywhere; (3) application of…
Electronic access to ONREUR/ONRAISIA S and T reports
NASA Technical Reports Server (NTRS)
Mccluskey, William
1994-01-01
The Office of Naval Research maintains two foreign field offices in London, England and in Tokyo, Japan. These offices survey world-wide findings, trends and achievements in science and technology. These offices maintain liaison between U.S. Navy and foreign scientific research and development organizations conducting programs of naval interest. Expert personnel survey foreign scientific and technical activities, identify new directions and progress of potential interest, and report their findings. Report topics cover a broad range of basic scientific thrusts in mathematics, physics, chemistry, computer science, and oceanography, as well as advances in technologies such as electronics, materials, optics, and robotics. These unclassified reports will be made available via the Internet in 1995, replacing hard-copy publication.
Optimal hash arrangement of tentacles in jellyfish
NASA Astrophysics Data System (ADS)
Okabe, Takuya; Yoshimura, Jin
2016-06-01
At first glance, the trailing tentacles of a jellyfish appear to be randomly arranged. However, close examination of medusae has revealed that the arrangement and developmental order of the tentacles obey a mathematical rule. Here, we show that medusa jellyfish adopt the best strategy to achieve the most uniform distribution of a variable number of tentacles. The observed order of tentacles is a real-world example of an optimal hashing algorithm known as Fibonacci hashing in computer science.
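To make the computer science connection concrete, here is a minimal sketch of Fibonacci (multiplicative) hashing, the scheme the tentacle order is said to realize: each new item is placed by taking the fractional part of its index times the golden ratio, which keeps placements nearly uniform at every stage of growth. The table size and loop are illustrative choices, not taken from the paper.

GOLDEN = (5 ** 0.5 - 1) / 2          # fractional part of the golden ratio, ~0.618

def fibonacci_slot(key, table_size):
    """Multiplicative (Fibonacci) hashing: spread keys using the golden ratio."""
    return int(((key * GOLDEN) % 1.0) * table_size)

# Successive keys land nearly uniformly, whatever the table size --
# the same property the tentacle arrangement achieves around the bell margin.
if __name__ == "__main__":
    for k in range(1, 9):
        print(k, fibonacci_slot(k, 16))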
EOSDIS: The Ultimate Earth Science Data Source for Research and Education
NASA Astrophysics Data System (ADS)
Agbu, P. A.; Chang, C.; Corprew, F. E.
2002-12-01
Today, there is compelling scientific evidence that human activities have attained the magnitude of a geological force and are speeding up the rates of global change. For example, carbon dioxide levels have risen 30 percent since the industrial revolution, and about 40 percent of the world's land surface has been transformed by humans. Space-based Earth observing platforms are the only feasible way to assemble the long-term information needed to construct accurate computer models for forecasting the causes and effects of climate change. Consequently, NASA's Earth Observing System (EOS) has begun an international study of planet Earth that comprises three main components: 1) a series of satellites specially designed to study the complexities of global change; 2) an advanced computer network for processing, storing, and distributing data (the EOS Data and Information System); and 3) teams of scientists all over the world who will study the data. Recent launches of Landsat 7 on April 15, 1999, to continue the flow of global change information to users worldwide, and of Terra, the EOS flagship, on December 18, 1999, to monitor climate and environmental change on Earth over the next 15 years, have tremendously expanded the sources of valuable Earth science data for research and education. These data, and others from focused campaigns such as FIFE and BOREAS designed to study surface-atmosphere interactions, will be presented.
Engineering brain-computer interfaces: past, present and future.
Hughes, M A
2014-06-01
Electricity governs the function of both nervous systems and computers. Whilst ions move in polar fluids to depolarize neuronal membranes, electrons move in the solid-state lattices of microelectronic semiconductors. Joining these two systems together, to create an iono-electric brain-computer interface, is an immense challenge. However, such interfaces offer (and in select clinical contexts have already delivered) a method of overcoming disability caused by neurological or musculoskeletal pathology. To fulfill their theoretical promise, several specific challenges demand consideration. Rate-limiting steps cover a diverse range of disciplines including microelectronics, neuro-informatics, engineering, and materials science. As those who work at the tangible interface between brain and outside world, neurosurgeons are well placed to contribute to, and inform, this cutting edge area of translational research. This article explores the historical background, status quo, and future of brain-computer interfaces; and outlines the challenges to progress and opportunities available to the clinical neurosciences community.
An introduction to computer forensics.
Furneaux, Nick
2006-07-01
This paper provides an introduction to the discipline of computer forensics. With computers being involved in an increasing number, and type, of crimes, the trace data left on electronic media can play a vital part in the legal process. To ensure acceptance by the courts, accepted processes and procedures have to be adopted and demonstrated which are not dissimilar to the issues surrounding traditional forensic investigations. This paper provides a straightforward overview of the three steps involved in the examination of digital media: acquisition of data, investigation of evidence, and reporting and presentation of evidence. Although many of the traditional readers of Medicine, Science and the Law are those involved in the biological aspects of forensics, I believe that both disciplines can learn from each other, with electronic evidence being more readily sought and considered by the legal community and the long, tried and tested scientific methods of the forensic community being shared and adopted by the computer forensic world.
Understanding Islamist political violence through computational social simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G
Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
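As a generic illustration of the agent-based technique mentioned above, and not the authors' model, the toy below gives each agent a grievance level and a risk aversion, and activates an agent when grievance net of perceived risk crosses a threshold; the parameter values and the simple risk rule are assumptions made for the sketch.

import random

class Agent:
    def __init__(self):
        self.grievance = random.random()      # how aggrieved the agent is
        self.risk_aversion = random.random()  # how much risk deters the agent
        self.active = False

def step(agents, threshold=0.1, safety_in_numbers=0.002):
    """One update: activation depends on grievance net of perceived risk."""
    n_active = sum(a.active for a in agents)
    for a in agents:
        perceived_risk = a.risk_aversion * max(0.0, 1.0 - safety_in_numbers * n_active)
        a.active = (a.grievance - perceived_risk) > threshold

random.seed(1)
agents = [Agent() for _ in range(500)]
for t in range(20):
    step(agents)
    print(t, sum(a.active for a in agents))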
Dan Goldin Presentation: Pathway to the Future
NASA Technical Reports Server (NTRS)
1999-01-01
In the "Path to the Future" presentation held at NASA's Langley Center on March 31, 1999, NASA's Administrator Daniel S. Goldin outlined the future direction and strategies of NASA in relation to the general space exploration enterprise. NASA's Vision, Future System Characteristics, Evolutions of Engineering, and Revolutionary Changes are the four main topics of the presentation. In part one, the Administrator talks in detail about NASA's vision in relation to the NASA Strategic Activities that are Space Science, Earth Science, Human Exploration, and Aeronautics & Space Transportation. Topics discussed in this section include: space science for the 21st century, flying in mars atmosphere (mars plane), exploring new worlds, interplanetary internets, earth observation and measurements, distributed information-system-in-the-sky, science enabling understanding and application, space station, microgravity, science and exploration strategies, human mars mission, advance space transportation program, general aviation revitalization, and reusable launch vehicles. In part two, he briefly talks about the future system characteristics. He discusses major system characteristics like resiliencey, self-sufficiency, high distribution, ultra-efficiency, and autonomy and the necessity to overcome any distance, time, and extreme environment barriers. Part three of Mr. Goldin's talk deals with engineering evolution, mainly evolution in the Computer Aided Design (CAD)/Computer Aided Engineering (CAE) systems. These systems include computer aided drafting, computerized solid models, virtual product development (VPD) systems, networked VPD systems, and knowledge enriched networked VPD systems. In part four, the last part, the Administrator talks about the need for revolutionary changes in communication and networking areas of a system. According to the administrator, the four major areas that need cultural changes in the creativity process are human-centered computing, an infrastructure for distributed collaboration, rapid synthesis and simulation tools, and life-cycle integration and validation. Mr. Goldin concludes his presentation with the following maxim "Collaborate, Integrate, Innovate or Stagnate and Evaporate." He also answers some questions after the presentation.
Women's decision to major in STEM fields
NASA Astrophysics Data System (ADS)
Conklin, Stephanie
This paper explores the lived experiences of high school female students who choose to enter into STEM fields, and describes the influencing factors which steered these women towards majors in computer science, engineering and biology. Utilizing phenomenological methodology, this study seeks to understand the essence of women's decisions to enter into STEM fields and further describe how the decision-making process varies for women in high female enrollment fields, like biology, as compared with low enrollment fields, like computer science and engineering. Using Bloom's 3-Stage Theory, this study analyzes how relationships, experiences and barriers influenced women towards, and possibly away from, STEM fields. An analysis of women's experiences highlights that support of family, sustained experience in a STEM program during high school, and the presence of an influential teacher were all salient factors in steering women towards STEM fields. Participants explained that influential teachers worked individually with them, modified and extended assignments, and steered them towards coursework and experiences. This study also identifies factors, like guidance counselors as well as personal challenges, which inhibited participants' paths to STEM fields. Further, through analyzing all six participants' experiences, it is clear that a linear model like Bloom's 3-Stage Model, with its limited ability to include potential barriers, could not capture the essence of each participant's decision-making process. Therefore, a revised model with no linear progression, which allows for emerging factors like personal challenges, has been proposed; this model focuses on how interest in STEM fields begins to develop, is honed, and is then mastered. This study also sought to identify key differences in the paths of female students pursuing different majors. The findings of this study suggest that the path to computer science and engineering is limited. Computer science majors faced few, if any, challenges, hoped to use computers as a tool to innovate, and participated in the same computer science program. For female engineering students, the essence of their experience focused on interaction at a young age with an expert in an engineering-related field as well as a strong desire to help solve world problems using engineering. These participants were able to clearly articulate their future careers. In contrast, biology majors faced more challenges and were undecided about their future career goals. These results suggest that a longitudinal study focused on women pursuing engineering and computer science fields is warranted; this would allow these findings to be substantiated and the revised theoretical model to be refined.
A review and exploration of sociotechnical ergonomics.
Dirkse van Schalkwyk, Riaan; Steenkamp, Rigard J
2017-09-01
A holistic review of ergonomic history shows that the science remains important for general occupational health and safety (OSH), the broad society, culture, politics and the design of everyday things. The science offers an unconventional and multifaceted viewpoint, exploring ergonomics from a social, corporate and OSH perspective. Ergonomic solutions from this mindset may redefine the science, and it will change with companies that change within this socially hyper-connected world. Authentic corporate social responsibility will counter 'misleadership' by not approaching ergonomics as an afterthought. The review concludes that ergonomics will be stronger with social respect and ergonomic thinking based on the optimisation of anthropometric data, digital human models, computer-aided tools, self-empowerment, job enrichment, work enlargement, physiology, industrial psychology, cybernetic ergonomics, operations design, ergonomic-friendly process technologies, ergonomic empowerment, behaviour-based safety, outcome-based employee wellness and fatigue risk management solutions, to mention a few.
NASA Astrophysics Data System (ADS)
Takahashi, N.; Agata, H.; Maeda, K.; Okyudo, M.; Yamazaki, Y.
A total solar eclipse was observed on 2001 June 21 in Angola, Zambia, and Zimbabwe in Africa. To promote science education using the solar eclipse as an educational project, whole-disk and enlarged images of the Sun, showing the progress of the eclipse and conditions in the observation area, were broadcast to the world through the Internet (Live Eclipse). These images were distributed to four primary schools in Hiroshima and to the Science and Technology Museum in Tokyo, where a remote lecture was given through computers. To assess the effectiveness of the lecture, the learning effect on the participating children was examined twice, before and after the remote lecture on the solar eclipse.
Hands-On Astrophysics: Variable Stars in Math, Science, and Computer Education
NASA Astrophysics Data System (ADS)
Mattei, J. A.; Percy, J. R.
1999-12-01
Hands-On Astrophysics (HOA): Variable Stars in Math, Science, and Computer Education, is a project recently developed by the American Association of Variable Star Observers (AAVSO) with funds from the National Science Foundation. HOA uses the unique methods and the international database of the AAVSO to develop and integrate students' math and science skills through variable star observation and analysis. It can provide an understanding of basic astronomy concepts, as well as interdisciplinary connections. Most of all, it motivates the user by exposing them to the excitement of doing real science with real data. Project materials include: a database of 600,000 variable star observations; VSTAR (a data plotting and analysis program), and other user friendly software; 31 slides and 14 prints of five constellations; 45 variable star finder charts; an instructional videotape in three 15-minute segments; and a 560-page student's and teacher's manual. These materials support the National Standards for Science and Math education by directly involving the students in the scientific process. Hands-On Astrophysics is designed to be flexible. It is organized so that it can be used at many levels, in many contexts: for classroom use from high school to college level, or for individual projects. In addition, communication and support can be found through the AAVSO home page on the World Wide Web: http://www.aavso.org. The HOA materials can be ordered through this web site or from the AAVSO, 25 Birch Street Cambridge, MA 02138, USA. We gratefully acknowledge the education grant ESI-9154091 from the National Science Foundation which funded the development of this project.
New modalities for scientific engagement in Africa - the case for computational physics
NASA Astrophysics Data System (ADS)
Chetty, N.
2011-09-01
Computational physics as a mode of studying the mathematical and physical sciences has grown world-wide over the past two decades, but this trend is yet to fully develop in Africa. The essential ingredients are there for this to happen: increasing internet connectivity, cheaper computing resources and the widespread availability of open source and freeware. The missing ingredients centre on intellectual isolation and the low levels of quality international collaborations. Low level of funding for research from local governments remains a critical issue. This paper gives a motivation for the importance of developing computational physics at the university undergraduate level, graduate level and research levels and gives suggestions on how this may be achieved within the African context. It is argued that students develop a more intuitive feel for the mathematical and physical sciences, that they learn useful, transferable skills that make our graduates well-sought after in the industrial and commercial environments, and that such graduates are better prepared to tackle research problems at the masters and doctoral levels. At the research level, the case of the African School Series on Electronic Structure Methods and Applications (ASESMA) is presented as a new multi-national modality for engaging with African scientists. There are many novel aspects to this School series, which are discussed.
Mitchell, Elizabeth; Sullivan, Frank
2001-01-01
Objectives: To appraise findings from studies examining the impact of computers on primary care consultations. Design: Systematic review of world literature from 1980 to 1997. Data sources: 5475 references were identified from electronic databases (Medline, Science Citation Index, Social Sciences Citation Index, Index of Scientific and Technical Proceedings, Embase, OCLC FirstSearch Proceedings), bibliographies, books, identified articles, and by authors active in the field. 1892 eligible abstracts were independently rated, and 89 studies met the inclusion criteria. Main outcome measures: Effect on doctors' performance and patient outcomes; attitudes towards computerisation. Results: 61 studies examined effects of computers on practitioners' performance, 17 evaluated their impact on patient outcome, and 20 studied practitioners' or patients' attitudes. Computer use during consultations lengthened the consultation. Reminder systems for preventive tasks and disease management improved process rates, although some returned to pre-intervention levels when reminders were stopped. Use of computers for issuing prescriptions increased prescribing of generic drugs, and use of computers for test ordering led to cost savings and fewer unnecessary tests. There were no negative effects on those patient outcomes evaluated. Doctors and patients were generally positive about use of computers, but issues of concern included their impact on privacy, the doctor-patient relationship, cost, time, and training needs. Conclusions: Primary care computing systems can improve practitioner performance, particularly for health promotion interventions. This may be at the expense of patient initiated activities, making many practitioners suspicious of the negative impact on relationships with patients. There remains a dearth of evidence evaluating effects on patient outcomes. PMID: 11157532
Exploring virtual worlds with head-mounted displays
NASA Astrophysics Data System (ADS)
Chung, James C.; Harris, Mark R.; Brooks, F. P.; Fuchs, Henry; Kelley, Michael T.
1989-02-01
Research has been conducted in the use of simple head mounted displays in real world applications. Such units provide the user with non-holographic true 3-D information, since the kinetic depth effect, stereoscopy, and other visual cues combine to immerse the user in a virtual world which behaves like the real world in some respects. UNC's head mounted display was built inexpensively from commercially available off-the-shelf components. Tracking of the user's head position and orientation is performed by a Polhemus Navigation Sciences' 3SPACE tracker. The host computer uses the tracking information to generate updated images corresponding to the user's new left eye and right eye views. The images are broadcast to two liquid crystal television screens (220x320 pixels) mounted on a horizontal shelf at the user's forehead. The user views these color screens through half-silvered mirrors, enabling the computer generated image to be superimposed upon the user's real physical environment. The head mounted display was incorporated into existing molecular and architectural applications being developed at UNC. In molecular structure studies, chemists are presented with a room sized molecule with which they can interact in a manner more intuitive than that provided by conventional 2-D displays and dial boxes. Walking around and through the large molecule may provide quicker understanding of its structure, and such problems as drug enzyme docking may be approached with greater insight.
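As a rough sketch of the per-frame geometry implied above, and not UNC's actual code, the snippet derives left- and right-eye viewpoints from the tracked head position and orientation by offsetting half an assumed interpupillary distance along the head's local horizontal axis; the images for the two screens would then be rendered from these two viewpoints.

import numpy as np

IPD = 0.064   # assumed interpupillary distance in metres

def eye_positions(head_pos, head_rot):
    """head_pos: length-3 array; head_rot: 3x3 rotation matrix from the tracker."""
    right_axis = head_rot @ np.array([1.0, 0.0, 0.0])   # head's local horizontal axis
    return head_pos - 0.5 * IPD * right_axis, head_pos + 0.5 * IPD * right_axis

# Example: head at the origin, yawed 90 degrees about the vertical axis.
yaw = np.radians(90.0)
rot = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                [0.0, 1.0, 0.0],
                [-np.sin(yaw), 0.0, np.cos(yaw)]])
left, right = eye_positions(np.zeros(3), rot)
print(left, right)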
The COSPAR Capacity Building Programme
NASA Astrophysics Data System (ADS)
Gabriel, C.
2016-08-01
The provision of scientific data archives and analysis tools by diverse institutions around the world represents a unique opportunity for the development of scientific activities. An example of this is the European Space Agency's space observatory XMM-Newton, with its Science Operations Centre at the European Space Astronomy Centre near Madrid, Spain. Through its science archive and web pages it provides not only the raw and processed data from the mission, but also analysis tools and full documentation, greatly helping their dissemination and use. These data and tools, freely accessible to anyone in the world, are the practical elements around which the COSPAR (COmmittee on SPAce Research) Capacity Building Workshops have been conceived and developed, and held for a decade and a half in developing countries. The Programme started with X-ray workshops, but has since been broadened to the most diverse space science areas. The workshops help to develop science at the highest level in those countries, in a lasting and sustainable way, with a minimal investment (a computer plus a moderate Internet connection). In this paper we discuss the basis, concepts, and achievements of the Capacity Building Programme. Two instances of the Programme have already taken place in Argentina, one devoted to X-ray astronomy and another to infrared astronomy. Several others have been organised for the Latin American region (Brazil, Uruguay and Mexico) with a large participation of young investigators from Argentina.
The (human) science of medical virtual learning environments.
Stone, Robert J
2011-01-27
The uptake of virtual simulation technologies in both military and civilian surgical contexts has been both slow and patchy. The failure of the virtual reality community in the 1990s and early 2000s to deliver affordable and accessible training systems stems not only from an obsessive quest to develop the 'ultimate' in so-called 'immersive' hardware solutions, from head-mounted displays to large-scale projection theatres, but also from a comprehensive lack of attention to the needs of the end users. While many still perceive the science of simulation to be defined by technological advances, such as computing power, specialized graphics hardware, advanced interactive controllers, displays and so on, the true science underpinning simulation--the science that helps to guarantee the transfer of skills from the simulated to the real--is that of human factors, a well-established discipline that focuses on the abilities and limitations of the end user when designing interactive systems, as opposed to the more commercially explicit components of technology. Based on three surgical simulation case studies, the importance of a human factors approach to the design of appropriate simulation content and interactive hardware for medical simulation is illustrated. The studies demonstrate that it is unnecessary to pursue real-world fidelity in all instances in order to achieve psychological fidelity--the degree to which the simulated tasks reproduce and foster knowledge, skills and behaviours that can be reliably transferred to real-world training applications.
With Great Measurements Come Great Results
NASA Astrophysics Data System (ADS)
Williams, Carl
Measurements are the foundation for science and modern life. Technologies we take for granted every day depend on them: cell phones, CAT scans, pharmaceuticals, even sports equipment. Metrology, or measurement science, determines what industry can make reliably and what it cannot. At the National Institute of Standards and Technology (NIST) we specialize in making world-class measurements that an incredibly wide range of industries use to continually improve their products - computer chips with nanoscale components, atomic clocks that you can hold in your hand, lasers for both super-strong welds and delicate eye surgeries. Think of the key technologies developed over the last 100 years: better measurements, standards, or analysis techniques played a role in making them possible. NIST works collaboratively with industry researchers on the advanced metrology for tomorrow's technologies. A new kilogram based on electromagnetic force, cars that weigh half as much but are just as strong, quantum computers, personalized medicine, single-atom devices - it's all happening in our labs now. This talk will focus on how metrology creates the future.
Changing how and what children learn in school with computer-based technologies.
Roschelle, J M; Pea, R D; Hoadley, C M; Gordin, D N; Means, B M
2000-01-01
Schools today face ever-increasing demands in their attempts to ensure that students are well equipped to enter the workforce and navigate a complex world. Research indicates that computer technology can help support learning, and that it is especially useful in developing the higher-order skills of critical thinking, analysis, and scientific inquiry. But the mere presence of computers in the classroom does not ensure their effective use. Some computer applications have been shown to be more successful than others, and many factors influence how well even the most promising applications are implemented. This article explores the various ways computer technology can be used to improve how and what children learn in the classroom. Several examples of computer-based applications are highlighted to illustrate ways technology can enhance how children learn by supporting four fundamental characteristics of learning: (1) active engagement, (2) participation in groups, (3) frequent interaction and feedback, and (4) connections to real-world contexts. Additional examples illustrate ways technology can expand what children learn by helping them to understand core concepts in subjects like math, science, and literacy. Research indicates, however, that the use of technology as an effective learning tool is more likely to take place when embedded in a broader education reform movement that includes improvements in teacher training, curriculum, student assessment, and a school's capacity for change. To help inform decisions about the future role of computers in the classroom, the authors conclude that further research is needed to identify the uses that most effectively support learning and the conditions required for successful implementation.
Laboratory Directed Research and Development Annual Report for 2009
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, Pamela J.
This report documents progress made on all LDRD-funded projects during fiscal year 2009. As a US Department of Energy (DOE) Office of Science (SC) national laboratory, Pacific Northwest National Laboratory (PNNL) has an enduring mission to bring molecular and environmental sciences and engineering strengths to bear on DOE missions and national needs. Its vision is to be recognized worldwide and valued nationally for leadership in accelerating the discovery and deployment of solutions to challenges in energy, national security, and the environment. To achieve this mission and vision, PNNL provides distinctive, world-leading science and technology in: (1) the design and scalable synthesis of materials and chemicals; (2) climate change science and emissions management; (3) efficient and secure electricity management from generation to end use; and (4) signature discovery and exploitation for threat detection and reduction. PNNL leadership also extends to operating EMSL: the Environmental Molecular Sciences Laboratory, a national scientific user facility dedicated to providing integrated experimental and computational resources for discovery and technological innovation in the environmental molecular sciences.
Global Systems Science: A New World View
NASA Technical Reports Server (NTRS)
Sneider, Cary; Golden, Richard; Barrett, Katharine
1999-01-01
Global systems science is a new field of study about the interactions between Earth's natural systems and human activities. The people who study global systems science draw on the methods and theories of many different fields, from chemistry and biology to economics and politics, in order to predict how today's actions are likely to affect the world of tomorrow - our world and our children's world.
NASA Astrophysics Data System (ADS)
Delello, Julie Anne
2009-12-01
In 1999, the Chinese Academy of Sciences realized that there was a need for a better public understanding of science. For the public to have better accessibility and comprehension of China's significance to the world, the Computer Network Information Center (CNIC), under the direction of the Chinese Academy of Sciences, combined resources from thousands of experts across the world to develop online science exhibits housed within the Virtual Science Museum of China. Through an analysis of historical documents, this descriptive dissertation presents a research project that explores a dimension of the development of the Giant Panda Exhibit. This study takes the reader on a journey, first to China and then to a classroom within the United States, in order to answer the following questions: (1) What is the process of the development of a virtual science exhibit; and, (2) What role do public audiences play in the design and implementation of virtual science museums? The creation of a virtual science museum exhibition is a process that is not completed with just the building and design, but must incorporate feedback from public audiences who utilize the exhibit. To meet the needs of the museum visitors, the designers at CNIC took a user-centered approach and solicited feedback from six survey groups. To design a museum that would facilitate a cultural exchange of scientific information, the CNIC looked at the following categories: visitor insights, the usability of the technology, the educational effectiveness of the museum exhibit, and the cultural nuances that existed between students in China and in the United States. The findings of this study illustrate that the objectives of museum designers may not necessarily reflect the needs of the visitors and confirm previous research studies which indicate that museum exhibits need a more constructivist approach that fully engages the visitor in an interactive, media-rich environment. Even though the world has moved forwards with digital technology, classroom instruction in both China and in the United States continues to reflect traditional teaching methods. Students were shown to have a lack of experience with the Internet in classrooms and difficulty in scientific comprehension when using the virtual science museum---showing a separation between classroom technology and learning. Students showed a greater interest level in learning science with technology through online gaming and rich multimedia suggesting that virtual science museums can be educationally valuable and support an alternative to traditional teaching methods if designed with the end user in mind.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langer, S; Rotman, D; Schwegler, E
The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflects the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high performance processors within a shared memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL. We recommend that the institution fully fund (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.
A qualitative study of technophobic students' reactions to a technology-rich college science course
NASA Astrophysics Data System (ADS)
Guttschow, Gena Lee
The use of technology in education has grown rapidly in the last 20 years. In fact, many of today's college students have had some sort of computer in their elementary school classrooms. One might think that this consistent exposure to computers would foster positive attitudes about computers, but this is not always the case. Currently, a substantial number of college students dislike interacting with technology. People who dislike interacting with technology are often referred to as "technophobic". Technophobic people have negative thoughts and feelings about technology, and they often have a desire to avoid interaction with technology. Technophobic students' negative feelings about technology have the potential to interfere with their learning when technology is utilized as a tool for instruction of school subjects. As computer use becomes prevalent, and in many instances mandatory, in education, the issue of technophobia increasingly needs to be understood and addressed. This is a qualitative study designed with the intent of gaining an understanding of the experiences of technophobic students who are required to use technology to learn science in a college class. Six developmental college students enrolled in a computer-based anatomy and physiology class were chosen to participate in the study based on their high technophobia scores. They were interviewed three times during the quarter and videotaped once. The interview data were transcribed, coded, and analyzed. The analysis resulted in six case studies describing each participant's experience and 11 themes representing overlapping areas in the participants' worlds of experience. A discussion of the themes, the meaning they hold for me as a science educator, and how they relate to the existing literature is presented. The participants' descriptions of their experiences showed that the technophobic students did use the computers and learned skills when they had to in order to complete assignments. It was also revealed that the technophobic participants' negative attitudes did not improve after learning computer skills. Lastly, based on the participants' experiences, it seems important to start a class with step-by-step computer training, teaching foundational computer skills, and to progress slowly towards autonomous computer exploration.
NASA Technical Reports Server (NTRS)
Brooks, Frederick P., Jr.
1991-01-01
The utility of virtual reality computer graphics in telepresence applications is not hard to grasp and promises to be great. When the virtual world is entirely synthetic, as opposed to real but remote, the utility is harder to establish. Vehicle simulators for aircraft, vessels, and motor vehicles are proving their worth every day. Entertainment applications such as Disney World's StarTours are technologically elegant, good fun, and economically viable. Nevertheless, some of us have no real desire to spend our lifework serving the entertainment craze of our sick culture; we want to see this exciting technology put to work in medicine and science. The topics covered include the following: testing a force display for scientific visualization -- molecular docking; and testing a head-mounted display for scientific and medical visualization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, D E; McGraw, J R
2006-02-02
Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that "computational science has become critical to scientific leadership, economic competitiveness, and national security". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "hands and feet" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.
War of Ontology Worlds: Mathematics, Computer Code, or Esperanto?
Rzhetsky, Andrey; Evans, James A.
2011-01-01
The use of structured knowledge representations—ontologies and terminologies—has become standard in biomedicine. Definitions of ontologies vary widely, as do the values and philosophies that underlie them. In seeking to make these views explicit, we conducted and summarized interviews with a dozen leading ontologists. Their views clustered into three broad perspectives that we summarize as mathematics, computer code, and Esperanto. Ontology as mathematics puts the ultimate premium on rigor and logic, symmetry and consistency of representation across scientific subfields, and the inclusion of only established, non-contradictory knowledge. Ontology as computer code focuses on utility and cultivates diversity, fitting ontologies to their purpose. Like computer languages C++, Prolog, and HTML, the code perspective holds that diverse applications warrant custom designed ontologies. Ontology as Esperanto focuses on facilitating cross-disciplinary communication, knowledge cross-referencing, and computation across datasets from diverse communities. We show how these views align with classical divides in science and suggest how a synthesis of their concerns could strengthen the next generation of biomedical ontologies. PMID:21980276
Pre-service Teachers Learn the Nature of Science in Simulated Worlds
NASA Astrophysics Data System (ADS)
Marshall, Jill
2007-10-01
Although the Texas Essential Knowledge and Skills include an understanding of the nature of science as an essential goal of every high school science course, few students report opportunities to explore essential characteristics of science in their previous classes. A simulated-world environment (Erickson, 2005) allows students to function as working scientists and discover these essential elements for themselves (i.e. that science is evidence-based and involves testable conjectures, that theories have limitations and are constantly being modified based on new discoveries to more closely reflect the natural world.) I will report on pre-service teachers' exploration of two simulated worlds and resulting changes in their descriptions of the nature of science. Erickson (2005). Simulating the Nature of Science. Presentation at the 2005 Summer AAPT Meeting, Salt Lake City, UT.
NASA Astrophysics Data System (ADS)
Khan, Aafaque; Sridhar, Apoorva
2012-07-01
The previous decade saw the emergence of the Internet in a new avatar popularly known as Web 2.0. After its inception, the Internet (also known as Web 1.0) remained centralized and proprietary-controlled; information was displayed in the form of static pages, and users could only browse through these pages connected via URLs (Uniform Resource Locators), links, and search engines. Web 2.0, on the other hand, has features and tools that allow users to engage in dialogue, interact, and contribute to the content on the World Wide Web. As a result, Social Media has become the most widely accepted medium of interactive and participative dialogue around the world. Social Media is not just limited to Social Networking; it extends from podcasts, webcasts, blogs, micro-blogs, wikis, and forums to crowdsourcing, cloud storage, cloud computing, and Voice over Internet Protocol. The world over, there is a rising trend of using Social Media for space education and outreach. Governments, space agencies, universities, industry, and organizations have realized the power of Social Media to communicate advances in space science and technology, and updates on space missions and their findings, to the common man as well as to researchers, scientists, and experts around the world. In this paper, the authors discuss the perspectives of young students and professionals in the space industry on various present and future possibilities of using Social Media in space outreach and citizen science, especially in India and other developing countries. The authors share a vision for developing Social Media platforms to communicate space science and technology, along with innovative ideas for participative citizen science projects for various space-based applications such as Earth observation and space science. Opinions of young students and professionals in the space industry from different parts of the world are collected and reflected through a comprehensive survey. In addition, a detailed study and review of existing projects such as Open NASA, Zooniverse, SETI, and Google Earth supports these perspectives. Further, the authors shed light on how developing countries can benefit from space outreach and citizen science through Social Media to connect with society. The paper concludes with innovative ideas derived from the survey and from discussions with these prospective space leaders, along with the authors' insights on future strategies for such approaches in India and other developing nations. Demographically, youth provide the largest user base for Social Media, and these young future space leaders are adept at using Social Media in their daily lives. Thus, it is important that their collective and shared opinion is presented to today's policymakers and leaders of space agencies and industry.
Neutron imaging data processing using the Mantid framework
NASA Astrophysics Data System (ADS)
Pouzols, Federico M.; Draper, Nicholas; Nagella, Sri; Yang, Erica; Sajid, Ahmed; Ross, Derek; Ritchie, Brian; Hill, John; Burca, Genoveva; Minniti, Triestino; Moreton-Smith, Christopher; Kockelmann, Winfried
2016-09-01
Several imaging instruments are currently being constructed at neutron sources around the world. The Mantid software project provides an extensible framework that supports high-performance computing for data manipulation, analysis and visualisation of scientific data. At ISIS, IMAT (Imaging and Materials Science & Engineering) will offer unique time-of-flight neutron imaging techniques which impose several software requirements to control the data reduction and analysis. Here we outline the extensions currently being added to Mantid to provide specific support for neutron imaging requirements.
Towards Test Driven Development for Computational Science with pFUnit
NASA Technical Reports Server (NTRS)
Rilee, Michael L.; Clune, Thomas L.
2014-01-01
Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation in finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification, and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
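As a minimal sketch of the fine-grained testing idea described above (our illustration in Python with a plain assert rather than pFUnit's Fortran annotations; the trapezoid helper and tolerance are assumptions), a small, well-isolated numerical routine can be checked against an analytic oracle within a finite-precision tolerance:

    import math

    def trapezoid(f, a, b, n):
        """Composite trapezoid rule on [a, b] with n uniform panels."""
        h = (b - a) / n
        total = 0.5 * (f(a) + f(b))
        for i in range(1, n):
            total += f(a + i * h)
        return h * total

    def test_trapezoid_against_analytic_oracle():
        # Oracle: the integral of sin(x) over [0, pi] is exactly 2.
        approx = trapezoid(math.sin, 0.0, math.pi, 1000)
        # Trapezoid error is O(h^2); with n = 1000 a 1e-5 tolerance is generous
        # yet still tight enough to catch an implementation bug.
        assert abs(approx - 2.0) < 1e-5

    if __name__ == "__main__":
        test_trapezoid_against_analytic_oracle()
        print("trapezoid oracle test passed")

Because the routine is fine-grained, its oracle is trivial and its rounding error is easy to bound, which is exactly the property the authors argue makes everyday TDD feasible for numerical code.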
NASA Astrophysics Data System (ADS)
Fairley, J. P.; Hinds, J. J.
2003-12-01
The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, which has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.
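As a hedged illustration of the first classroom example above (our own sketch, not the authors' animation code; the aquifer parameters are invented), the Theis drawdown of the water table around a pumping well can be computed with SciPy's exponential integral and redrawn for successive times to produce a classroom animation:

    import numpy as np
    from scipy.special import exp1  # E1(u), i.e., the Theis well function W(u)

    # Hypothetical aquifer and pumping parameters (illustrative only).
    Q = 500.0    # pumping rate, m^3/day
    T = 250.0    # transmissivity, m^2/day
    S = 2.0e-4   # storativity, dimensionless

    def theis_drawdown(r, t):
        """Drawdown s(r, t) in metres at radial distance r (m) and time t (days)."""
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    r = np.linspace(1.0, 500.0, 200)
    for t in (0.1, 1.0, 10.0):   # snapshots an animation would step through
        s = theis_drawdown(r, t)
        print(f"t = {t:5.1f} d: drawdown at r = 1 m is {s[0]:.2f} m")

Looping over time and redrawing the curve (for example with matplotlib's FuncAnimation) gives the kind of water-table animation described.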
Dental Care. Third World Science.
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
Natural Dyes. Third World Science.
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
Distillation. Third World Science.
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
Methane Digestors. Third World Science.
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
Clay Pots. Third World Science.
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
Energy Convertors. Third World Science.
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
Science Across the World in Teacher Training
ERIC Educational Resources Information Center
Schoen, Lida; Weishet, Egbert; Kennedy, Declan
2007-01-01
Science Across the World is an exchange programme between schools world-wide. It has two main components: existing resources for students (age 6-10) and a database of all participating schools. The programme has existed since 1990. It is carried out in partnership with the British Association of Science Education (ASE) and international…
Iron Smelting. Third World Science.
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
Fermentation. Third World Science.
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
Factors influencing exemplary science teachers' levels of computer use
NASA Astrophysics Data System (ADS)
Hakverdi, Meral
This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability at using computers. The teachers' use of computer-related applications/tools during class, and their personal self-efficacy, age, and gender, were highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and their gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science class.
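The multiple regression reported above could be run along the following lines; this is a hedged sketch using statsmodels on synthetic, purely illustrative data (the predictor names are assumptions, not the study's actual instrument), intended only to show how level of computer use would be regressed on self-efficacy, outcome expectancy, and teaching experience:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 92  # same order of magnitude as the study's usable responses

    # Synthetic predictors (illustrative only).
    self_efficacy = rng.normal(4.0, 0.6, n)        # e.g., mean of a Likert-scale score
    outcome_expectancy = rng.normal(3.5, 0.7, n)
    experience = rng.integers(5, 30, n).astype(float)

    # Synthetic response: level of computer use, loosely tied to self-efficacy.
    computer_use = 0.8 * self_efficacy + 0.1 * outcome_expectancy + rng.normal(0.0, 0.5, n)

    X = sm.add_constant(np.column_stack([self_efficacy, outcome_expectancy, experience]))
    model = sm.OLS(computer_use, X).fit()
    print(model.summary())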
An urban area minority outreach program for K-6 children in space science
NASA Astrophysics Data System (ADS)
Morris, P.; Garza, O.; Lindstrom, M.; Allen, J.; Wooten, J.; Sumners, C.; Obot, V.
The Houston area has minority populations with significant school dropout rates. This is similar to other major cities in the United States and elsewhere in the world where there are significant minority populations from rural areas. The student dropout rates are associated in many instances with the absence of educational support opportunities either from the school and/or from the family. This is exacerbated if the student has poor English language skills. To address this issue, a NASA minority university initiative enabled us to develop a broad-based outreach program that includes younger children and their parents at a primarily Hispanic inner city charter school. The program at the charter school was initiated by teaching computer skills to the older children, who in turn taught parents. The older children were subsequently asked to help teach a computer literacy class for mothers with 4-5 year old children. The computers initially intimidated the mothers as most had limited educational backgrounds and English language skills. To practice their newly acquired computer skills and learn about space science, the mothers and their children were asked to pick a space project and investigate it using their computer skills. The mothers and their children decided to learn about black holes. The project included designing space suits for their children so that they could travel through space and observe black holes from a closer proximity. The children and their mothers learned about computers and how to use them for educational purposes. In addition, they learned about black holes and the importance of space suits in protecting astronauts as they investigated space. The parents are proud of their children and their achievements. By including the parents in the program, they have a greater understanding of the importance of their children staying in school and the opportunities for careers in space science and technology. For more information on our overall program, the charter school and their other space science related activities, visit their web site, http://www.tccc-ryss.org/solarsys/solarmingrant.htm
Computational Infrastructure for Geodynamics (CIG)
NASA Astrophysics Data System (ADS)
Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.
2004-12-01
Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts and although this approach has proven successful, its strength for solving problems of interest is now starting to show its limitations as we try to share codes and algorithms or when we want to recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to leverage and develop long-term strategic partnerships with open source development efforts within the larger thrusts of scientific computing and geoinformatics. These strategic partnerships are essential as the frontier has moved into multi-scale and multi-physics problems in which many investigators now want to use simulation software for data interpretation, data assimilation, and hypothesis testing.
Physical Analytics: An emerging field with real-world applications and impact
NASA Astrophysics Data System (ADS)
Hamann, Hendrik
2015-03-01
In the past, most information on the internet was originated by humans or computers. However, with the emergence of cyber-physical systems, vast amounts of data are now being created by sensors from devices, machines, etc., digitizing the physical world. While cyber-physical systems are the subject of active research around the world, the vast amount of actual data generated from the physical world has so far attracted little attention from the engineering and physics community. In this presentation we use examples to highlight the opportunities in this new subject of ``Physical Analytics'' for highly inter-disciplinary research (including physics, engineering and computer science), which aims at understanding real-world physical systems by leveraging cyber-physical technologies. More specifically, the convergence of the physical world with the digital domain allows applying physical principles to everyday problems in a much more effective and informed way than was possible in the past. Very much as traditional applied physics and engineering has made enormous advances and changed our lives by making detailed measurements to understand the physics of an engineered device, we can now apply the same rigor and principles to understand large-scale physical systems. In the talk we first present a set of ``configurable'' enabling technologies for Physical Analytics including ultralow power sensing and communication technologies, physical big data management technologies, numerical modeling for physical systems, machine learning based physical model blending, and physical analytics based automation and control. Then we discuss in detail several concrete applications of Physical Analytics ranging from energy management in buildings and data centers, environmental sensing and controls, and precision agriculture to renewable energy forecasting and management.
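One way to read "machine learning based physical model blending" is as a residual-correction scheme: a simple physical model supplies a baseline prediction and a learned model corrects its error. The sketch below is our own illustration of that pattern with scikit-learn on synthetic data, not IBM's implementation; the building-load example and coefficients are invented:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)

    # Synthetic "building load": a crude physical model says energy use is
    # proportional to the indoor/outdoor temperature difference.
    outdoor_temp = rng.uniform(-5.0, 35.0, 500)
    occupancy = rng.uniform(0.0, 1.0, 500)
    true_load = 2.0 * np.abs(21.0 - outdoor_temp) + 15.0 * occupancy + rng.normal(0.0, 1.0, 500)

    physical_baseline = 2.0 * np.abs(21.0 - outdoor_temp)   # physics-only estimate
    residual = true_load - physical_baseline                # what the physics misses

    # Learn the residual from features the physical model ignores.
    features = np.column_stack([outdoor_temp, occupancy])
    blender = RandomForestRegressor(n_estimators=100, random_state=0).fit(features, residual)

    blended = physical_baseline + blender.predict(features)
    print("physics-only RMSE:", np.sqrt(np.mean((true_load - physical_baseline) ** 2)))
    print("blended RMSE:     ", np.sqrt(np.mean((true_load - blended) ** 2)))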
NASA Astrophysics Data System (ADS)
Pazmino, J.
2012-06-01
(Abstract only) New York City in the late 20th century rose to be a planetary capital for the sciences, not just astronomy. This growth was mainly in the academic sector but a parallel growth occurred in the public and home field. With the millennium crossing, scientists in New York agitated for a celebration of the City as a place for a thriving science culture. In 2008 they began World Science Festival. 2011 is the fourth running, on June 1-5, following the AAVSO/AAS meetings. World Science Festival was founded by Dr. Brian Greene, Columbia University, and is operated through the World Science Foundation. The Festival is "saturation science" all over Manhattan in a series of lectures, shows, exhibits, performances. It is staged in "science" venues like colleges and musea, but also in off-science spaces like theaters and galleries. It is a blend from hard science, with lectures like those by us astronomers, to science-themed works of art, dance, music. Events are fitted for the public, either for free or a modest fee. While almost all events are on Manhattan, effort has been made to geographically disperse them, even to the outer boroughs. The grand finale of World Science Festival is a street fair in Washington Square. Science centers in booths, tents, and pavilions highlight their work. In past years this fair drew 100,000 to 150,000 visitors. The entire Festival attracts about a quarter-million attendees. NYSkies is a proud participant at the Washington Square fair. It interprets the "Earth to the Universe" display, debuting during IYA-2009. Attendance at "Earth..." on just the day of the fair plausibly is half of all visitors in America. The presentation shows the scale and scope of World Science Festival, its relation to the City, and how our astronomers work with it.
NASA Astrophysics Data System (ADS)
Pazmino, John
2011-05-01
New York City in the late 20th century rose to be a planetary capital for the sciences, not just astronomy. This growth was mainly in the academic sector, but a parallel growth occurred in the public and home field. With the millennium crossing, scientists in New York agitated for a celebration of the City as a place for a thriving science culture. In 2008 they began World Science Festival. 2011 is the fourth running, on June 1st-5th, following AAVSO/AAS. World Science Festival was founded by Dr Brian Greene, Columbia University, and is operated through the World Science Foundation. The Festival is 'saturation science' all over Manhattan in a series of lectures, shows, exhibits, performances. It is staged in 'science' venues like colleges and musea, but also in off-science spaces like theaters and galleries. It is a blend ranging from hard science, with lectures like those by us astronomers, to science-themed works of art, dance, and music. Events are fitted for the public, either for free or a modest fee. While almost all events are on Manhattan, effort is made to geographically disperse them, even to the outer boroughs. The grand finale of World Science Festival is a street fair in Washington Square. Science centers in booths, tents, and pavilions highlight their work. In past years this fair drew 100,000 to 150,000 visitors. The entire Festival attracts about a quarter million. NYSkies is a proud participant at the Washington Square fair. It interprets the 'Earth to the Universe' display, debuting during IYA-2009. Attendance at 'Earth ...' on just the day of the fair plausibly is half of all visitors in America. The presentation shows the scale and scope of World Science Festival, its relation to the City, and how our astronomers work with it.
The Path from Large Earth Science Datasets to Information
NASA Astrophysics Data System (ADS)
Vicente, G. A.
2013-12-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is one of the major Science Mission Directorate (SMD) facilities for archiving and distribution of Earth Science remote sensing data, products and services. This virtual portal provides convenient access to Atmospheric Composition and Dynamics, Hydrology, Precipitation, Ozone, and model-derived datasets (generated by GSFC's Global Modeling and Assimilation Office), the North American Land Data Assimilation System (NLDAS) and the Global Land Data Assimilation System (GLDAS) data products (both generated by GSFC's Hydrological Sciences Branch). This presentation demonstrates various tools and computational technologies developed in the GES DISC to manage the huge volume of data and products acquired from various missions and programs over the years. It explores approaches to archive, document, distribute, access and analyze Earth Science data and information, and it addresses the technical and scientific issues, governance, and user support problems faced by scientists in need of multi-disciplinary datasets. It also discusses data and product metrics, user distribution profiles and lessons learned through interactions with the science communities around the world. Finally, it demonstrates some of the most used data and product visualization and analysis tools developed and maintained by the GES DISC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
March, N.B.; Bishop, G.
1994-12-31
Georgia school teachers served eight- to ten-day internships as research colleagues on St. Catherine's Island, Georgia. Interns monitored daily nesting activity, evaluated possible nests, validated egg chambers, screened the nests, and monitored each nest daily, assessing hatching success by excavation upon emergence of hatchlings. The real-world, hands-on holistic field experience immersed school teachers in the problems of executing a natural history conservation project integrating scientific content and methodology, mathematical analysis, and computer documentation. Outcomes included increased scientific inquiry, reduced science anxiety, heightened self-confidence, and enhanced credibility with students and colleagues. This educational model is applicable to many areas and problems.
Comedy, Yolanda L.; Gilbert, Juan E.; Pun, Suzie H.
2017-01-01
Inventors help solve all kinds of problems. The AAAS-Lemelson Invention Ambassador program celebrates inventors who have an impact on global challenges, making our communities and the globe better, one invention at a time. In this paper, we introduce two of these invention ambassadors: Dr. Suzie Pun and Dr. Juan Gilbert. Dr. Suzie Pun is the Robert F. Rushmer Professor of Bioengineering, an adjunct professor of chemical engineering, and a member of the Molecular Engineering and Sciences Institute at the University of Washington. Dr. Juan Gilbert is the Andrew Banks Family Preeminence Endowed Professor and chair of the Computer & Information Science & Engineering Department at the University of Florida. Both have a passion for solving problems and are dedicated to teaching their students to change the world. PMID:29527271
NASA Astrophysics Data System (ADS)
Barry, Edward
2010-02-01
Interdisciplinary science has been a hot topic for more than a decade, with increasing numbers of researchers working on projects that do not fit into neat departmental boxes like "physics" or "biology". Yet despite this increased activity, the structures in place to support these interdisciplinary scientists - including research grants and training for PhD students - have sometimes lagged behind. One programme that aims to help fill this gap for students of biomedical, physical and computational sciences is the Interfaces Initiative, a joint project of the Howard Hughes Medical Institute and the US National Institute of Biomedical Imaging and Bioengineering. Physics World talked to a current Interfaces participant, Edward Barry, who is finishing his PhD in biology-related condensed-matter physics at Brandeis University in Massachusetts.
A guide to the National Space Science Data Center
NASA Technical Reports Server (NTRS)
1990-01-01
This is the second edition of a document that was published to acquaint space and Earth research scientists with an overview of the services offered by the NSSDC. As previously stated, the NSSDC was established by NASA to be the long term archive for data from its space missions. However, the NSSDC has evolved into an organization that provides a multitude of services for scientists throughout the world. Brief articles are presented which discuss these services. At the end of each article is the name, address, and telephone number of the person to contact for additional information. Online Information and Data Systems, Electronic Access, Offline Data Archive, Value Added Services, Mass Storage Activities, and Computer Science Research are all detailed.
Welch, M C; Kwan, P W; Sajeev, A S M
2014-10-01
Agent-based modelling has proven to be a promising approach for developing rich simulations of complex phenomena that provide decision support functions across a broad range of areas including the biological, social and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and the dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries. Copyright © 2014 International Atomic Energy Agency 2014. Published by Elsevier B.V. All rights reserved.
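A stripped-down, CPU-only sketch of the individual-based stochastic lifecycle and dispersal idea is given below; this is our illustration in Python/NumPy, not the authors' CUDA/GIS implementation, and the stage names, daily probabilities, and one-dimensional landscape are assumptions:

    import numpy as np

    rng = np.random.default_rng(42)

    # Lifecycle stage per simulated fly: 0=egg, 1=larva, 2=pupa, 3=adult.
    N = 100_000
    stage = np.zeros(N, dtype=np.int8)
    alive = np.ones(N, dtype=bool)
    x = rng.uniform(0.0, 100.0, N)   # position in km on a toy 1-D landscape

    P_ADVANCE = np.array([0.20, 0.10, 0.08, 0.00])   # daily chance of reaching the next stage
    P_SURVIVE = np.array([0.95, 0.93, 0.97, 0.98])   # daily survival per stage

    def step_one_day(stage, alive, x):
        """Advance every agent by one day: survival, stage transition, dispersal."""
        alive &= rng.random(N) < P_SURVIVE[stage]
        advance = alive & (rng.random(N) < P_ADVANCE[stage])
        stage[advance] += 1
        adults = alive & (stage == 3)
        x[adults] += rng.normal(0.0, 2.0, adults.sum())   # gravid-female dispersal, km/day
        return stage, alive, x

    for _ in range(30):
        stage, alive, x = step_one_day(stage, alive, x)

    print("alive after 30 days:", int(alive.sum()),
          "adults:", int((alive & (stage == 3)).sum()))

On a GPU, the same per-agent survival, transition, and dispersal draws would be mapped to one CUDA thread per agent, which is where the massively data-parallel speed-up comes from.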
Charlton, Bruce G
2007-01-01
The four science Nobel prizes (physics, chemistry, medicine/physiology and economics) have performed extremely well as a method of recognizing the highest level of achievement. The prizes exist primarily to honour individuals but also have a very important function in science generally. In particular, the institutions and nations which have educated, nurtured or supported many Nobel laureates can be identified as elite in world science. However, the limited range of subjects and a maximum of 12 laureates per year mean that many major scientific achievements remain un-recognized; and relatively few universities can gather sufficient Nobel-credits to enable a precise estimate of their different levels of quality. I advocate that the Nobel committee should expand the number of Nobel laureates and Prize categories as a service to world science. (1) There is a large surplus of high quality prize candidates deserving of recognition. (2) There has been a vast expansion of research with a proliferation of major sub-disciplines in the existing categories. (3) Especially, the massive growth of the bio-medical sciences has created a shortage of Nobel recognition in this area. (4) Whole new fields of major science have emerged. I therefore suggest that the maximum of three laureates per year should always be awarded in the categories of physics, chemistry and economics, even when these prizes are for diverse and un-related achievements; that the number of laureates in the 'biology' category of physiology or medicine should be increased to six or preferably nine per year; and that two new Prize categories should be introduced to recognize achievements in mathematics and computing science. Together, these measures could increase the science laureates from a maximum of 12 to a minimum of 24, and increase the range of scientific coverage. In future, the Nobel committee should also officially allocate proportionate credit to institutions for each laureate, and a historical task force could also award institutional credit for past prizes.
2011-11-01
Presents Arthur C. Graesser as the 2011 winner of the American Psychological Association Award for Distinguished Contributions of Applications of Psychology to Education and Training. "As a multifaceted psychologist, cognitive engineer of useful education and training technologies, and mentor of new talent for the world of applied and translational cognitive science, Arthur C. Graesser is the perfect role model, showing how a strong scholar and intellect can shape both research and practice. His work is a mix of top-tier scholarship in psychology, education, intelligent systems, and computational linguistics. He combines cognitive science excellence with bold use of psychological knowledge and intelligent systems to design new generations of learning opportunities and to help lay the foundation for a translational science of learning." (PsycINFO Database Record (c) 2011 APA, all rights reserved).
Data issues in the life sciences.
Thessen, Anne E; Patterson, David J
2011-01-01
We review technical and sociological issues facing the Life Sciences as they transform into more data-centric disciplines - the "Big New Biology". Three major challenges are: 1) lack of comprehensive standards; 2) lack of incentives for individual scientists to share data; 3) lack of appropriate infrastructure and support. Technological advances with standards, bandwidth, distributed computing, exemplar successes, and a strong presence in the emerging world of Linked Open Data are sufficient to conclude that technical issues will be overcome in the foreseeable future. While the community is motivated to have a shared open infrastructure and data pool, and is pressured by funding agencies to move in this direction, the sociological issues determine progress. Major sociological issues include our lack of understanding of the heterogeneous data cultures within the Life Sciences, and the impediments to progress include a lack of incentives to build appropriate infrastructures into projects and institutions or to encourage scientists to make data openly available.
Generating Mosaics of Astronomical Images
NASA Technical Reports Server (NTRS)
Bergou, Attila; Berriman, Bruce; Good, John; Jacob, Joseph; Katz, Daniel; Laity, Anastasia; Prince, Thomas; Williams, Roy
2005-01-01
"Montage" is the name of a service of the National Virtual Observatory (NVO), and of software being developed to implement the service via the World Wide Web. Montage generates science-grade custom mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. "Science-grade" in this context signifies that terrestrial and instrumental features are removed from images in a way that can be described quantitatively. "Custom" refers to user-specified parameters of projection, coordinates, size, rotation, and spatial sampling. The greatest value of Montage is expected to lie in its ability to analyze images at multiple wavelengths, delivering them on a common projection, coordinate system, and spatial sampling, and thereby enabling further analysis as though they were part of a single, multi-wavelength image. Montage will be deployed as a computation-intensive service through existing astronomy portals and other Web sites. It will be integrated into the emerging NVO architecture and will be executed on the TeraGrid. The Montage software will also be portable and publicly available.
Mathematics & Science in the Real World.
ERIC Educational Resources Information Center
Thorson, Annette, Ed.
2000-01-01
This issue of ENC Focus is organized around the theme of mathematics and science in the real world. It intends to provide teachers with practical resources and suggestions for science and mathematics education. Featured articles include: (1) "Real-World Learning: A Necessity for the Success of Current Reform Efforts" (Robert E. Yager); (2)…
Plants and Medicines. Third World Science.
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
Science Spots AR: A Platform for Science Learning Games with Augmented Reality
ERIC Educational Resources Information Center
Laine, Teemu H.; Nygren, Eeva; Dirin, Amir; Suk, Hae-Jung
2016-01-01
Lack of motivation and of real-world relevance have been identified as reasons for low interest in science among children. Game-based learning and storytelling are prominent methods for generating intrinsic motivation in learning. Real-world relevance requires connecting abstract scientific concepts with the real world. This can be done by…
Carrying Loads on Heads. Third World Science.
ERIC Educational Resources Information Center
Jones, Natalie; Hughes, Wyn
This unit, developed by the Third World Science Project, is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the: boundless fascination of the natural world; knowledge, skills, and expertise possessed by men/women everywhere;…
A study on state of Geospatial courses in Indian Universities
NASA Astrophysics Data System (ADS)
Shekhar, S.
2014-12-01
Today the world is dominated by three technologies: nanotechnology, biotechnology, and geospatial technology. This creates a huge demand for experts in each field, both for disseminating knowledge and for innovative research. The prime need, therefore, is to train the existing fraternity to gain progressive knowledge in these technologies and to impart the same to the student community. Geospatial technology faces problems that the other two technologies do not because of its interdisciplinary, multi-disciplinary nature. It attracts students and mid-career professionals from various disciplines including Physics, Computer Science, Engineering, Geography, Geology, Agriculture, Forestry, Town Planning and so on. Hence there is always competition to grab and stabilize their position. Students of Master's degree programmes in geospatial science face two types of problems. The first is the lack of a unique identity in the academic field: they are neither exempted from the National Eligibility Test for Lectureship nor given the opportunity to take that exam in geospatial science. The second is differential treatment by the industrial world: students are either given low-grade jobs or are poorly paid for their work. This raises serious questions about the future of such courses in the universities and their recognition in the academic and industrial worlds. The universities should make these courses more job-oriented in consultation with industry, and industry should come forward to share its demands and requirements with the universities, so that the necessary changes in the curriculum can be made to meet industrial requirements.
Semantic Web technologies for the big data in life sciences.
Wu, Hongyan; Yamaguchi, Atsuko
2014-08-01
The life sciences field is entering an era of big data with the breakthroughs of science and technology. More and more big data-related projects and activities are being performed in the world. Life sciences data generated by new technologies are continuing to grow in not only size but also variety and complexity, with great speed. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources and even across disciplines is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are two principal issues mainly discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web, aiming to provide information for not only humans but also computers to semantically process large-scale data. The paper presents a survey of big data in life sciences, big data related projects and Semantic Web technologies. The paper introduces the main Semantic Web technologies and their current situation, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps to understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for the big data in life sciences.
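A small, self-contained illustration of the core Semantic Web machinery the survey discusses: heterogeneous statements expressed as RDF triples and queried with SPARQL. This is our own sketch using the rdflib library and a made-up gene/protein example, not code from the paper:

    from rdflib import Graph

    # Toy statements from what could be two different sources, both expressed in Turtle.
    turtle_data = """
    @prefix ex: <http://example.org/> .

    ex:TP53   a ex:Gene ;    ex:encodes ex:P04637 .
    ex:P04637 a ex:Protein ; ex:involvedIn ex:ApoptosisPathway .
    """

    g = Graph()
    g.parse(data=turtle_data, format="turtle")

    # One SPARQL query spans what could have come from separate gene and pathway
    # databases, because both were mapped onto shared URIs.
    query = """
    PREFIX ex: <http://example.org/>
    SELECT ?gene ?pathway WHERE {
        ?gene a ex:Gene ;
              ex:encodes ?protein .
        ?protein ex:involvedIn ?pathway .
    }
    """

    for row in g.query(query):
        print(f"{row.gene} participates (via its protein) in {row.pathway}")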
Argonne Leadership Computing Facility 2011 annual report: Shaping future supercomputing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papka, M.; Messina, P.; Coffey, R.
The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to implement those algorithms. The Data Analytics and Visualization Team lends expertise in tools and methods for high-performance, post-processing of large datasets, interactive data exploration, batch visualization, and production visualization. The Operations Team ensures that system hardware and software work reliably and optimally; system tools are matched to the unique system architectures and scale of ALCF resources; the entire system software stack works smoothly together; and I/O performance issues, bug fixes, and requests for system software are addressed. The User Services and Outreach Team offers frontline services and support to existing and potential ALCF users. The team also provides marketing and outreach to users, DOE, and the broader community.
ERIC Educational Resources Information Center
University Coll. of North Wales, Bangor (United Kingdom). School of Education.
The Third World Science Project (TWSP) is designed to add a multicultural element to existing science syllabi (for students aged 11-16) in the United Kingdom. The project seeks to develop an appreciation of the boundless fascination of the natural world; the knowledge, skills, and expertise possessed by men and women everywhere; and the application of knowledge…
den Besten, Matthijs; Thomas, Arthur J; Schroeder, Ralph
2009-04-22
It is often said that the life sciences are transforming into an information science. As laboratory experiments are starting to yield ever increasing amounts of data and the capacity to deal with those data is catching up, an increasing share of scientific activity is seen to be taking place outside the laboratories, sifting through the data and modelling "in silico" the processes observed "in vitro." The transformation of the life sciences and similar developments in other disciplines have inspired a variety of initiatives around the world to create technical infrastructure to support the new scientific practices that are emerging. The e-Science programme in the United Kingdom and the NSF Office for Cyberinfrastructure are examples of these. In Switzerland there have been no such national initiatives. Yet, this has not prevented scientists from exploring the development of similar types of computing infrastructures. In 2004, a group of researchers in Switzerland established a project, SwissBioGrid, to explore whether Grid computing technologies could be successfully deployed within the life sciences. This paper presents their experiences as a case study of how the life sciences are currently operating as an information science and presents the lessons learned about how existing institutional and technical arrangements facilitate or impede this operation. SwissBioGrid gave rise to two pilot projects: one for proteomics data analysis and the other for high-throughput molecular docking ("virtual screening") to find new drugs for neglected diseases (specifically, for dengue fever). The proteomics project was an example of a data management problem, applying many different analysis algorithms to Terabyte-sized datasets from mass spectrometry, involving comparisons with many different reference databases; the virtual screening project was more a purely computational problem, modelling the interactions of millions of small molecules with a limited number of protein targets on the coat of the dengue virus. Both present interesting lessons about how scientific practices are changing when they tackle the problems of large-scale data analysis and data management by means of creating a novel technical infrastructure. In the experience of SwissBioGrid, data intensive discovery has a lot to gain from close collaboration with industry and harnessing distributed computing power. Yet the diversity in life science research implies only a limited role for generic infrastructure; and the transience of support means that researchers need to integrate their efforts with others if they want to sustain the benefits of their success, which are otherwise lost.
NASA Astrophysics Data System (ADS)
Vasant, Pandian; Barsoum, Nader
2008-10-01
Many engineering, science, information technology and management optimization problems can be considered nonlinear programming real-world problems in which all or some of the parameters and variables involved are uncertain in nature. These can only be quantified using intelligent computational techniques such as evolutionary computation and fuzzy logic. The main objective of this research paper is to solve a nonlinear fuzzy optimization problem in which the technological coefficients in the constraints are fuzzy numbers represented by logistic membership functions, using a hybrid evolutionary optimization approach. To explore the applicability of the present study, a numerical example is considered to determine the production plan for the decision variables and the profit of the company.
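For reference, a generic logistic membership function of the kind mentioned above can be written as (this is a standard textbook form; the exact parameterisation and notation used in the paper are not reproduced here):
\[
\mu_{\tilde a}(x) \;=\; \frac{1}{1 + e^{\alpha\,(x - a_0)}}, \qquad \alpha > 0,
\]
where \(a_0\) is the coefficient value at which membership equals 0.5 and \(\alpha\) controls how sharply membership falls off, i.e., how vague or crisp the fuzzy coefficient is.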
A new decision sciences for complex systems.
Lempert, Robert J
2002-05-14
Models of complex systems can capture much useful information but can be difficult to apply to real-world decision-making because the type of information they contain is often inconsistent with that required for traditional decision analysis. New approaches, which use inductive reasoning over large ensembles of computational experiments, now make possible systematic comparison of alternative policy options using models of complex systems. This article describes Computer-Assisted Reasoning, an approach to decision-making under conditions of deep uncertainty that is ideally suited to applying models of complex systems to policy analysis. The article demonstrates the approach on the policy problem of global climate change, with a particular focus on the role of technology policies in a robust, adaptive strategy for greenhouse gas abatement.
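The ensemble-based comparison described above can be illustrated with a small, entirely hypothetical sketch: a toy cost function stands in for a complex-systems simulation, candidate abatement policies are scored across a grid of deeply uncertain parameters, and policies are compared by worst-case regret rather than by a single best-estimate run. All names and numbers are invented.

```python
# Toy ensemble-of-experiments comparison of abatement policies by regret.
import itertools

def climate_cost(policy_stringency, climate_sensitivity, tech_progress):
    """Stand-in for a complex-systems model run: damages plus abatement cost
    under one combination of uncertain parameters (arbitrary units)."""
    damages = climate_sensitivity * (1.0 - policy_stringency) * 100.0
    abatement = policy_stringency * (50.0 / tech_progress)
    return damages + abatement

policies = {"do-little": 0.1, "moderate": 0.5, "aggressive": 0.9}

# Ensemble of computational experiments over deeply uncertain inputs.
scenarios = list(itertools.product(
    [0.5, 1.0, 2.0, 4.0],   # climate sensitivity multipliers
    [0.5, 1.0, 2.0],        # rates of technological progress
))

regret = {}
for name, stringency in policies.items():
    worst = 0.0
    for sens, tech in scenarios:
        cost = climate_cost(stringency, sens, tech)
        best = min(climate_cost(s, sens, tech) for s in policies.values())
        worst = max(worst, cost - best)   # regret of this policy in this scenario
    regret[name] = worst

print("worst-case regret by policy:", regret)
# A robust, adaptive strategy is one whose worst-case regret stays small
# across the whole ensemble, not one optimized for a single scenario.
```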
Using virtualization to protect the proprietary material science applications in volunteer computing
NASA Astrophysics Data System (ADS)
Khrapov, Nikolay P.; Rozen, Valery V.; Samtsevich, Artem I.; Posypkin, Mikhail A.; Sukhomlin, Vladimir A.; Oganov, Artem R.
2018-04-01
USPEX is a world-leading software package for computational materials design. In essence, USPEX splits a simulation into a large number of workunits that can be processed independently. This scheme ideally fits the desktop grid architecture. Workunit processing is done by a simulation package aimed at energy minimization. Many such packages are proprietary and must be protected from unauthorized access when running on a volunteer PC. In this paper we present an original approach based on virtualization. In a nutshell, the proprietary code and input files are stored in an encrypted folder and run inside a virtual machine image that is also password protected. The paper describes this approach in detail and discusses its application in the USPEX@home volunteer project.
The 6th International Conference on Computer Science and Computational Mathematics (ICCSCM 2017)
NASA Astrophysics Data System (ADS)
2017-09-01
The ICCSCM 2017 (The 6th International Conference on Computer Science and Computational Mathematics) aimed to provide a platform to discuss computer science and mathematics-related issues, including Algebraic Geometry, Algebraic Topology, Approximation Theory, Calculus of Variations, Category Theory; Homological Algebra, Coding Theory, Combinatorics, Control Theory, Cryptology, Geometry, Difference and Functional Equations, Discrete Mathematics, Dynamical Systems and Ergodic Theory, Field Theory and Polynomials, Fluid Mechanics and Solid Mechanics, Fourier Analysis, Functional Analysis, Functions of a Complex Variable, Fuzzy Mathematics, Game Theory, General Algebraic Systems, Graph Theory, Group Theory and Generalizations, Image Processing, Signal Processing and Tomography, Information Fusion, Integral Equations, Lattices, Algebraic Structures, Linear and Multilinear Algebra; Matrix Theory, Mathematical Biology and Other Natural Sciences, Mathematical Economics and Financial Mathematics, Mathematical Physics, Measure Theory and Integration, Neutrosophic Mathematics, Number Theory, Numerical Analysis, Operations Research, Optimization, Operator Theory, Ordinary and Partial Differential Equations, Potential Theory, Real Functions, Rings and Algebras, Statistical Mechanics, Structure of Matter, Topological Groups, Wavelets and Wavelet Transforms, 3G/4G Network Evolutions, Ad-Hoc, Mobile, Wireless Networks and Mobile Computing, Agent Computing & Multi-Agent Systems, all topics related to Image/Signal Processing, all topics related to Computer Networks, topics related to the ISO SC-27 and SC-17 standards, topics related to PKI (Public Key Infrastructures), Artificial Intelligence (A.I.) & Pattern/Image Recognition, Authentication/Authorization Issues, Biometric Authentication and Algorithms, CDMA/GSM Communication Protocols, Combinatorics, Graph Theory, and Analysis of Algorithms, Cryptography and Foundations of Computer Security, Database (D.B.) Management & Information Retrieval, Data Mining, Web Image Mining, & Applications, Defining Spectrum Rights and Open Spectrum Solutions, E-Commerce, Ubiquitous, and RFID Applications, Fingerprint/Hand/Biometrics Recognition and Technologies, Foundations of High-Performance Computing, IC-Card Security, OTP, and Key Management Issues, IDS/Firewall, Anti-Spam Mail, and Anti-Virus Issues, Mobile Computing for E-Commerce, Network Security Applications, Neural Networks and Biomedical Simulations, Quality of Service and Communication Protocols, Quantum Computing, Coding, and Error Controls, Satellite and Optical Communication Systems, Theory of Parallel Processing and Distributed Computing, Virtual Visions, 3-D Object Retrieval, & Virtual Simulations, Wireless Access Security, etc. The success of ICCSCM 2017 is reflected in the papers received from authors in several countries around the world, which allowed a highly multinational and multicultural exchange of ideas and experience. The accepted papers of ICCSCM 2017 are published in this book. Please check http://www.iccscm.com for further news. A conference such as ICCSCM 2017 can only succeed through a team effort, so we herewith thank the International Technical Committee and the Reviewers for their efforts in the review process as well as their valuable advice. We are thankful to all those who contributed to the success of ICCSCM 2017. The Secretary
Multimedia: The Brave New World of Buckytubes | ScienceCinema
In a talk titled "The Brave New World of Buckytubes," Smalley discusses the basic science, analysis, and assembly of buckytubes for solving real-world technological problems.
NASA Astrophysics Data System (ADS)
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-12-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were much less involved in programming activities, whereas male students were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of playing with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.
Lightwave: An interactive estimation of indirect illumination using waves of light
NASA Astrophysics Data System (ADS)
Robertson, Michael
With the growth of computers and technology, so too has grown the desire to accurately recreate our world using computer graphics. However, our world is very complex and in many ways beyond our comprehension. Therefore, in order to perform this task, we must draw on multiple disciplines and areas of research, including physics, mathematics, optics, geology, and many more, to at least approximate the world around us. The applications are plentiful as well, including the use of graphics in entertainment such as movies and games, in science such as weather forecasts and simulations, in medicine with body scans, and in architecture, design, and many other areas. In order to recreate the world around us, an important task is to accurately recreate the way light travels and affects the objects we see. Rendering lighting has been a heavily researched area since the 1970s and has grown more sophisticated over the years. Until recent developments in technology, realistic lighting of scenes was only achievable offline, taking seconds to hours or more to create a single image; due to advances in graphics technology, however, realistic lighting can now be done in real time. An important aspect of realistic lighting is the inclusion of indirect illumination. To achieve real-time rendering with indirect illumination, we must make trade-offs between scientific accuracy and performance, although, as will be discussed later, scientific accuracy may not be necessary after all.
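For context, the accuracy/performance trade-off mentioned above centres on the recursive integral term of the standard rendering equation (a textbook result, not quoted from this thesis):
\[
L_o(\mathbf{x}, \omega_o) \;=\; L_e(\mathbf{x}, \omega_o)
  \;+\; \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\,
         L_i(\mathbf{x}, \omega_i)\, (\omega_i \cdot \mathbf{n})\, d\omega_i,
\]
where the incoming radiance \(L_i\) itself depends on outgoing radiance elsewhere in the scene; indirect-illumination techniques approximate this integral rather than evaluating it exactly, which is where real-time methods trade accuracy for speed.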
The (human) science of medical virtual learning environments
Stone, Robert J.
2011-01-01
The uptake of virtual simulation technologies in both military and civilian surgical contexts has been both slow and patchy. The failure of the virtual reality community in the 1990s and early 2000s to deliver affordable and accessible training systems stems not only from an obsessive quest to develop the ‘ultimate’ in so-called ‘immersive’ hardware solutions, from head-mounted displays to large-scale projection theatres, but also from a comprehensive lack of attention to the needs of the end users. While many still perceive the science of simulation to be defined by technological advances, such as computing power, specialized graphics hardware, advanced interactive controllers, displays and so on, the true science underpinning simulation—the science that helps to guarantee the transfer of skills from the simulated to the real—is that of human factors, a well-established discipline that focuses on the abilities and limitations of the end user when designing interactive systems, as opposed to the more commercially explicit components of technology. Based on three surgical simulation case studies, the importance of a human factors approach to the design of appropriate simulation content and interactive hardware for medical simulation is illustrated. The studies demonstrate that it is unnecessary to pursue real-world fidelity in all instances in order to achieve psychological fidelity—the degree to which the simulated tasks reproduce and foster knowledge, skills and behaviours that can be reliably transferred to real-world training applications. PMID:21149363
Noel, Jean-Paul; Blanke, Olaf; Serino, Andrea
2018-06-06
Integrating information across sensory systems is a critical step toward building a cohesive representation of the environment and one's body and, as illustrated by numerous illusions, scaffolds subjective experience of the world and self. In recent years, classic principles of multisensory integration elucidated in the subcortex have been translated into the language of statistical inference understood by the neocortical mantle. Most importantly, a mechanistic systems-level description of multisensory computations via probabilistic population coding and divisive normalization is actively being put forward. In parallel, by describing and understanding bodily illusions, researchers have suggested multisensory integration of bodily inputs within the peripersonal space as a key mechanism in bodily self-consciousness. Importantly, certain aspects of bodily self-consciousness, although still very much a minority, have recently been cast in the light of modern computational understandings of multisensory integration. In doing so, we argue, the field of bodily self-consciousness may borrow mechanistic descriptions regarding the neural implementation of inference computations outlined by the multisensory field. This computational approach, leveraging the understanding of multisensory processes more generally, promises to advance scientific comprehension of one of the most mysterious questions puzzling humankind, namely how our brain creates the experience of a self in interaction with the environment. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
Topics in computational physics
NASA Astrophysics Data System (ADS)
Monville, Maura Edelweiss
Computational Physics spans a broad range of applied fields extending beyond the borders of traditional physics tracks. Demonstrated flexibility, and the capability to switch to a new project and pick up the basics of the new field quickly, are among the essential requirements for a computational physicist. In line with the above-mentioned prerequisites, my thesis describes the development and results of two computational projects belonging to two different applied science areas. The first project is a Materials Science application. It is a prescription for an innovative nano-fabrication technique that is built out of two other known techniques. The preliminary results of the simulation of this novel nano-patterning fabrication method show an average improvement of roughly 18% over the single techniques it draws on. The second project is a Homeland Security application aimed at preventing smuggling of nuclear material at ports of entry. It is concerned with a simulation of an active material interrogation system based on the analysis of induced photo-nuclear reactions. This project consists of a preliminary evaluation of the photo-fission implementation in the more robust radiation transport Monte Carlo codes, followed by the customization and extension of MCNPX, a Monte Carlo code developed at Los Alamos National Laboratory, and MCNP-PoliMi. The final stage of the project consists of testing the interrogation system against some real-world scenarios in order to determine the system's reliability, material discrimination power, and limitations.
Towards a More Authentic Science Curriculum: The contribution of out-of-school learning
NASA Astrophysics Data System (ADS)
Braund, Martin; Reiss, Michael
2006-10-01
In many developed countries of the world, pupil attitudes to school science decline progressively across the age range of secondary schooling while fewer students are choosing to study science at higher levels and as a career. Responses to these developments have included proposals to reform the curriculum, pedagogy, and the nature of pupil discussion in science lessons. We support such changes but argue that far greater use needs to be made of out-of-school sites in the teaching of science. Such usage will result in a school science education that is more valid and more motivating. We present an “evolutionary model” of science teaching that looks at where learning and teaching take place, and draws together thinking about the history of science and developments in the nature of learning over the past 100 years or so. Our contention is that laboratory-based school science teaching needs to be complemented by out-of-school science learning that draws on the actual world (e.g., through fieldtrips), the presented world (e.g., in science centres, botanic gardens, zoos and science museums), and the virtual worlds that are increasingly available through information technologies.
Practical skills of the future innovator
NASA Astrophysics Data System (ADS)
Kaurov, Vitaliy
2015-03-01
Physics graduates face, and often are disoriented by, the complex and turbulent world of startups, incubators, emergent technologies, big data, social network engineering, and so on. In order to build curricula that foster the skills necessary to navigate this world, we look at the experience of the Wolfram Science Summer School, which has gathered international students annually for more than a decade. We examine example projects and trace the development of skills such as innovative thinking, data mining, machine learning, cloud technologies, device connectivity and the Internet of Things, network analytics, geo-information systems, formalized computable knowledge, and adjacent applied research skills from graph theory to image processing and beyond. This should give solid ideas to educators who will build standard curricula adapted for innovation and entrepreneurship education.
The Early Years of Molecular Dynamics and Computers at UCRL, LRL, LLL, and LLNL
NASA Astrophysics Data System (ADS)
Mansigh Karlsen, Mary Ann
I'm the young woman in the picture shown in Fig. 12.1 that appeared with the invitation to the Symposium to celebrate Berni Alder's ninetieth birthday. I worked with Berni for over 25 years on the computer programs that provided the data he needed to write the fifteen papers published in scientific journals on Studies in Molecular Dynamics. My name appears at the end of each one thanking me for computer support. It has been interesting to look on the Internet to find my name in the middle of many foreign languages, including Japanese characters and Russian Cyrillic script. It shows how Berni's work has been of interest to many scientists all over the world from the earliest years. Figure 12.1 was also included with articles written when he received the National Medal of Science from President Obama in 2009…
NASA Astrophysics Data System (ADS)
Knosp, B.; Neely, S.; Zimdars, P.; Mills, B.; Vance, N.
2007-12-01
The Microwave Limb Sounder (MLS) Science Computing Facility (SCF) stores over 50 terabytes of data, has over 240 computer processing hosts, and 64 users from around the world. These resources are spread over three primary geographical locations - the Jet Propulsion Laboratory (JPL), Raytheon RIS, and New Mexico Institute of Mining and Technology (NMT). A need for a grid network system was identified and defined to solve the problem of users competing for finite, and increasingly scarce, MLS SCF computing resources. Using Sun's Grid Engine software, a grid network was successfully created in a development environment that connected the JPL and Raytheon sites, established master and slave hosts, and demonstrated that transfer queues for jobs can work among multiple clusters in the same grid network. This poster will first describe MLS SCF resources and the lessons that were learned in the design and development phase of this project. It will then go on to discuss the test environment and plans for deployment by highlighting benchmarks and user experiences.
NASA Astrophysics Data System (ADS)
Morrison, Foster
2009-06-01
Imagine a story about a stay-at-home mother who, anticipating the departure of her children for college, takes a job at a government agency and by dint of hard work and persistence becomes a world-renowned scientist. This might sound improbable, but it happens to be the true story of Irene K. Fischer, a geodesist and AGU Fellow. How it happened and the way it did is a fascinating and complex story. In 1952, Fischer started working at the U.S. Army Map Service (AMS) in Brookmont, Md. (now part of Bethesda), at a time when computers were large, expensive, and feeble compared with the cheapest desktop personal computers available today. Much computing was still done on slow and noisy mechanical calculators. Artificial satellites, space probes, global positioning systems, and the like were science fiction fantasies.
Discover the Cosmos - Bringing Cutting Edge Science to Schools across Europe
NASA Astrophysics Data System (ADS)
Doran, Rosa
2015-03-01
The fast-growing number of science data repositories is opening enormous possibilities to scientists all over the world. The emergence of citizen science projects is engaging large numbers of citizens globally in scientific discovery. Astronomical research is now a possibility for anyone with a computer and some form of data access. This opens a very interesting and strategic possibility to engage large audiences in the making and understanding of science. From another perspective, it is natural to imagine that soon enough data mining will be an active part of the academic path of university or even secondary school students. The possibility is very exciting, but the road is not very promising: even in the most developed nations, where all schools are equipped with modern ICT facilities, the use of such possibilities is still a rare occurrence. The Galileo Teacher Training Program (GTTP), a legacy of IYA2009, is participating in some of the most emblematic projects funded by the European Commission targeting modern tools, resources and methodologies for science teaching. One of these projects is Discover the Cosmos, which aims to address this issue by empowering educators with the necessary skills to embark on this innovative path: teaching science while doing science.
ERIC Educational Resources Information Center
Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2013-01-01
Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…
Helfer, Peter; Shultz, Thomas R
2014-12-01
The widespread availability of calorie-dense food is believed to be a contributing cause of an epidemic of obesity and associated diseases throughout the world. One possible countermeasure is to empower consumers to make healthier food choices with useful nutrition labeling. An important part of this endeavor is to determine the usability of existing and proposed labeling schemes. Here, we report an experiment on how four different labeling schemes affect the speed and nutritional value of food choices. We then apply decision field theory, a leading computational model of human decision making, to simulate the experimental results. The psychology experiment shows that quantitative, single-attribute labeling schemes have greater usability than multiattribute and binary ones, and that they remain effective under moderate time pressure. The computational model simulates these psychological results and provides explanatory insights into them. This work shows how experimental psychology and computational modeling can contribute to the evaluation and improvement of nutrition-labeling schemes. © 2014 New York Academy of Sciences.
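As an illustration of the modelling approach mentioned above, the following is a minimal decision-field-theory-style sketch, not the authors' actual model: preference states for two food options accumulate from attention-weighted attribute evaluations until one crosses a threshold. All attribute values, noise levels and thresholds are invented.

```python
# Toy decision-field-theory-style simulation of a two-option food choice.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical attribute matrix M: rows = food options, columns = attributes
# (taste, healthiness), scored on an arbitrary 0-1 scale.
M = np.array([
    [0.9, 0.2],   # indulgent snack
    [0.5, 0.8],   # healthier alternative
])
threshold = 2.0    # preference level required to commit to a choice
max_steps = 500

def simulate_choice(attention_to_health=0.5):
    """Run one simulated decision; returns (chosen option index, steps taken)."""
    p = np.zeros(M.shape[0])          # preference states start at zero
    for t in range(1, max_steps + 1):
        # At each moment attention stochastically fixates on one attribute;
        # a clearer label would plausibly raise attention to healthiness.
        attr = rng.choice(2, p=[1 - attention_to_health, attention_to_health])
        valence = M[:, attr]
        # Contrast each option's momentary value against the mean of the others,
        # plus a little accumulation noise.
        p += valence - valence.mean() + rng.normal(0, 0.05, size=p.shape)
        if p.max() >= threshold:
            return int(p.argmax()), t
    return int(p.argmax()), max_steps

choices = [simulate_choice(attention_to_health=0.7)[0] for _ in range(1000)]
print("healthier option chosen in",
      round(100 * sum(c == 1 for c in choices) / len(choices), 1), "% of runs")
```

In this style of model, both the choice proportions and the number of steps to threshold (a proxy for decision time) fall out of the same accumulation process, which is what allows it to reproduce speed and choice effects of different labeling schemes.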
Davis, Bernard D.
2000-01-01
This paper describes the features of the world of science, and it compares that world briefly with that of politics and the law. It also discusses some “postmodern” trends in philosophy and sociology that have been undermining confidence in the objectivity of science and thus have contributed indirectly to public mistrust. The paper includes broader implications of interactions of government and science. PMID:10704471
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willis, D. K.
2016-12-01
High performance computing (HPC) has been a defining strength of Lawrence Livermore National Laboratory (LLNL) since its founding. Livermore scientists have designed and used some of the world's most powerful computers to drive breakthroughs in nearly every mission area. Today, the Laboratory is recognized as a world leader in the application of HPC to complex science, technology, and engineering challenges. Most importantly, HPC has been integral to the National Nuclear Security Administration's (NNSA's) Stockpile Stewardship Program, designed to ensure the safety, security, and reliability of our nuclear deterrent without nuclear testing. A critical factor behind Lawrence Livermore's preeminence in HPC is the ongoing investments made by the Laboratory Directed Research and Development (LDRD) Program in cutting-edge concepts to enable efficient utilization of these powerful machines. Congress established the LDRD Program in 1991 to maintain the technical vitality of the Department of Energy (DOE) national laboratories. Since then, LDRD has been, and continues to be, an essential tool for exploring anticipated needs that lie beyond the planning horizon of our programs and for attracting the next generation of talented visionaries. Through LDRD, Livermore researchers can examine future challenges, propose and explore innovative solutions, and deliver creative approaches to support our missions. The present scientific and technical strengths of the Laboratory are, in large part, a product of past LDRD investments in HPC. Here, we provide seven examples of LDRD projects from the past decade that have played a critical role in building LLNL's HPC, computer science, mathematics, and data science research capabilities, and describe how they have impacted LLNL's mission.
Integrating Mathematics, Science, and Language Arts Instruction Using the World Wide Web.
ERIC Educational Resources Information Center
Clark, Kenneth; Hosticka, Alice; Kent, Judi; Browne, Ron
1998-01-01
Addresses issues of access to World Wide Web sites, mathematics and science content-resources available on the Web, and methods for integrating mathematics, science, and language arts instruction. (Author/ASK)
Public Outreach at RAL: Engaging the Next Generation of Scientists and Engineers
NASA Astrophysics Data System (ADS)
Corbett, G.; Ryall, G.; Palmer, S.; Collier, I. P.; Adams, J.; Appleyard, R.
2015-12-01
The Rutherford Appleton Laboratory (RAL) is part of the UK's Science and Technology Facilities Council (STFC). As part of the Royal Charter that established the STFC, the organisation is required to generate public awareness and encourage public engagement and dialogue in relation to the science undertaken. The staff at RAL firmly support this activity as it is important to encourage the next generation of students to consider studying Science, Technology, Engineering, and Mathematics (STEM) subjects, providing the UK with a highly skilled workforce in the future. To this end, the STFC undertakes a variety of outreach activities. This paper will describe the outreach activities undertaken by RAL, particularly focussing on those of the Scientific Computing Department (SCD). These activities include: an Arduino-based activity day for 12-14 year-olds to celebrate Ada Lovelace Day; running a centre as part of the Young Rewired State, encouraging 11-18 year-olds to create web applications with open data; sponsoring a team in the Engineering Education Scheme, supporting a small team of 16-17 year-olds to solve a real-world engineering problem; as well as the more traditional tours of facilities. These activities could serve as an example for other sites involved in scientific computing around the globe.
Academic computer science and gender: A naturalistic study investigating the causes of attrition
NASA Astrophysics Data System (ADS)
Declue, Timothy Hall
Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before they enter college, but it is at the college level that the "brain drain" is most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was found that strengthens theories related to prior experience and the perception that computer science has a culture that is hostile to females. Two unanticipated themes emerged, relating to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males that adversely affects their ability to succeed in CS-I.
NASA Astrophysics Data System (ADS)
Gossard, Paula Rae
Authors of recent science reform documents promote the goal of scientific literacy for all Americans (American Association for the Advancement of Science, 1989, 1993). Some students, however, feel apprehensive about learning science due to perceptions that science is antagonistic to their world views (Alters, 2005; Esbenshade, 1993). This study investigated the effect of an introductory science course taught in the context of a Christian, theistic world view on the scientific compatibility of religious college students' world views. For the purposes of this study, students' understanding of the nature of science, affective attitudes toward science, and beliefs regarding creation were used as indicators of the scientific compatibility of their world views. One hundred and seventy-one students enrolled in a core curriculum, introductory science course at a Christian university participated in this study by completing pre-instruction and post-instruction survey packets that included demographic information, the Student Understanding of Science and Scientific Inquiry questionnaire (Liang et al., 2006), the Affective Attitude toward Science Scale (Francis & Greer, 1999), and the Origins Survey (Tenneson & Badger, personal communication, June, 2008). Two-tailed paired samples t tests were used to test for significant mean differences in the indicator variables at a .05 level before and after instruction. Pearson correlation coefficients were calculated to determine if relationships were present among the indicator variables at a .05 level before and after instruction. Students' self-identified positions regarding creation were analyzed using a chi-square contingency table. Results indicated that there were statistically significant changes in all indicator variables after instruction of the contextualized course. The direction of these changes and shifts in students' self-identified positions regarding creation supported the conclusion that students developed a more scientifically compatible world view after contextualized instruction, based on the indicators used in this study. Weak positive correlations were found between nature of science understanding and young earth creation before and after instruction; weak negative correlations were found between nature of science understanding and old earth creation and evolutionary creation before, but not after, instruction. Conclusions, implications for practice, and recommendations for future research are included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Ann E; Bland, Arthur S Buddy; Hack, James J
Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaborating with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials, to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are described in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) delineates the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and, where appropriate, changes in Center metrics were introduced. This report covers CY 2010 and CY 2011 Year to Date (YTD), which, unless otherwise specified, denotes January 1, 2011 through June 30, 2011. User Support remains an important element of OLCF operations, with the philosophy of doing 'whatever it takes' to enable successful research. The impact of this center-wide activity is reflected in the user survey results, which show that users are 'very satisfied.' The OLCF continues to aggressively pursue outreach and training activities to promote awareness - and effective use - of U.S. leadership-class resources (Reference Section 2). The OLCF continues to meet and in many cases exceed DOE metrics for capability usage (35% target in CY 2010, delivered 39%; 40% target in CY 2011, 54% January 1, 2011 through June 30, 2011). The Schedule Availability (SA) and Overall Availability (OA) targets for Jaguar were exceeded in CY 2010. Given the solution to the VRM problem, the SA and OA for Jaguar in CY 2011 are expected to exceed the target metrics of 95% and 90%, respectively (Reference Section 3). Numerous and wide-ranging research accomplishments, scientific support, and technological innovations are more fully described in Sections 4 and 6 and reflect OLCF leadership in enabling high-impact science solutions and vision in creating an exascale-ready center. Financial Management (Section 5) and Risk Management (Section 7) are carried out using best practices approved by DOE. The OLCF has a valid cyber security plan and Authority to Operate (Section 8). The proposed metrics for 2012 are reflected in Section 9.
Building an Integrated Environment for Multimedia
NASA Technical Reports Server (NTRS)
1997-01-01
Multimedia courseware on the solar system and earth science suitable for use in elementary, middle, and high schools was developed under this grant. The courseware runs on Silicon Graphics, Incorporated (SGI) workstations and personal computers (PCs). There is also a version of the courseware accessible via the World Wide Web. Accompanying multimedia database systems were also developed to enhance the multimedia courseware. The database systems accompanying the PC software are based on the relational model, while the database systems accompanying the SGI software are based on the object-oriented model.
Multimedia Information Retrieval Literature Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, Pak C.; Bohn, Shawn J.; Payne, Deborah A.
This survey paper highlights some of the recent, influential work in multimedia information retrieval (MIR). MIR is a branch area of multimedia (MM). The young and fast-growing area has received strong industrial and academic support in the United States and around the world (see Section 7 for a list of major conferences and journals of the community). The term "information retrieval" may be misleading to those with different computer science or information technology backgrounds. As shown in our discussion later, it indeed includes topics from user interaction, data analytics, machine learning, feature extraction, information visualization, and more.
Computer-Game Construction: A Gender-Neutral Attractor to Computing Science
ERIC Educational Resources Information Center
Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan
2010-01-01
Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…
How to build better memory training games
Deveau, Jenni; Jaeggi, Susanne M.; Zordan, Victor; Phung, Calvin; Seitz, Aaron R.
2015-01-01
Can we create engaging training programs that improve working memory (WM) skills? While numerous procedures attempt to do so, there is a great deal of controversy regarding their efficacy. Nonetheless, recent meta-analytic evidence shows consistent improvements across studies on lab-based tasks generalizing beyond the specific training effects (Au et al., 2014; Karbach and Verhaeghen, 2014); however, there is little research into how WM training aids participants in their daily life. Here we propose that incorporating design principles from the fields of Perceptual Learning (PL) and Computer Science might augment the efficacy of WM training, and ultimately lead to greater learning and transfer. In particular, the field of PL has identified numerous mechanisms (including attention, reinforcement, multisensory facilitation and multi-stimulus training) that promote brain plasticity. Also, computer science has made great progress in the scientific approach to game design, which can be used to create engaging environments for learning. We suggest that approaches integrating knowledge across these fields may lead to more effective WM interventions that better reflect real-world conditions. PMID:25620916
Cognitive biases, linguistic universals, and constraint-based grammar learning.
Culbertson, Jennifer; Smolensky, Paul; Wilson, Colin
2013-07-01
According to classical arguments, language learning is both facilitated and constrained by cognitive biases. These biases are reflected in linguistic typology (the distribution of linguistic patterns across the world's languages) and can be probed with artificial grammar experiments on child and adult learners. Beginning with a widely successful approach to typology (Optimality Theory), and adapting techniques from computational approaches to statistical learning, we develop a Bayesian model of cognitive biases and show that it accounts for the detailed pattern of results of artificial grammar experiments on noun-phrase word order (Culbertson, Smolensky, & Legendre, 2012). Our proposal has several novel properties that distinguish it from prior work in the domains of linguistic theory, computational cognitive science, and machine learning. This study illustrates how ideas from these domains can be synthesized into a model of language learning in which biases range in strength from hard (absolute) to soft (statistical), and in which language-specific and domain-general biases combine to account for data from the macro-level scale of typological distribution to the micro-level scale of learning by individuals. Copyright © 2013 Cognitive Science Society, Inc.
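A minimal sketch of the constraint-based Bayesian idea, not the authors' actual model, is given below: a soft prior bias over four noun-phrase word-order patterns is combined with made-up phrase counts to yield a posterior. A hard (absolute) bias would correspond to zero prior mass on some patterns; a soft (statistical) bias, as here, merely tilts the posterior.

```python
# Toy Bayesian learner for noun-phrase word-order patterns with a soft bias.
import numpy as np

patterns = ["Adj-N & Num-N", "N-Adj & N-Num", "N-Adj & Num-N", "Adj-N & N-Num"]
prior = np.array([0.35, 0.35, 0.20, 0.10])   # soft bias favouring harmonic orders

# Made-up learning data: counts of observed phrases classified by pattern.
counts = np.array([12, 1, 1, 1])
noise = 0.1                                  # assumed off-pattern production rate

def likelihood(h):
    """Probability of the observed counts if pattern h is the target grammar."""
    probs = np.full(4, noise / 3)            # off-pattern phrases split the noise mass
    probs[h] = 1.0 - noise                   # most phrases follow the target pattern
    return np.prod(probs ** counts)

posterior = np.array([prior[h] * likelihood(h) for h in range(4)])
posterior /= posterior.sum()

for name, p in zip(patterns, posterior):
    print(f"P({name} | data) = {p:.3f}")
```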
National Institute of Environmental Health Sciences Kids' Pages
Women Working in Engineering and Science
NASA Technical Reports Server (NTRS)
Luna, Bernadette; Kliss, Mark (Technical Monitor)
1998-01-01
The presentation will focus on topics of interest to young women pursuing an engineering or scientific career, such as intrinsic personality traits of most engineers, average salaries for the various types of engineers, appropriate preparation classes at the high school and undergraduate levels, gaining experience through internships, summer jobs and graduate school, skills necessary but not always included in engineering curricula (i.e., multimedia, computer skills, communication skills), the work environment, balancing family and career, and sexual harassment. Specific examples from the speaker's own experience in NASA's Space Life Sciences Program will be used to illustrate the above topics. In particular, projects from Extravehicular Activity and Protective Systems research and Regenerative Life Support research will be used as examples of real world problem-solving to enable human exploration of the solar system.
Workshop on Incomplete Network Data Held at Sandia National Labs – Livermore
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soundarajan, Sucheta; Wendt, Jeremy D.
2016-06-01
While network analysis is applied in a broad variety of scientific fields (including physics, computer science, biology, and the social sciences), how networks are constructed, and the resulting bias and incompleteness, have drawn more limited attention. For example, in biology, gene networks are typically developed via experiment; many actual interactions are likely yet to be discovered. In addition to this incompleteness, the data-collection processes can introduce significant bias into the observed network datasets. For instance, if you observe part of the World Wide Web network through a classic random walk, then high-degree nodes are more likely to be found than if you had selected nodes at random. Unfortunately, such incomplete and biased data-collection methods must often be used.
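The degree bias described above can be demonstrated with a short sketch (not part of the workshop material) that compares a random-walk sample with a uniform sample on a synthetic scale-free graph; the graph size and walk length are arbitrary.

```python
# Compare mean degree of random-walk-sampled nodes vs. uniformly sampled nodes.
import random
import networkx as nx

random.seed(1)
G = nx.barabasi_albert_graph(10_000, 3)   # heavy-tailed degree distribution

def random_walk_sample(G, steps=5_000):
    """Collect the nodes visited by a simple random walk of the given length."""
    node = random.choice(list(G.nodes))
    visited = []
    for _ in range(steps):
        node = random.choice(list(G.neighbors(node)))
        visited.append(node)
    return visited

walk_nodes = random_walk_sample(G)
uniform_nodes = random.sample(list(G.nodes), len(set(walk_nodes)))

def mean_degree(nodes):
    return sum(G.degree(n) for n in nodes) / len(nodes)

print("mean degree, random-walk sample:", round(mean_degree(walk_nodes), 1))
print("mean degree, uniform sample:   ", round(mean_degree(uniform_nodes), 1))
```

On graphs like this, the random-walk sample's mean degree comes out well above the uniform sample's, which is exactly the kind of collection bias an analyst has to correct for, or at least acknowledge, before drawing conclusions from an observed network.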
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter
2012-01-01
The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…
Timpka, T
2001-08-01
In an analysis departing from the global health situation, the foundation for a change of paradigm in health informatics based on socially embedded information infrastructures and technologies is identified and discussed. It is shown how increasing computing and data-transmission capacity can be employed for proactive health computing. As a foundation for ubiquitous health promotion and prevention of disease and injury, proactive health systems use data from multiple sources to supply individuals and communities with evidence-based information on means to improve their state of health and avoid health risks. The systems are characterised by: (1) being profusely connected to the world around them, using perceptual interfaces, sensors and actuators; (2) responding to external stimuli at faster than human speeds; (3) networked feedback loops; and (4) humans remaining in control, while being left outside the primary computing loop. The extended scientific mission of this new partnership between computer science, electrical engineering and social medicine is suggested to be the investigation of how the dissemination of information and communication technology on democratic grounds can be made even more important for global health than sanitation and urban planning became a century ago.
Molecular computational elements encode large populations of small objects
NASA Astrophysics Data System (ADS)
Prasanna de Silva, A.; James, Mark R.; McKinney, Bernadine O. F.; Pears, David A.; Weir, Sheenagh M.
2006-10-01
Since the introduction of molecular computation, experimental molecular computational elements have grown to encompass small-scale integration, arithmetic and games, among others. However, the need for a practical application has been pressing. Here we present molecular computational identification (MCID), a demonstration that molecular logic and computation can be applied to a widely relevant issue. Examples of populations that need encoding in the microscopic world are cells in diagnostics or beads in combinatorial chemistry (tags). Taking advantage of the small size (about 1 nm) and large 'on/off' output ratios of molecular logic gates, and using the great variety of logic types, input chemical combinations, switching thresholds and even gate arrays in addition to colours, we produce unique identifiers for members of populations of small polymer beads (about 100 μm) used for the synthesis of combinatorial libraries. Many millions of distinguishable tags become available. This method should be extensible to far smaller objects, with the only requirement being a 'wash and watch' protocol. Our focus on converting molecular science into technology, which previously concerned analog sensors, turns to digital logic devices in the present work.
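As a purely illustrative back-of-the-envelope count (the factors and numbers below are assumptions, not figures from the paper), the number of distinguishable identifiers grows multiplicatively with the independent design choices available per bead,
\[
N_{\mathrm{tags}} \;=\; n_{\mathrm{logic}} \times n_{\mathrm{inputs}} \times n_{\mathrm{thresholds}} \times n_{\mathrm{colours}},
\qquad \text{e.g. } 10 \times 8 \times 5 \times 6 = 2400,
\]
and combining \(k\) such gates per bead in an array pushes this toward \(2400^{k}\), which already exceeds several million for \(k = 2\); this is the sense in which "many millions of distinguishable tags become available."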
Computationally driven drug discovery meeting-3 - Verona (Italy): 4 - 6th of March 2014.
Costantino, Gabriele
2014-12-01
The following article reports on the outcome of a meeting organised at the Aptuit Auditorium in Verona (Italy), which highlighted current applications of state-of-the-art computational science to drug design in Italy. The meeting, which had more than 100 people in attendance, consisted of over 40 presentations and included keynote lectures given by world-renowned speakers. The topics covered areas related to ligand- and structure-based drug design, library design and screening, and chemometrics. The meeting also stressed the importance of public-private collaboration and reviewed the different approaches to computationally driven drug discovery taken within academia and industry. The meeting helped define the current position of state-of-the-art computational drug discovery in Italy, pointing out criticalities and assets. A focused meeting of this kind is important because it gives a restricted yet representative community of professionals the opportunity to discuss current methodological approaches in depth and to set out future perspectives for computationally driven drug discovery.
Climate Science's Globally Distributed Infrastructure
NASA Astrophysics Data System (ADS)
Williams, D. N.
2016-12-01
The Earth System Grid Federation (ESGF) is primarily funded by the Department of Energy's (DOE's) Office of Science (the Office of Biological and Environmental Research [BER] Climate Data Informatics Program and the Office of Advanced Scientific Computing Research Next Generation Network for Science Program), the National Oceanic and Atmospheric Administration (NOAA), the National Aeronautics and Space Administration (NASA), the National Science Foundation (NSF), the European Infrastructure for the European Network for Earth System Modeling (IS-ENES), and the Australian National University (ANU). Support also comes from other U.S. federal and international agencies. The federation works across multiple worldwide data centers and spans seven international network organizations to provide users with the ability to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. Its architecture employs a series of geographically distributed peer nodes that are independently administered and united by common federation protocols and application programming interfaces (APIs). The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP; output used by the Intergovernmental Panel on Climate Change assessment reports), multiple model intercomparison projects (MIPs; endorsed by the World Climate Research Programme [WCRP]), and the Accelerated Climate Modeling for Energy project (ACME; ESGF is included in the overarching ACME workflow process to store model output). ESGF is a successful example of the integration of disparate open-source technologies into a cohesive functional system that serves the needs of the global climate science community. Data served by ESGF include not only model output but also observational data from satellites and instruments, reanalysis products, and generated images.
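A hedged sketch of querying the federated search interface described above is given below; it follows the publicly documented ESGF Search RESTful API, but the node hostname, facet names, and parameter values are examples only and should be checked against a node's own documentation before use.

```python
# Example (assumed endpoint and facets): ask one ESGF peer node for a handful
# of CMIP5 near-surface air temperature datasets across the whole federation.
import requests

ESGF_SEARCH = "https://esgf-node.llnl.gov/esg-search/search"  # example peer node

params = {
    "project": "CMIP5",                  # example facet: model intercomparison project
    "variable": "tas",                   # near-surface air temperature
    "latest": "true",                    # only the latest dataset versions
    "distrib": "true",                   # federated search across all peer nodes
    "limit": 5,
    "format": "application/solr+json",   # machine-readable JSON response
}

resp = requests.get(ESGF_SEARCH, params=params, timeout=30)
resp.raise_for_status()
for doc in resp.json()["response"]["docs"]:
    print(doc.get("id"), "-", doc.get("title"))
```

Because every peer node exposes the same search API, the same request can be pointed at a different node, which is the practical payoff of the "common federation protocols and APIs" mentioned above.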
NASA Astrophysics Data System (ADS)
de Groot, R. M.; Benthien, M. L.
2006-12-01
The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. These types of visualizations are becoming pervasive in the teaching and learning of concepts related to earth science. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). Earthquakes are ideal candidates for visualization products: they cannot be predicted, are completed in a matter of seconds, occur deep in the earth, and the time between events can be on a geologic time scale. For example, the southern part of the San Andreas fault has not seen a major earthquake since about 1690, setting the stage for an earthquake as large as magnitude 7.7 -- the "big one." Since no one has experienced such an earthquake, visualizations can help people understand the scale of such an event. Accordingly, SCEC has developed a revolutionary simulation of this earthquake, with breathtaking visualizations that are now being distributed. According to Gordin and Pea (1995), visualization should, in theory, make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.
WorldWideScience.org: the global science gateway.
Fitzpatrick, Roberta Bronson
2009-10-01
WorldWideScience.org is a Web-based global gateway connecting users to both national and international scientific databases and portals. This column will provide background information on the resource as well as introduce basic searching practices for users.
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
NASA Astrophysics Data System (ADS)
Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro
2012-06-01
ACAT 2011. This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. Fourteen invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA, and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for their enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch. Dr Liliana Teodorescu, Brunel University, ACAT group. The PDF also contains details of the workshop's committees and sponsors.
A Financial Technology Entrepreneurship Program for Computer Science Students
ERIC Educational Resources Information Center
Lawler, James P.; Joseph, Anthony
2011-01-01
Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…
Science Professionals: Master's Education for a Competitive World
ERIC Educational Resources Information Center
National Academies Press, 2008
2008-01-01
What are employer needs for staff trained in the natural sciences at the master's degree level? How do master's level professionals in the natural sciences contribute in the workplace? How do master's programs meet or support educational and career goals? "Science Professionals: Master's Education for a Competitive World" examines the answers to…
Use of Future Scenarios as a Pedagogical Approach for Science Teacher Education
ERIC Educational Resources Information Center
Paige, Kathryn; Lloyd, David
2016-01-01
Futures studies is usually a transdisciplinary study and as such embraces the physical world of the sciences and system sciences and the subjective world of individuals and cultures, as well as the time dimension--past, present and futures. Science education, where student interests, opportunities and challenges often manifest themselves, can…
Reflections on Science Fiction in Light of Today's Global Concerns.
ERIC Educational Resources Information Center
Aiex, Patrick K.
Science fiction is a literary genre that can be used in humanities courses to discuss ideas, attitudes, ethics, morality, and the effects of science and technology on the world's population. One of the best examples of a "classic" science fiction novel which can provoke class discussion is Aldous Huxley's "Brave New World,"…
Grossi, Enzo
2007-01-01
The author describes a refiguration of medical thought that originates from nonlinear dynamics and chaos theory. The coupling of computer science and these new theoretical bases coming from complex systems mathematics allows the creation of "intelligent" agents capable of adapting themselves dynamically to problems of high complexity: the artificial neural networks (ANNs). ANNs are able to reproduce the dynamic interaction of multiple factors simultaneously, allowing the study of complexity; they can also draw conclusions on an individual basis and not as average trends. These tools can allow a more efficient technology transfer from the science of medicine to the real world, overcoming many obstacles responsible for the present translational failure. They also contribute to a new holistic vision of the human subject person, contrasting the statistical reductionism that tends to squeeze or even delete the single subject, sacrificing him to his group of belongingness. A remarkable contribution to this individual approach comes from fuzzy logic, according to which there are no sharp limits between opposite things, such as health and disease. This approach allows one to partially escape from the probability theory trap in situations where it is fundamental to express a judgement based on a single case and favor a novel humanism directed to the management of the patient as an individual subject person.
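To make the fuzzy-logic idea above concrete, the following is a toy Python sketch (not drawn from the article): a clinical marker is mapped to a graded degree of membership in the "disease" set rather than to a sharp healthy/ill cut-off. The marker and its threshold values are invented purely for illustration.

# Toy sketch of graded (fuzzy) membership instead of a sharp diagnostic cut-off.
# Thresholds `low` and `high` are hypothetical values chosen for illustration.
def disease_membership(marker, low=5.0, high=9.0):
    """Piecewise-linear membership: 0 below `low`, 1 above `high`, graded in between."""
    if marker <= low:
        return 0.0
    if marker >= high:
        return 1.0
    return (marker - low) / (high - low)

for value in (4.0, 6.0, 7.5, 10.0):
    print(f"marker={value:4.1f} -> degree of 'disease' = {disease_membership(value):.2f}")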
NASA Astrophysics Data System (ADS)
Harteveld, Casper
Designing a game with a serious purpose involves considering the worlds of Reality and Meaning, yet it is undeniably impossible to create a game without a third world, one that is specifically concerned with what makes a game a game: the play elements. This third world, the world of people like designers and artists, and of disciplines such as computer science and game design, I call the world of Play, and this level is devoted to it. The level starts off with some of the misperceptions people have of play. Contrary to what some may think, we play all the time, even when we grow old—this was also very noticeable in designing the game Levee Patroller, as the team exhibited very playful behavior on many occasions. From there, I go into the aspects that characterize this world. The first concerns the goal of the game. This relates to the objectives people have to achieve within the game. This is constituted by the second aspect: the gameplay. Taking actions and facing challenges is subsequently constituted by a gameworld, which concerns the third aspect. And none of it is possible without the fourth and final aspect, the type of technology that creates and facilitates the game. The four aspects together make up a “game concept”, and from this world such a concept can be judged on the basis of three closely interrelated criteria: engagement, immersion, and fun.
ERIC Educational Resources Information Center
Menekse, Muhsin
2015-01-01
While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…
NASA Astrophysics Data System (ADS)
Woodcock, R.
2013-12-01
Australia's AuScope provides world-class research infrastructure as a framework for understanding the structure and evolution of the Australian continent. Since its conception in 2005, Data Scientists have led the Grid and Interoperability component of AuScope. The AuScope Grid is responsible for the effective management, curation, preservation and analysis of earth science data across the many organisations collaborating in AuScope. During this journey much was learned about technology and architectures, but even more about organisations and people, and the role of Data Scientists in the science ecosystem. With the AuScope Grid now in operation, and the resulting techniques and technologies now underpinning Australian Government initiatives in solid earth and environmental information, it is beneficial to reflect upon the journey and observe what has been learned in order to make data science routine. The role of the Data Scientist is a hybrid one, of not quite belonging and yet being highly valued: having the skills to support domain scientists with data and computational needs and to communicate across domains, yet not quite able to do the domain science itself. A bridge between two worlds, there is tremendous satisfaction from a job well done, but paradoxically it is also best when it is unnoticeable. In the years since AuScope started much has changed for the Data Scientist. Initially misunderstood, Data Scientists are now a recognisable part of the science landscape in Australia. Whilst the rewards and incentives are still catching up, there is a wealth of knowledge on the technical and soft skills required and recognition of the need for Data Scientists. These lessons will be shared from the AuScope journey so other pilgrims may progress well.
mORCA: ubiquitous access to life science web services.
Diaz-Del-Pino, Sergio; Trelles, Oswaldo; Falgueras, Juan
2018-01-16
Technical advances in mobile devices such as smartphones and tablets have produced an extraordinary increase in their use around the world and have become part of our daily lives. The possibility of carrying these devices in a pocket, particularly mobile phones, has enabled ubiquitous access to Internet resources. Furthermore, in the life sciences world there has been a vast proliferation of data types and services that finish as Web Services. This suggests the need for research into mobile clients to deal with life sciences applications for effective usage and exploitation. Analysing the current features in existing bioinformatics applications managing Web Services, we have devised, implemented, and deployed an easy-to-use web-based lightweight mobile client. This client is able to browse, select, compose parameters, invoke, and monitor the execution of Web Services stored in catalogues or central repositories. The client is also able to deal with huge amounts of data between external storage mounts. In addition, we also present a validation use case, which illustrates the usage of the application while executing, monitoring, and exploring the results of a registered workflow. The software is available in the Apple Store and Android Market, and the source code is publicly available on GitHub. Mobile devices are becoming increasingly important in the scientific world due to their strong potential impact on scientific applications. Bioinformatics should not fall behind this trend. We present an original software client that deals with the intrinsic limitations of such devices and propose different guidelines to provide location-independent access to computational resources in bioinformatics and biomedicine. Its modular design makes it easily expandable with the inclusion of new repositories, tools, types of visualization, etc.
NASA Astrophysics Data System (ADS)
Paris, Elizabeth
The ``November Revolution'' of 1974 and the experiments that followed consolidated the place of the Standard Model in modern particle physics. Much of the evidence on which these conclusions depended was generated by a new type of tool: colliding beam storage rings, which had been considered physically unfeasible twenty years earlier. In 1956 a young experimentalist named Gerry O'Neill dedicated himself to demonstrating that such an apparatus could do useful physics. The storage ring movement encountered numerous obstacles before generating one of the standard machines for high energy research. In fact, it wasn't until 1970 that the U.S. finally broke ground on its first electron-positron collider. Drawing extensively on archival sources and supplementing them with the personal accounts of many of the individuals who took part, Ringing in the New Physics examines this instance of post-World War II techno-science and the new social, political and scientific tensions that characterize it. The motivations are twofold: first, that the chronicle of storage rings may take its place beside mathematical group theory, computer simulations, magnetic spark chambers, and the like as an important contributor to a view of matter and energy which has been the dominant model for the last twenty-five years. In addition, the account provides a case study for the integration of the personal, professional, institutional, and material worlds when examining an episode in the history or sociology of twentieth century science. The story behind the technological development of storage rings holds fascinating insights into the relationship between theory and experiment, collaboration and competition in the physics community, the way scientists obtain funding and their responsibilities to it, and the very nature of what constitutes ``successful'' science in the post- World War II era.
Careers in Data Science: A Berkeley Perspective
NASA Astrophysics Data System (ADS)
Koy, K.
2015-12-01
Last year, I took on an amazing opportunity to serve as the Executive Director of the new Berkeley Institute for Data Science (BIDS). After a 15-year career working with geospatial data to advance our understanding of the environment, I have been presented with a unique opportunity through BIDS to work with talented researchers from a wide variety of backgrounds. Founded in 2013, BIDS is a central hub of research and education at UC Berkeley designed to facilitate and nurture data-intensive science. We are building a community centered on a cohort of talented data science fellows and senior fellows who are representative of the world-class researchers from across our campus and are leading the data science revolution within their disciplines. Our initiatives are designed to bring together broad constituents of the data science community, including domain experts from the life, social, and physical sciences and methodological experts from computer science, statistics, and applied mathematics. While many of these individuals rarely cross professional paths, BIDS actively seeks new and creative ways to engage and foster collaboration across these different research fields. In this presentation, I will share my own story, along with some insights into how BIDS is supporting the careers of data scientists, including graduate students, postdocs, faculty, and research staff. I will also describe how these individuals we are helping support are working to address a number of data science-related challenges in scientific research.
Yang, Jack Y; Niemierko, Andrzej; Bajcsy, Ruzena; Xu, Dong; Athey, Brian D; Zhang, Aidong; Ersoy, Okan K; Li, Guo-Zheng; Borodovsky, Mark; Zhang, Joe C; Arabnia, Hamid R; Deng, Youping; Dunker, A Keith; Liu, Yunlong; Ghafoor, Arif
2010-12-01
Significant interest exists in establishing synergistic research in bioinformatics, systems biology and intelligent computing. Supported by the United States National Science Foundation (NSF), International Society of Intelligent Biological Medicine (http://www.ISIBM.org), International Journal of Computational Biology and Drug Design (IJCBDD) and International Journal of Functional Informatics and Personalized Medicine, the ISIBM International Joint Conferences on Bioinformatics, Systems Biology and Intelligent Computing (ISIBM IJCBS 2009) attracted more than 300 papers and 400 researchers and medical doctors world-wide. It was the only inter/multidisciplinary conference aimed to promote synergistic research and education in bioinformatics, systems biology and intelligent computing. The conference committee was very grateful for the valuable advice and suggestions from honorary chairs, steering committee members and scientific leaders including Dr. Michael S. Waterman (USC, Member of United States National Academy of Sciences), Dr. Chih-Ming Ho (UCLA, Member of United States National Academy of Engineering and Academician of Academia Sinica), Dr. Wing H. Wong (Stanford, Member of United States National Academy of Sciences), Dr. Ruzena Bajcsy (UC Berkeley, Member of United States National Academy of Engineering and Member of United States Institute of Medicine of the National Academies), Dr. Mary Qu Yang (United States National Institutes of Health and Oak Ridge, DOE), Dr. Andrzej Niemierko (Harvard), Dr. A. Keith Dunker (Indiana), Dr. Brian D. Athey (Michigan), Dr. Weida Tong (FDA, United States Department of Health and Human Services), Dr. Cathy H. Wu (Georgetown), Dr. Dong Xu (Missouri), Drs. Arif Ghafoor and Okan K Ersoy (Purdue), Dr. Mark Borodovsky (Georgia Tech, President of ISIBM), Dr. Hamid R. Arabnia (UGA, Vice-President of ISIBM), and other scientific leaders. The committee presented the 2009 ISIBM Outstanding Achievement Awards to Dr. Joydeep Ghosh (UT Austin), Dr. Aidong Zhang (Buffalo) and Dr. Zhi-Hua Zhou (Nanjing) for their significant contributions to the field of intelligent biological medicine.
The Place of Science in the Modern World: A Speech by Robert Millikan
NASA Astrophysics Data System (ADS)
Williams, Kathryn R.
2001-07-01
A speech by Robert Millikan, reprinted in the May 1930 issue, pertains to issues still prevalent in the 21st century. In "The Place of Science in the Modern World," the Nobel laureate defends science against charges of its detrimental effects on society, its materialistic intentions, and the destructive powers realized during the First World War. He also expresses concern that "this particular generation of Americans" may lack the moral qualities needed to make responsible use of the increased powers afforded by modern science.
ERIC Educational Resources Information Center
Ogunleye, Ayodele O.
2009-01-01
The current move toward "science for all" in all parts of the globe necessitates that consideration be given to how pupils move between their everyday life and the world of school science, how pupils deal with cognitive conflicts between those two worlds, and what this means for effective teaching of science. In recent times,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallarno, George; Rogers, James H; Maxwell, Don E
The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second-fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.
A Look at the Impact of High-End Computing Technologies on NASA Missions
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Dunbar, Jill; Hardman, John; Bailey, F. Ron; Wheeler, Lorien; Rogers, Stuart
2012-01-01
From its bold start nearly 30 years ago and continuing today, the NASA Advanced Supercomputing (NAS) facility at Ames Research Center has enabled remarkable breakthroughs in the space agency's science and engineering missions. Throughout this time, NAS experts have influenced the state-of-the-art in high-performance computing (HPC) and related technologies such as scientific visualization, system benchmarking, batch scheduling, and grid environments. We highlight the pioneering achievements and innovations originating from and made possible by NAS resources and know-how, from early supercomputing environment design and software development, to long-term simulation and analyses critical to design safe Space Shuttle operations and associated spinoff technologies, to the highly successful Kepler Mission's discovery of new planets now capturing the world's imagination.
Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?
ERIC Educational Resources Information Center
Schrock, John Richard
1984-01-01
Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" related to laboratory computer and "extended…
Atmosphere of Freedom: Sixty Years at the NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Bugos, Glenn E.; Launius, Roger (Technical Monitor)
2000-01-01
Throughout Ames history, four themes prevail: a commitment to hiring the best people; cutting-edge research tools; project management that gets things done faster, better and cheaper; and outstanding research efforts that serve the scientific professions and the nation. More than any other NASA Center, Ames remains shaped by its origins in the NACA (National Advisory Committee for Aeronautics). Not that its missions remain the same. Sure, Ames still houses the world's greatest collection of wind tunnels and simulation facilities, its aerodynamicists remain among the best in the world, and pilots and engineers still come for advice on how to build better aircraft. But that is increasingly part of Ames' past. Ames people have embraced two other missions for its future. First, intelligent systems and information science will help NASA use new tools in supercomputing, networking, telepresence and robotics. Second, astrobiology will explore the prospects for life on Earth and beyond. Both new missions leverage Ames' long-standing expertise in computation and in the life sciences, as well as its relations with the computing and biotechnology firms working in the Silicon Valley community that has sprung up around the Center. Rather than the NACA missions, it is the NACA culture that still permeates Ames. The Ames way of research management privileges the scientists and engineers working in the laboratories. They work in an atmosphere of freedom, laced with the expectation of integrity and responsibility. Ames researchers are free to define their research goals and define how they contribute to the national good. They are expected to keep their fingers on the pulse of their disciplines, to be ambitious yet frugal in organizing their efforts, and to always test their theories in the laboratory or in the field. Ames' leadership ranks, traditionally, are cultivated within this scientific community. Rather than manage and supervise these researchers, Ames leadership merely guides them, represents them to NASA headquarters and the world outside, then steps out of the way before they get run over.
NASA Astrophysics Data System (ADS)
Erickson, T. A.; Granger, B.; Grout, J.; Corlay, S.
2017-12-01
The volume of Earth science data gathered from satellites, aircraft, drones, and field instruments continues to increase. For many scientific questions in the Earth sciences, managing this large volume of data is a barrier to progress, as it is difficult to explore and analyze large volumes of data using the traditional paradigm of downloading datasets to a local computer for analysis. Furthermore, methods for communicating Earth science algorithms that operate on large datasets in an easily understandable and reproducible way are needed. Here we describe a system for developing, interacting, and sharing well-documented Earth Science algorithms that combines existing software components: Jupyter Notebook: An open-source, web-based environment that supports documents that combine code and computational results with text narrative, mathematics, images, and other media. These notebooks provide an environment for interactive exploration of data and development of well documented algorithms. Jupyter Widgets / ipyleaflet: An architecture for creating interactive user interface controls (such as sliders, text boxes, etc.) in Jupyter Notebooks that communicate with Python code. This architecture includes a default set of UI controls (sliders, dropboxes, etc.) as well as APIs for building custom UI controls. The ipyleaflet project is one example that offers a custom interactive map control that allows a user to display and manipulate geographic data within the Jupyter Notebook. Google Earth Engine: A cloud-based geospatial analysis platform that provides access to petabytes of Earth science data via a Python API. The combination of Jupyter Notebooks, Jupyter Widgets, ipyleaflet, and Google Earth Engine makes it possible to explore and analyze massive Earth science datasets via a web browser, in an environment suitable for interactive exploration, teaching, and sharing. Using these environments can make Earth science analyses easier to understand and reproducible, which may increase the rate of scientific discoveries and the transition of discoveries into real-world impacts.
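As an illustration of the notebook pattern described in this abstract, the following is a minimal Python sketch combining the three components. It assumes the earthengine-api, ipyleaflet, and ipywidgets packages and an authenticated Earth Engine account; the asset ID, styling, and the tile-URL attribute (which varies between earthengine-api client versions) are illustrative rather than prescribed by the authors.

# Minimal sketch: an Earth Engine image rendered as map tiles on an interactive
# ipyleaflet map inside a Jupyter Notebook, with a Jupyter widget slider linked
# to the map. Computation stays on Earth Engine servers; only tiles reach the browser.
import ee
import ipywidgets as widgets
from IPython.display import display
from ipyleaflet import Map, TileLayer

ee.Initialize()  # uses locally stored Earth Engine credentials

dem = ee.Image('USGS/SRTMGL1_003')              # public SRTM elevation asset (illustrative choice)
mapid = dem.getMapId({'min': 0, 'max': 3000})   # tiles are rendered server-side by Earth Engine

m = Map(center=(37.5, -119.5), zoom=6)
m.add_layer(TileLayer(url=mapid['tile_fetcher'].url_format, name='SRTM elevation'))

# A widget slider linked browser-side to the map zoom keeps the interaction
# responsive even though the underlying data never leaves the cloud.
zoom = widgets.IntSlider(description='zoom', min=1, max=12, value=6)
widgets.jslink((zoom, 'value'), (m, 'zoom'))
display(zoom, m)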
Macmillan Encyclopedia of Chemistry (edited by Joseph J. Lagowski)
NASA Astrophysics Data System (ADS)
Kauffman, George B.
1998-11-01
Macmillan: New York, 1997. Four volumes. Figs., tables. lxxi + 1696 pp. 22.0 x 28.5 cm. $400. ISBN 0-02-897225-2. This latest addition to Macmillan's series of comprehensive core science encyclopedias (previous sets dealt with physics and earth sciences) will be of particular interest to readers of this Journal, for it is edited by longtime Journal of Chemical Education editor Joe Lagowski, assisted by a board of five distinguished associate editors. The attractively priced set offers clear explanations of the phenomena and concepts of chemistry and its materials, whether found in industry, the laboratory, or the natural world. It is intended for a broad spectrum of readers-professionals whose work draws on chemical concepts and knowledge (e.g., material scientists, engineers, health workers, biotechnologists, mathematicians, and computer programmers), science teachers at all levels from kindergarten to high school, high school and college students interested in medicine or the sciences, college and university professors, and laypersons desiring information on practical aspects of chemistry (e.g., household cleaning products, food and food additives, manufactured materials, herbicides, the human body, sweeteners, and animal communication).
A brief history of the most remarkable numbers e, i and γ in mathematical sciences with applications
NASA Astrophysics Data System (ADS)
Debnath, Lokenath
2015-08-01
This paper deals with a brief history of the most remarkable Euler numbers e, i and γ in mathematical sciences. Included are many properties of the constants e, i and γ and their applications in algebra, geometry, physics, chemistry, ecology, business and industry. Special attention is given to the growth and decay phenomena in many real-world problems, including stability and instability of their solutions. Some specific and modern applications of logarithms, complex numbers and complex exponential functions to electrical circuits and mechanical systems are presented with examples. Included is the use of complex numbers and complex functions in the description and analysis of chaos and fractals with the aid of modern computer technology. In addition, the phasor method is described with examples of applications in engineering science. The major focus of this paper is to provide basic information through a historical approach to mathematics teaching and learning of the fundamental knowledge and skills required for students and teachers at all levels so that they can understand the concepts of mathematics, and mathematics education in science and technology.
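As a small worked illustration of the phasor method mentioned above (not an example taken from the paper), the steady-state current of a series RC circuit driven by a sinusoidal source can be computed with complex arithmetic, using Euler's formula e^{jθ} = cos θ + j sin θ. The component values below are arbitrary.

# Phasor analysis of a series RC circuit: represent the sinusoidal source as a
# complex phasor, divide by the complex impedance, then read off amplitude and phase.
import cmath
import math

V_peak = 10.0            # source amplitude, volts (phase reference)
f = 50.0                 # source frequency, hertz
R = 100.0                # resistance, ohms
C = 10e-6                # capacitance, farads

omega = 2 * math.pi * f
Z = R + 1 / (1j * omega * C)      # complex impedance of the series RC branch
I = V_peak / Z                    # current phasor

amplitude = abs(I)                             # peak current, amperes
phase_deg = math.degrees(cmath.phase(I))       # phase lead relative to the source
print(f"i(t) = {amplitude:.4f} * cos(omega*t + {phase_deg:.1f} deg)  [A]")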
Impact of mobility structure on optimization of small-world networks of mobile agents
NASA Astrophysics Data System (ADS)
Lee, Eun; Holme, Petter
2016-06-01
In ad hoc wireless networking, units are connected to each other rather than to a central, fixed infrastructure. Constructing and maintaining such networks create several trade-off problems between robustness, communication speed, power consumption, etc., that bridge engineering, computer science and the physics of complex systems. In this work, we address the role of the mobility patterns of the agents on the optimal tuning of a small-world type network construction method. By this method, the network is updated periodically and held static between the updates. We investigate the optimal updating times for different scenarios of the movement of agents (modeling, for example, the fat-tailed trip distances and periodicities of human travel). We find that these mobility patterns affect the power consumption in non-trivial ways and discuss how these effects can best be handled.
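The following Python sketch illustrates the general scheme described above; it is an assumed toy model, not the authors' exact construction. Agents drift in the unit square, and their communication network is rebuilt only every update_interval steps from nearest-neighbour links plus random long-range shortcuts, the usual small-world recipe. Requires numpy and networkx.

# Toy model: mobile agents with a periodically rebuilt small-world-style network.
import numpy as np
import networkx as nx

rng = np.random.default_rng(42)
n_agents, k_nearest, n_shortcuts = 50, 3, 10
update_interval, n_steps = 20, 100

pos = rng.uniform(0.0, 1.0, size=(n_agents, 2))   # agent positions in the unit square
G = nx.empty_graph(n_agents)

def rebuild_network(positions):
    """Local links to the k nearest neighbours plus random long-range shortcuts."""
    g = nx.Graph()
    g.add_nodes_from(range(len(positions)))
    dists = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    for i in range(len(positions)):
        for j in np.argsort(dists[i])[1:k_nearest + 1]:   # index 0 is the agent itself
            g.add_edge(i, int(j))
    for _ in range(n_shortcuts):
        u, v = rng.choice(len(positions), size=2, replace=False)
        g.add_edge(int(u), int(v))
    return g

for t in range(n_steps):
    pos = np.clip(pos + rng.normal(0.0, 0.02, size=pos.shape), 0.0, 1.0)  # agents drift
    if t % update_interval == 0:          # the topology is refreshed only periodically
        G = rebuild_network(pos)

# Between updates the topology is stale relative to the true positions, which is
# exactly the communication-cost vs. accuracy trade-off such studies explore.
print(nx.average_shortest_path_length(G) if nx.is_connected(G) else "graph not connected")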
Perspectives on Transportation. Teacher's Guide. Preparing for Tomorrow's World.
ERIC Educational Resources Information Center
Iozzi, Louis A.; And Others
"Perspectives on Transportation" is one of the "Preparing for Tomorrow's World" (PTW) program modules. PTW is an interdisciplinary, future-oriented program which incorporates information from the sciences and social sciences and addresses societal concerns which interface science/technology/society. The program promotes…
Dilemmas in Bioethics. Teacher's Guide. Preparing for Tomorrow's World.
ERIC Educational Resources Information Center
Iozzi, Louis A.
"Preparing for Tomorrow's World" is an interdisciplinary, future-oriented program which incorporates information from the sciences and social sciences and addresses societal concerns which interface science/technology/society. The program promotes responsible citizenry with increased abilities in critical thinking, problem-solving,…
ERIC Educational Resources Information Center
Maza, Paul; Miller, Allison; Carson, Brian; Hermanson, John
2018-01-01
Learning and retaining science content may be increased by applying the basic science material to real-world situations. Discussing cases with students during lectures and having them participate in laboratory exercises where they apply the science content to practical situations increases students' interest and enthusiasm. A summer course in…
NASA Astrophysics Data System (ADS)
Hill, C. N.; Schools, H.; Research Team Members
2012-12-01
This presentation will report on a classroom pilot study in which we teamed with school teachers in four middle school classes to develop and deploy course modules that connect the real world to virtual forms of laboratory experiments. The broad goal is to help students realize that seemingly complex Earth system processes can be connected to basic properties of the planet and that this can be illustrated through idealized experiment. Specifically, the presentation will describe virtual modules based on on-demand cloud computing technologies that allow students to test the notion that pole-to-equator gradients in radiative forcing, together with rotation, can explain characteristic patterns of flow in the atmosphere. The module developed aligns with new Massachusetts science standard requirements regarding understanding of weather and climate processes. These new standards emphasize an appreciation of differential solar heating and a qualitative understanding of the significance of rotation. In our preliminary classroom pilot studies we employed pre- and post-evaluation tests to establish that the modules had increased student knowledge of phenomenology and terms. We will describe the results of these tests as well as results from anecdotal measures of student response. This pilot study suggests that one way to help make Earth science concepts more tractable to a wider audience is through virtual experiments that distill phenomena down, but still retain enough detail that students can see the connection to the real world. Modern computer technology and developments in research models appear to provide an opportunity for more work in this area. We will describe some follow-up possibilities that we envisage.
Internet Voice Distribution System (IVoDS) Utilization in Remote Payload Operations
NASA Technical Reports Server (NTRS)
Best, Susan; Bradford, Bob; Chamberlain, Jim; Nichols, Kelvin; Bailey, Darrell (Technical Monitor)
2002-01-01
Due to limited crew availability to support science and the large number of experiments to be operated simultaneously, telescience is key to a successful International Space Station (ISS) science program. Crew, operations personnel at NASA centers, and researchers at universities and companies around the world must work closely together to perform scientific experiments on-board ISS. NASA has initiated use of Voice over Internet Protocol (VoIP) to supplement the existing HVoDS mission voice communications system used by researchers. The Internet Voice Distribution System (IVoDS) connects researchers to mission support "loops" or conferences via Internet Protocol networks such as the high-speed Internet 2. Researchers use IVoDS software on personal computers to talk with operations personnel at NASA centers. IVoDS also has the capability, if authorized, to allow researchers to communicate with the ISS crew during experiment operations. IVoDS was developed by Marshall Space Flight Center with contractors A2 Technology, Inc., FVC, Lockheed Martin, and VoIP Group. IVoDS is currently undergoing field-testing, with full deployment for up to 50 simultaneous users expected in 2002. Research is currently being performed to take full advantage of the digital world - the Personal Computer and Internet Protocol networks - to qualitatively enhance communications among ISS operations personnel. In addition to the current voice capability, video and data-sharing capabilities are being investigated. Major obstacles being addressed include network bandwidth capacity and strict security requirements. Techniques being investigated to reduce and overcome these obstacles include emerging audio-video protocols and network technology including multicast and quality-of-service.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April, 1986 through September 30, 1986 is summarized.
78 FR 10180 - Annual Computational Science Symposium; Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-13
...] Annual Computational Science Symposium; Conference AGENCY: Food and Drug Administration, HHS. ACTION... Computational Science Symposium.'' The purpose of the conference is to help the broader community align and share experiences to advance computational science. At the conference, which will bring together FDA...
World view analysis of knowledge in a rural village: Implications for science education
NASA Astrophysics Data System (ADS)
George, June
1999-01-01
This article describes an empirical qualitative analysis of some of the traditional practices and beliefs, with respect to health regimens and marine-related activities, which operate in the daily lives of people in the village of Seablast, Trinidad and Tobago. The purpose of the investigation was to gain an understanding of these practices and beliefs and the interpretive framework that underpins them, and to explore how these might impinge on the learning and teaching of school science in such a context. Kearney's world view theory was employed as the framework for the analysis. The investigation reveals that the traditional wisdom in Seablast is a pervasive system, consisting of several concepts and principles, some of which are similar to those of conventional science, whereas others differ significantly. There are also some similarities between the world view of the villagers and that of science. However, the procedures used by villagers to effect these tenets are often quite different from those employed in science. The article argues that science students and teachers who are exposed to the traditional wisdom and who have some level of commitment to it are likely to find that, to some extent, they are required to function in two worlds - the traditional world and the world of science. Current research suggests that the boundary crossing between these worlds may be difficult or even hazardous for some people. The recommendation is made that school science curricula for contexts such as Seablast must be fashioned from a cultural perspective, with an emphasis on providing aids for students to effect the boundary crossing successfully. This would put students in a better position to evaluate both their traditional practices and beliefs and conventional science so that they could make appropriate choices for the conduct of their lives. © John Wiley & Sons, Inc. Sci Ed 83:77-95, 1999.
NASA Astrophysics Data System (ADS)
Strayer, Michael
2009-07-01
Welcome to San Diego and the 2009 SciDAC conference. Over the next four days, I would like to present an assessment of the SciDAC program. We will look at where we've been, how we got to where we are and where we are going in the future. Our vision is to be first in computational science, to be best in class in modeling and simulation. When Ray Orbach asked me what I would do, in my job interview for the SciDAC Director position, I said we would achieve that vision. And with our collective dedicated efforts, we have managed to achieve this vision. In the last year, we have now the most powerful supercomputer for open science, Jaguar, the Cray XT system at the Oak Ridge Leadership Computing Facility (OLCF). We also have NERSC, probably the best-in-the-world program for productivity in science that the Office of Science so depends on. And the Argonne Leadership Computing Facility offers architectural diversity with its IBM Blue Gene/P system as a counterbalance to Oak Ridge. There is also ESnet, which is often understated—the 40 gigabit per second dual backbone ring that connects all the labs and many DOE sites. In the President's Recovery Act funding, there is exciting news that ESnet is going to build out to a 100 gigabit per second network using new optical technologies. This is very exciting news for simulations and large-scale scientific facilities. But as one noted SciDAC luminary said, it's not all about the computers—it's also about the science—and we are also achieving our vision in this area. Together with having the fastest supercomputer for science, at the SC08 conference, SciDAC researchers won two ACM Gordon Bell Prizes for the outstanding performance of their applications. The DCA++ code, which solves some very interesting problems in materials, achieved a sustained performance of 1.3 petaflops, an astounding result and a mark I suspect will last for some time. The LS3DF application for studying nanomaterials also required the development of a new and novel algorithm to produce results up to 400 times faster than a similar application, and was recognized with a prize for algorithm innovation—a remarkable achievement. Day one of our conference will include examples of petascale science enabled at the OLCF. Although Jaguar has not been officially commissioned, it has gone through its acceptance tests, and during its shakedown phase there have been pioneer applications used for the acceptance tests, and they are running at scale. These include applications in the areas of astrophysics, biology, chemistry, combustion, fusion, geosciences, materials science, nuclear energy and nuclear physics. We also have a whole compendium of science we do at our facilities; these have been documented and reviewed at our last SciDAC conference. Many of these were highlighted in our Breakthroughs Report. One session at this week's conference will feature a cross-section of these breakthroughs. In the area of scalable electromagnetic simulations, the Auxiliary-space Maxwell Solver (AMS) uses specialized finite element discretizations and multigrid-based techniques, which decompose the original problem into easier-to-solve subproblems. Congratulations to the mathematicians on this. Another application on the list of breakthroughs was the authentication of PETSc, which provides scalable solvers used in many DOE applications and has solved problems with over 3 billion unknowns and scaled to over 16,000 processors on DOE leadership-class computers. 
This is becoming a very versatile and useful toolkit to achieve performance at scale. With the announcement of SIAM's first class of Fellows, we are remarkably well represented. Of the group of 191, more than 40 of these Fellows are in the 'DOE space.' We are so delighted that SIAM has recognized them for their many achievements. In the coming months, we will illustrate our leadership in applied math and computer science by looking at our contributions in the areas of programming models, development and performance tools, math libraries, system software, collaboration, and visualization and data analytics. This is a large and diverse list of libraries. We have asked for two panels, one chaired by David Keyes and composed of many of the nation's leading mathematicians, to produce a report on the most significant accomplishments in applied mathematics over the last eight years, taking us back to the start of the SciDAC program. In addition, we have a similar panel in computer science to be chaired by Kathy Yelick. They are going to identify the computer science accomplishments of the past eight years. These accomplishments are difficult to get a handle on, and I'm looking forward to this report. We will also have a follow-on to our report on breakthroughs in computational science and this will also go back eight years, looking at the many accomplishments under the SciDAC and INCITE programs. This will be chaired by Tony Mezzacappa. So, where are we going in the SciDAC program? It might help to take a look at computational science and how it got started. I go back to Ken Wilson, who made the model and has written on computational science and computational science education. His model was thus: The computational scientist plays the role of the experimentalist, and the math and CS researchers play the role of theorists, and the computers themselves are the experimental apparatus. And that in simulation science, we are carrying out numerical experiments as to the nature of physical and biological sciences. Peter Lax, in the same time frame, developed a report on large-scale computing in science and engineering. Peter remarked, 'Perhaps the most important applications of scientific computing come not in the solution of old problems, but in the discovery of new phenomena through numerical experimentation.' And in the early years, I think the person who provided the most guidance, the most innovation and the most vision for where the future might lie was Ed Oliver. Ed Oliver died last year. Ed did a number of things in science. He had this personality where he knew exactly what to do, but he preferred to stay out of the limelight so that others could enjoy the fruits of his vision. We in the SciDAC program and ASCR Facilities are still enjoying the benefits of his vision. We will miss him. Twenty years after Ken Wilson, Ray Orbach laid out the fundamental premise for SciDAC in an interview that appeared in SciDAC Review: 'SciDAC is unique in the world. There isn't any other program like it anywhere else, and it has the remarkable ability to do science by bringing together physical scientists, mathematicians, applied mathematicians, and computer scientists who recognize that computation is not something you do at the end, but rather it needs to be built into the solution of the very problem that one is addressing. 
As you look at the Lax report from 1982, it talks about how 'Future significant improvements may have to come from architectures embodying parallel processing elements—perhaps several thousands of processors.' And it continues, 'Research in languages, algorithms and numerical analysis will be crucial in learning to exploit these new architectures fully.' In the early '90s, Sterling, Messina and Smith developed a workshop report on petascale computing and concluded, 'A petaflops computer system will be feasible in two decades, or less, and rely in part on the continual advancement of the semiconductor industry both in speed enhancement and cost reduction through improved fabrication processes.' So they were not wrong, and today we are embarking on a forward look that is at a different scale, the exascale, going to 10^18 flops. In 2007, Stevens, Simon and Zacharia chaired a series of town hall meetings looking at exascale computing, and in their report wrote, 'Exascale computer systems are expected to be technologically feasible within the next 15 years, or perhaps sooner. These systems will push the envelope in a number of important technologies: processor architecture, scale of multicore integration, power management and packaging.' The concept of computing on the Jaguar computer involves hundreds of thousands of cores, as do the IBM systems that are currently out there. So the scale of computing with systems with billions of processors is staggering to me, and I don't know how the software and math folks feel about it. We have now embarked on a road toward extreme scale computing. We have created a series of town hall meetings and we are now in the process of holding workshops that address what I call within the DOE speak 'the mission need,' or what is the scientific justification for computing at that scale. We are going to have a total of 13 workshops. The workshops on climate, high energy physics, nuclear physics, fusion, and nuclear energy have been held. The report from the workshop on climate is actually out and available, and the other reports are being completed. The upcoming workshops are on biology, materials, and chemistry; and workshops that engage science for nuclear security are a partnership between NNSA and ASCR. There are additional workshops on applied math, computer science, and architecture that are needed for computing at the exascale. These extreme scale workshops will provide the foundation in our office, the Office of Science, the NNSA and DOE, and we will engage the National Science Foundation and the Department of Defense as partners. We envision a 10-year program for an exascale initiative. It will be an integrated R&D program initially—you can think about five years for research and development—that would be in hardware, operating systems, file systems, networking and so on, as well as software for applications. Application software and the operating system and the hardware all need to be bundled in this period so that at the end the system will execute the science applications at scale. We also believe that this process will have to have considerable investment from the manufacturers and vendors to be successful. We have formed laboratory, university and industry working groups to start this process and formed a panel to look at where SciDAC needs to go to compute at the extreme scale, and we have formed an executive committee within the Office of Science and the NNSA to focus on these activities. We will have outreach to DoD in the next few months.
We are anticipating a solicitation within the next two years in which we will compete this bundled R&D process. We don't know how we will incorporate SciDAC into extreme scale computing, but we do know there will be many challenges. And as we have shown over the years, we have the expertise and determination to surmount these challenges.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hules, John
This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review of the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.
Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations
ERIC Educational Resources Information Center
Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa
2013-01-01
The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…
A Web of Resources for Introductory Computer Science.
ERIC Educational Resources Information Center
Rebelsky, Samuel A.
As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…
NASA Technical Reports Server (NTRS)
1988-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April l, 1988 through September 30, 1988.
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.
High school computer science education paves the way for higher education: the Israeli case
NASA Astrophysics Data System (ADS)
Armoni, Michal; Gal-Ezer, Judith
2014-07-01
The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to computer science in high school and pursuing computing in higher education. We also examine the gender gap, in the context of high school computer science education. We show that in Israel, students who took the high-level computer science matriculation exam were more likely to pursue computing in higher education. Regarding the issue of gender, we will show that, in general, in Israel the difference between males and females who take computer science in high school is relatively small, and a larger, though still not very large difference exists only for the highest exam level. In addition, exposing females to high-level computer science in high school has more relative impact on pursuing higher education in computing.
Technology and Changing Lifestyles. Teacher's Guide. Preparing for Tomorrow's World.
ERIC Educational Resources Information Center
Iozzi, Louis A.
"Technology and Changing Lifestyles" is one of the "Preparing for Tomorrow's World" (PTW) program modules. PTW is an interdisciplinary, future-oriented program incorporating information from the sciences and social sciences and addressing societal concerns which interface science/technology/society. The program promotes…
Future Scenarios in Communications. Teacher's Guide. Preparing for Tomorrow's World.
ERIC Educational Resources Information Center
Iozzi, Louis A.; And Others
"Future Scenarios in Communications" is one of the "Preparing for Tomorrow's World" (PTW) program modules. PTW is an interdisciplinary, future-oriented program incorporating information from the sciences and social sciences and addressing societal concerns which interface science/technology/society. The program promotes…
People and Environmental Changes. Teacher's Guide. Preparing for Tomorrow's World.
ERIC Educational Resources Information Center
Iozzi, Louis A.
"People and Environmental Changes" is one of the "Preparing for Tomorrow's World" (PTW) program modules. PTW is an interdisciplinary, future-oriented program which incorporates information from the sciences and social sciences and addresses societal concerns which interface science/technology/society. The program promotes…
Using Science and the Internet as Everyday Classroom Tools
NASA Technical Reports Server (NTRS)
Mandel, Eric
1999-01-01
The Everyday Classroom Tools project developed a K-6 inquiry-based curriculum to bring the tools of scientific inquiry, together with the Internet, into the elementary school classroom. Our curriculum encourages students and teachers to experience the adventure of science through investigation of the world around us. In this project, experts in computer science and astronomy at SAO worked closely with teachers and students in Massachusetts elementary schools to design and model activities which are developmentally appropriate, fulfill the needs of the curriculum standards of the school district, and provide students with a chance to experience for themselves the joy and excitement of scientific inquiry. The results of our efforts are embodied in the Threads of Inquiry, a series of free-flowing dialogues about inquiry-inspiring investigations that maintain a solid connection with our experience and with one another. These investigations are concerned with topics such as the motion of the Earth, shadows, light, and time. Our work emphasizes a direct hands-on approach through concrete experience, rather than memorization of facts.
An outline of object-oriented philosophy.
Harman, Graham
2013-01-01
This article summarises the principles of object-oriented philosophy and explains its similarities with, and differences from, the outlook of the natural sciences. Like science, the object-oriented position avoids the notion (quite common in philosophy) that the human-world relation is the ground of all others, such that scientific statements about the world would only be statements about the world as it is for humans. But unlike science, object-oriented metaphysics treats artificial, social, and fictional entities in the same way as natural ones, and also holds that the world can only be known allusively rather than directly.
Emerging Nanophotonic Applications Explored with Advanced Scientific Parallel Computing
NASA Astrophysics Data System (ADS)
Meng, Xiang
The domain of nanoscale optical science and technology combines the classical world of electromagnetics with the quantum mechanical regime of atoms and molecules. Recent advances in fabrication technology allow optical structures to be scaled down to the nanoscale or even to the atomic level, far smaller than the wavelengths they are designed for. These nanostructures can have unique, controllable, and tunable optical properties, and their interactions with quantum materials can produce important near-field and far-field optical responses. These properties have many important applications, ranging from efficient and tunable light sources, detectors, filters, modulators, and high-speed all-optical switches to next-generation classical and quantum computation and biophotonic medical sensors. This emerging area of nanoscience, known as nanophotonics, is a highly interdisciplinary field requiring expertise in materials science, physics, electrical engineering, and scientific computing, modeling, and simulation. It has also become an important research field for investigating the science and engineering of light-matter interactions that take place on wavelength and subwavelength scales, where the nature of the nanostructured matter controls the interactions. In addition, rapid advances in computing capabilities, such as parallel computing, have become a critical element for investigating advanced nanophotonic devices. This role has taken on even greater urgency with the scale-down of device dimensions, since the design of these devices requires extensive memory and extremely long core hours; distributed platforms built on parallel computing are therefore required for faster design processes. Scientific parallel computing constructs mathematical models and quantitative analysis techniques and uses computing machines to analyze and solve otherwise intractable scientific challenges. In particular, parallel computing operates on the principle that large problems can often be divided into smaller ones, which are then solved concurrently. In this dissertation, we report a series of new nanophotonic developments using advanced parallel computing techniques. The applications include structure optimization at the nanoscale, both to control the electromagnetic response of materials and to manipulate nanoscale structures for enhanced field concentration, enabling breakthroughs in imaging and sensing systems (chapters 3 and 4) and improving the spatio-temporal resolution of spectroscopies (chapter 5). We also report investigations of the confinement of optical-matter interactions in the quantum mechanical regime, where size-dependent novel properties enhance a wide range of technologies, from tunable and efficient light sources and detectors to other nanophotonic elements with enhanced functionality (chapters 6 and 7).
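The divide-and-conquer principle just described can be made concrete with a minimal, illustrative sketch (a generic example, not the dissertation's electromagnetic solver): a large numerical task is partitioned into independent chunks, the chunks are evaluated by worker processes concurrently, and the partial results are combined.

from multiprocessing import Pool
import numpy as np

def partial_integral(bounds, samples=200_000):
    # Estimate the integral of sin(x)**2 over one sub-interval; the integrand
    # is a stand-in for an expensive physics kernel.
    a, b = bounds
    x = np.linspace(a, b, samples)
    return (b - a) * np.mean(np.sin(x) ** 2)

if __name__ == "__main__":
    a, b, n_chunks = 0.0, np.pi, 8
    edges = np.linspace(a, b, n_chunks + 1)
    chunks = list(zip(edges[:-1], edges[1:]))            # divide
    with Pool(processes=n_chunks) as pool:
        partials = pool.map(partial_integral, chunks)    # solve concurrently
    print(sum(partials))                                 # combine (~ pi/2)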
Defining Computational Thinking for Mathematics and Science Classrooms
NASA Astrophysics Data System (ADS)
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-02-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.
Nicholas Brunhart-Lupo, Computational Science, Nicholas.Brunhart-Lupo@nrel.gov. Education: Ph.D., Computer Science, Colorado School of Mines; M.S., Computer Science, University of Queensland; B.S., Computer Science, Colorado School of Mines.
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Bernier, David
2011-01-01
Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…
NASA Technical Reports Server (NTRS)
1989-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, W. E.
2004-08-16
Computational science plays a major role in research and development in mathematics, science, engineering, and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE, which supported computational science activities at the university. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Laboratory, on-campus research involving undergraduates, participation of undergraduates and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning in fall 2002 (one to a master's degree program and two to doctoral degree programs).
NASA Astrophysics Data System (ADS)
Koch, Melissa; Gorges, Torie
2016-10-01
Underrepresented populations such as women, African-Americans, and Latinos/as often come to STEM (science, technology, engineering, and mathematics) careers by less traditional paths than White and Asian males. To better understand how and why women might shift toward STEM, particularly computer science, careers, we investigated the education and career direction of afterschool facilitators, primarily women of color in their twenties and thirties, who taught Build IT, an afterschool computer science curriculum for middle school girls. Many of these women indicated that implementing Build IT had influenced their own interest in technology and computer science and in some cases had resulted in their intent to pursue technology and computer science education. We wanted to explore the role that teaching Build IT may have played in activating or reactivating interest in careers in computer science and to see whether in the years following implementation of Build IT, these women pursued STEM education and/or careers. We reached nine facilitators who implemented the program in 2011-12 or shortly after. Many indicated that while facilitating Build IT, they learned along with the participants, increasing their interest in and confidence with technology and computer science. Seven of the nine participants pursued further STEM or computer science learning or modified their career paths to include more of a STEM or computer science focus. Through interviews, we explored what aspects of Build IT influenced these facilitators' interest and confidence in STEM and when relevant their pursuit of technology and computer science education and careers.
The NASA computer science research program plan
NASA Technical Reports Server (NTRS)
1983-01-01
A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.
Creating Hybrid Spaces for Engaging School Science among Urban Middle School Girls
ERIC Educational Resources Information Center
Barton, Angela Calabrese; Tan, Edna; Rivet, Ann
2008-01-01
The middle grades are a crucial time for girls in making decisions about how or if they want to follow science trajectories. In this article, the authors report on how urban middle school girls enact meaningful strategies of engagement in science class in their efforts to merge their social worlds with the worlds of school science and on the…
Of Animals, Nature and People. Teacher's Guide. Preparing for Tomorrow's World.
ERIC Educational Resources Information Center
Iozzi, Louis A.; And Others
"Of Animals, Nature and People" is one of the "Preparing for Tomorrow's World" (PTW) program modules. PTW is an interdisciplinary, future-oriented program incorporating information from the sciences and social sciences and addressing societal concerns which interface science/technology/society. The program promotes responsible…
ERIC Educational Resources Information Center
Kilbourn, Brent
The purpose of this study is to develop and demonstrate the use of a conceptual framework for assessing the potential of "world view" as a concept for understanding important issues in science education. The framework is based on Stephen C. Pepper's treatment of six world hypotheses (animism, mysticism, formism, mechanism, contextualism, and…
On teaching computer ethics within a computer science department.
Quinn, Michael J
2006-04-01
The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.
Computational Science News | Computational Science | NREL
-Cooled High-Performance Computing Technology at the ESIF. February 28, 2018: NREL Launches New Website for High-Performance Computing System Users. The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC…
Empirical Determination of Competence Areas to Computer Science Education
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter; Seitz, Cornelia
2014-01-01
The authors discuss empirically determined competence areas to K-12 computer science education, emphasizing the cognitive level of competence. The results of a questionnaire with 120 professors of computer science serve as a database. By using multi-dimensional scaling and cluster analysis, four competence areas to computer science education…
Factors Influencing Exemplary Science Teachers' Levels of Computer Use
ERIC Educational Resources Information Center
Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen
2011-01-01
The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…
Preparing Future Secondary Computer Science Educators
ERIC Educational Resources Information Center
Ajwa, Iyad
2007-01-01
Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…
OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing
NASA Astrophysics Data System (ADS)
Strayer, Michael
2005-01-01
Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers, while summer schools, workshops, and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006, we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics, to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a baseline employing Common Component Architectures, and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20-year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have petaflop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science.
Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal—that of scientific discovery. Science does not stand still and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.
Nutritional Translation Blended With Food Science: 21st Century Applications
Ferruzzi, Mario G.; Peterson, Devin G.; Singh, R. Paul; Schwartz, Steven J.; Freedman, Marjorie R.
2012-01-01
This paper, based on the symposium “Real-World Nutritional Translation Blended With Food Science,” describes how an integrated “farm-to-cell” approach would create the framework necessary to address pressing public health issues. The paper describes current research that examines chemical reactions that may influence food flavor (and ultimately food consumption) and posits how these reactions can be used in health promotion; it explains how mechanical engineering and computer modeling can study digestive processes and provide better understanding of how physical properties of food influence nutrient bioavailability and posits how this research can also be used in the fight against obesity and diabetes; and it illustrates how an interdisciplinary scientific collaboration led to the development of a novel functional food that may be used clinically in the prevention and treatment of prostate cancer. PMID:23153735
King, Gary; Pan, Jennifer; Roberts, Margaret E
2014-08-22
Existing research on the extensive Chinese censorship organization uses observational methods with well-known limitations. We conducted the first large-scale experimental study of censorship by creating accounts on numerous social media sites, randomly submitting different texts, and observing from a worldwide network of computers which texts were censored and which were not. We also supplemented interviews with confidential sources by creating our own social media site, contracting with Chinese firms to install the same censoring technologies as existing sites, and--with their software, documentation, and even customer support--reverse-engineering how it all works. Our results offer rigorous support for the recent hypothesis that criticisms of the state, its leaders, and their policies are published, whereas posts about real-world events with collective action potential are censored. Copyright © 2014, American Association for the Advancement of Science.
Teaching and Learning in the Mixed-Reality Science Classroom
NASA Astrophysics Data System (ADS)
Tolentino, Lisa; Birchfield, David; Megowan-Romanowicz, Colleen; Johnson-Glenberg, Mina C.; Kelliher, Aisling; Martinez, Christopher
2009-12-01
As emerging technologies become increasingly inexpensive and robust, there is an exciting opportunity to move beyond general purpose computing platforms to realize a new generation of K-12 technology-based learning environments. Mixed-reality technologies integrate real world components with interactive digital media to offer new potential to combine best practices in traditional science learning with the powerful affordances of audio/visual simulations. This paper introduces the realization of a learning environment called SMALLab, the Situated Multimedia Arts Learning Laboratory. We present a recent teaching experiment for high school chemistry students. A mix of qualitative and quantitative research documents the efficacy of this approach for students and teachers. We conclude that mixed-reality learning is viable in mainstream high school classrooms and that students can achieve significant learning gains when this technology is co-designed with educators.
Pervasive Computing Goes to School
ERIC Educational Resources Information Center
Plymale, William O.
2005-01-01
In 1991 Mark Weiser introduced the idea of ubiquitous computing: a world in which computers and associated technologies become invisible, and thus indistinguishable from everyday life. This invisible computing is accomplished by means of "embodied virtuality," the process of drawing computers into the physical world. Weiser proposed that…
NASA Astrophysics Data System (ADS)
Deng, M.; di, L.
2005-12-01
The need for Earth science education to prepare students as a globally trained geoscience workforce has increased tremendously with the globalization of the economy. However, current academic programs often have difficulty providing students with world-view training or experiences in a global context due to a lack of resources and suitable teaching technology. This paper presents a NASA-funded project offering insights and solutions to this problem. The project aims to establish a geospatial data-rich learning and research environment that enables students, faculty, and researchers from institutes all over the world to easily access, analyze, and model the huge amount of NASA EOS data as if they possessed those vast resources locally at their desktops. With this environment, classroom demonstration and training for students to deal with global climate and environmental issues for any part of the world are possible in any classroom with an Internet connection. Globalization and mobilization of Earth science education can thus be truly realized. This project, named NASA EOS Higher Education Alliance: Mobilization of NASA EOS Data and Information through Web Services and Knowledge Management Technologies for Higher Education Teaching and Research, is built on solid technology and infrastructure foundations, including web service technology, NASA EOS data resources, and open interoperability standards. An open, distributed, standards-compliant, interoperable web-based system, called GeoBrain, is being developed by this project to provide a data-rich online learning and research environment. The system allows users to dynamically and collaboratively develop interoperable, web-executable geospatial processing and analysis modules and models, and to run them online against any part of the petabyte archives to obtain customized information products rather than raw data. The system makes a data-rich, globally capable Earth science learning and research environment, backed by NASA EOS data and computing resources previously unavailable to students and professors, available at their desktops free of charge. In order to efficiently integrate this new environment into Earth science education and research, a NASA EOS Higher Education Alliance (NEHEA) has been formed. The core members of NEHEA consist of the GeoBrain development team, led by LAITS at George Mason University, and a group of Earth science educators selected through an open RFP process. NEHEA is an open and free alliance and welcomes Earth science educators around the world to join as associate members. NEHEA promotes international research and education collaborations in Earth science. NEHEA core members will provide technical support to associate members for incorporating the data-rich learning environment into their teaching and research activities. The responsibilities of NEHEA education members include using the system in their research and teaching, providing feedback and requirements to the development team, exchanging information on the use of the system's capabilities, participating in system development, and developing new curricula and research around the environment provided by GeoBrain.
Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools
NASA Astrophysics Data System (ADS)
Boe, Bryce A.
There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
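As a hedged illustration of the kind of check such tools perform (a hypothetical sketch only, not Hairball's actual API), a static analyzer might walk a block-based script, represented here as a simple list of blocks, and flag variables that are used before they are initialized:

# Hypothetical sketch of a static check over a Scratch-like script; the script
# is modeled as a list of (block, variable) pairs. Not Hairball's real API.
def check_initialization(script):
    initialized, problems = set(), []
    for block, var in script:
        if block == "set":                          # e.g. "set score to 0"
            initialized.add(var)
        elif block == "change" and var not in initialized:
            problems.append(f"variable '{var}' changed before it is set")
    return problems

example = [
    ("when_flag_clicked", None),
    ("change", "score"),      # flagged: 'score' was never initialized
    ("set", "lives"),
    ("change", "lives"),      # fine: 'lives' was set first
]
print(check_initialization(example))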
Evaluating the Efficacy of the Cloud for Cluster Computation
NASA Technical Reports Server (NTRS)
Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom
2012-01-01
Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Cloud Compute (EC2) that promises improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29 instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.
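The reported 70% HPL efficiency is simply the ratio of achieved to theoretical peak performance. The short sketch below reproduces that arithmetic; the core count and the 2 TFLOPS figure come from the abstract, while the clock rate and FLOPs-per-cycle values are assumptions chosen for illustration, not numbers taken from the paper.

# Back-of-the-envelope HPL efficiency check: efficiency = achieved / peak.
cores = 240                 # from the abstract
achieved_flops = 2.0e12     # 2 TFLOPS on High-Performance Linpack (abstract)
clock_hz = 2.97e9           # assumed per-core clock (illustrative only)
flops_per_cycle = 4         # assumed, e.g. SSE with 2 adds + 2 muls per cycle

peak_flops = cores * clock_hz * flops_per_cycle
print(f"theoretical peak: {peak_flops / 1e12:.2f} TFLOPS")
print(f"HPL efficiency:   {achieved_flops / peak_flops:.0%}")   # ~70%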
Indigenous Knowledge in the Sciences and a Practical Application in the Super Saturday Project.
ERIC Educational Resources Information Center
Settee, Priscilla
This paper reviews books and research papers concerned with Indigenous science knowledge and its integration into school curricula and describes current efforts to bridge Western and Native science. "A Yupiaq World View: Implications for Cultural, Educational and Technological Adaptation in a Contemporary World" (Angayuqaq Oscar…
Undergraduate Students' Science-Related Ideas as Embedded in Their Environmental Worldviews
ERIC Educational Resources Information Center
Liu, Shu-Chiu; Lin, Huann-shyang
2014-01-01
This study explored environmental worldviews of selected undergraduate students in Taiwan and located the associations of these worldviews with science. The "environment" is represented as nature or the natural world, as opposed to the social and spiritual world. The participants were undergraduate students (14 science and 15 nonscience…
Teaching Science in a Multicultural World.
ERIC Educational Resources Information Center
Offutt, Elizabeth Rhodes
This book is designed to be a source of ideas and motivation to encourage curiosity in children, provide opportunities to develop scientific processing skills, find out about cultures around the world, and explore science concepts. This resource incorporates multicultural literature and approaches into the teaching of science concepts in the…
Collecting behavioural data using the world wide web: considerations for researchers
Rhodes, S; Bowie, D; Hergenrather, K
2003-01-01
Objective: To identify and describe advantages, challenges, and ethical considerations of web based behavioural data collection. Methods: This discussion is based on the authors' experiences in survey development and study design, respondent recruitment, and internet research, and on the experiences of others as found in the literature. Results: The advantages of using the world wide web to collect behavioural data include rapid access to numerous potential respondents and previously hidden populations, respondent openness and full participation, opportunities for student research, and reduced research costs. Challenges identified include issues related to sampling and sample representativeness, competition for the attention of respondents, and potential limitations resulting from the much cited "digital divide", literacy, and disability. Ethical considerations include anonymity and privacy, providing and substantiating informed consent, and potential risks of malfeasance. Conclusions: Computer mediated communications, including electronic mail, the world wide web, and interactive programs will play an ever increasing part in the future of behavioural science research. Justifiable concerns regarding the use of the world wide web in research exist, but as access to, and use of, the internet becomes more widely and representatively distributed globally, the world wide web will become more applicable. In fact, the world wide web may be the only research tool able to reach some previously hidden population subgroups. Furthermore, many of the criticisms of online data collection are common to other survey research methodologies. PMID:12490652
Collecting behavioural data using the world wide web: considerations for researchers.
Rhodes, S D; Bowie, D A; Hergenrather, K C
2003-01-01
To identify and describe advantages, challenges, and ethical considerations of web based behavioural data collection. This discussion is based on the authors' experiences in survey development and study design, respondent recruitment, and internet research, and on the experiences of others as found in the literature. The advantages of using the world wide web to collect behavioural data include rapid access to numerous potential respondents and previously hidden populations, respondent openness and full participation, opportunities for student research, and reduced research costs. Challenges identified include issues related to sampling and sample representativeness, competition for the attention of respondents, and potential limitations resulting from the much cited "digital divide", literacy, and disability. Ethical considerations include anonymity and privacy, providing and substantiating informed consent, and potential risks of malfeasance. Computer mediated communications, including electronic mail, the world wide web, and interactive programs will play an ever increasing part in the future of behavioural science research. Justifiable concerns regarding the use of the world wide web in research exist, but as access to, and use of, the internet becomes more widely and representatively distributed globally, the world wide web will become more applicable. In fact, the world wide web may be the only research tool able to reach some previously hidden population subgroups. Furthermore, many of the criticisms of online data collection are common to other survey research methodologies.
Programmers, professors, and parasites: credit and co-authorship in computer science.
Solomon, Justin
2009-12-01
This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.
Increasing Diversity in Computer Science: Acknowledging, yet Moving Beyond, Gender
NASA Astrophysics Data System (ADS)
Larsen, Elizabeth A.; Stubbs, Margaret L.
Lack of diversity within the computer science field has, thus far, been examined most fully through the lens of gender. This article is based on a follow-on to Margolis and Fisher's (2002) study and includes interviews with 33 Carnegie Mellon University students from the undergraduate senior class of 2002 in the School of Computer Science. We found evidence of similarities among the perceptions of these women and men on definitions of computer science, explanations for the notoriously low proportion of women in the field, characterizations of a typical computer science student, impressions of recent curricular changes, a sense of the atmosphere/culture in the program, views of the Women@SCS campus organization, and suggestions for attracting and retaining well-rounded students in computer science. We conclude that efforts to increase diversity in the computer science field will benefit from a more broad-based approach that considers, but is not limited to, notions of gender difference.
An open-source textbook for teaching climate-related risk analysis using the R computing environment
NASA Astrophysics Data System (ADS)
Applegate, P. J.; Keller, K.
2015-12-01
Greenhouse gas emissions lead to increased surface air temperatures and sea level rise. In turn, sea level rise increases the risks of flooding for people living near the world's coastlines. Our own research on assessing sea level rise-related risks emphasizes both Earth science and statistics. At the same time, the free, open-source computing environment R is growing in popularity among statisticians and scientists due to its flexibility and graphics capabilities, as well as its large library of existing functions. We have developed a set of laboratory exercises that introduce students to the Earth science and statistical concepts needed for assessing the risks presented by climate change, particularly sea-level rise. These exercises will be published as a free, open-source textbook on the Web. Each exercise begins with a description of the Earth science and/or statistical concepts that the exercise teaches, with references to key journal articles where appropriate. Next, students are asked to examine in detail a piece of existing R code, and the exercise text provides a clear explanation of how the code works. Finally, students are asked to modify the existing code to produce a well-defined outcome. We discuss our experiences in developing the exercises over two separate semesters at Penn State, plus using R Markdown to interweave explanatory text with sample code and figures in the textbook.
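The exercises themselves are written in R; purely as a hedged illustration of the style of exercise described (estimate a flood-risk quantity from a simple probabilistic model, then modify the code), here is a minimal sketch in Python with invented numbers. None of the parameters below are taken from the textbook.

# Illustrative Monte Carlo estimate of the probability that sea-level rise
# plus storm surge exceeds a protection height. All parameters are invented
# for illustration; the textbook's own exercises are written in R.
import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000
sea_level_rise_m = rng.normal(loc=0.6, scale=0.2, size=n)   # assumed projection
storm_surge_m = rng.gumbel(loc=1.0, scale=0.4, size=n)      # assumed surge model
protection_height_m = 2.5                                   # assumed threshold

p_exceed = np.mean(sea_level_rise_m + storm_surge_m > protection_height_m)
print(f"estimated exceedance probability: {p_exceed:.3f}")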
ERIC Educational Resources Information Center
Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu
2013-01-01
With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…
NeuroMind: Past, present, and future
Kubben, Pieter L.
2017-01-01
This narrative report describes the underlying rationale and technical developments of NeuroMind, a mobile clinical decision support system for neurosurgery. From the perspective of a neurosurgeon – (app) developer it explains how technical progress has shaped the world's “most rated and highest rated” neurosurgical mobile application, with particular attention for operating system diversity on mobile hardware, cookbook medicine, regulatory affairs (in particular regarding software as a medical device), and new developments in the field of clinical data science, machine learning, and predictive analytics. Finally, the concept of “computational neurosurgery” is introduced as a vehicle to reach new horizons in neurosurgery. PMID:28966822
The quiet revolution of numerical weather prediction.
Bauer, Peter; Thorpe, Alan; Brunet, Gilbert
2015-09-03
Advances in numerical weather prediction represent a quiet revolution because they have resulted from a steady accumulation of scientific knowledge and technological advances over many years that, with only a few exceptions, have not been associated with the aura of fundamental physics breakthroughs. Nonetheless, the impact of numerical weather prediction is among the greatest of any area of physical science. As a computational problem, global weather prediction is comparable to the simulation of the human brain and of the evolution of the early Universe, and it is performed every day at major operational centres across the world.
Computer Science and the Liberal Arts
ERIC Educational Resources Information Center
Shannon, Christine
2010-01-01
Computer science and the liberal arts have much to offer each other. Yet liberal arts colleges, in particular, have been slow to recognize the opportunity that the study of computer science provides for achieving the goals of a liberal education. After the precipitous drop in computer science enrollments during the first decade of this century,…
Marrying Content and Process in Computer Science Education
ERIC Educational Resources Information Center
Zendler, A.; Spannagel, C.; Klaudt, D.
2011-01-01
Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…
ERIC Educational Resources Information Center
Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.
2016-01-01
Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…
Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University
ERIC Educational Resources Information Center
Plane, Jandelyn
2010-01-01
This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…
Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.
ERIC Educational Resources Information Center
Turner, Judith Axler
1987-01-01
Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drugan, C.
2009-12-07
The word 'breakthrough' aptly describes the transformational science and milestones achieved at the Argonne Leadership Computing Facility (ALCF) throughout 2008. The number of research endeavors undertaken at the ALCF through the U.S. Department of Energy's (DOE) Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program grew from 9 in 2007 to 20 in 2008. The allocation of computer time awarded to researchers on the Blue Gene/P also spiked significantly - from nearly 10 million processor hours in 2007 to 111 million in 2008. To support this research, we expanded the capabilities of Intrepid, an IBM Blue Gene/P system at the ALCF, to 557 teraflops (TF) for production use. Furthermore, we enabled breakthrough levels of productivity and capability in visualization and data analysis with Eureka, a powerful installation of NVIDIA Quadro Plex S4 external graphics processing units. Eureka delivered a quantum leap in visual compute density, providing more than 111 TF and more than 3.2 terabytes of RAM. On April 21, 2008, the dedication of the ALCF realized DOE's vision to bring the power of the Department's high performance computing to open scientific research. In June, the IBM Blue Gene/P supercomputer at the ALCF debuted as the world's fastest for open science and third fastest overall. No question that the science benefited from this growth and system improvement. Four research projects spearheaded by Argonne National Laboratory computer scientists and ALCF users were named to the list of top ten scientific accomplishments supported by DOE's Advanced Scientific Computing Research (ASCR) program. Three of the top ten projects used extensive grants of computing time on the ALCF's Blue Gene/P to model the molecular basis of Parkinson's disease, design proteins at atomic scale, and create enzymes. As the year came to a close, the ALCF was recognized with several prestigious awards at SC08 in November. We provided resources for Linear Scaling Divide-and-Conquer Electronic Structure Calculations for Thousand Atom Nanostructures, a collaborative effort between Argonne, Lawrence Berkeley National Laboratory, and Oak Ridge National Laboratory that received the ACM Gordon Bell Prize Special Award for Algorithmic Innovation. The ALCF also was named a winner in two of the four categories in the HPC Challenge best performance benchmark competition.
Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya
2018-06-17
Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
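For readers unfamiliar with the framework, one common parameterization of the Sharma-Mittal family (a hedged sketch; the paper's own notation may differ) makes the unification explicit. For a probability distribution p = (p_1, ..., p_n) and parameters q and r,

H_{q,r}(p) = \frac{1}{1-r}\left[\Big(\sum_{i=1}^{n} p_i^{\,q}\Big)^{\frac{1-r}{1-q}} - 1\right].

Letting r → 1 recovers the Rényi entropy of order q; setting r = q gives the Tsallis entropy; letting both q → 1 and r → 1 yields Shannon entropy; q = r = 2 gives the Quadratic (Gini) entropy 1 - \sum_i p_i^2; and q → 0 with r → 1 gives the Hartley entropy, the logarithm of the number of possible states.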
African-American males in computer science---Examining the pipeline for clogs
NASA Astrophysics Data System (ADS)
Stone, Daryl Bryant
The literature on African-American males (AAM) begins with a statement to the effect that "Today young Black men are more likely to be killed or sent to prison than to graduate from college." Why are the numbers of African-American male college graduates decreasing? Why are those enrolled in college not majoring in the science, technology, engineering, and mathematics (STEM) disciplines? This research explored why African-American males are not filling the well-recognized industry need for computer scientists and technologists by choosing college tracks to these careers. The literature on STEM disciplines focuses largely on women in STEM, as opposed to minorities, and within minorities there is a noticeable research gap in addressing the needs and opportunities available to African-American males. The primary goal of this study was therefore to examine the computer science "pipeline" from the African-American male perspective. The method included distributing a "Computer Science Degree Self-Efficacy Scale" to five groups of African-American male students: (1) fourth graders, (2) eighth graders, (3) eleventh graders, (4) underclass undergraduate computer science majors, and (5) upperclass undergraduate computer science majors. In addition to a 30-question self-efficacy test, subjects from each group were asked to participate in a group discussion about "African-American males in computer science." The audio record of each group meeting provides qualitative data for the study. The hypotheses include the following: (1) There is no significant difference in "Computer Science Degree" self-efficacy between fourth and eighth graders. (2) There is no significant difference in "Computer Science Degree" self-efficacy between eighth and eleventh graders. (3) There is no significant difference in "Computer Science Degree" self-efficacy between eleventh graders and lower-level computer science majors. (4) There is no significant difference in "Computer Science Degree" self-efficacy between lower-level computer science majors and upper-level computer science majors. (5) There is no significant difference in "Computer Science Degree" self-efficacy between each of the five groups of students. Finally, the researcher selected African-American male students attending six schools, including the predominantly African-American elementary, middle, and high school that the researcher attended during his own academic career; a racially mixed elementary, middle, and high school was also selected from the same county in Maryland. Bowie State University provided both the underclass and upperclass computer science majors surveyed in this study. Of the five hypotheses, the sample provided enough evidence to support the claim that there are significant differences in "Computer Science Degree" self-efficacy between the five groups of students. ANOVA analysis by question and by total self-efficacy score provided further statistically significant results. Additionally, factor analysis and review of the qualitative data provide more insightful results. Overall, the data suggest a 'clog' may exist at the middle school level, and that students attending racially mixed schools were more confident in their computer, math, and science skills. African-American males admit to spending a great deal of time on social networking websites and email, but are 'dis-aware' of the skills and knowledge needed to study in the computing disciplines.
The majority of the subjects knew few, if any, AAMs in the 'computing discipline pipeline'. The collegiate African-American males in this study agreed that computer programming is a difficult area and serves as a 'major clog in the pipeline'.
Enabling the transition towards Earth Observation Science 2.0
NASA Astrophysics Data System (ADS)
Mathieu, Pierre-Philippe; Desnos, Yves-Louis
2015-04-01
Science 2.0 refers to the rapid and systematic changes in doing research and organising science, driven by rapid advances in ICT and digital technologies combined with a growing demand to do science for society (actionable research) and in society (co-design of knowledge). Nowadays, teams of researchers around the world can easily access a wide range of open data across disciplines and remotely process them on the cloud, combining them with their own data to generate knowledge, develop information products for societal applications, and tackle complex integrative problems that could not be addressed a few years ago. This rapid exchange of digital data is fostering a new world of data-intensive research, characterized by openness, transparency, scrutiny and traceability of results, access to large volumes of complex data, availability of open community tools, unprecedented levels of computing power, and new collaborations among researchers and new actors such as citizen scientists. The EO scientific community now faces the challenge of responding to this new Science 2.0 paradigm in order to make the most of the large volume of complex and diverse data delivered by the new generation of EO missions, in particular the Sentinels. In this context, ESA - in particular within the framework of the Scientific Exploitation of Operational Missions (SEOM) element - is supporting a variety of activities in partnership with research communities to ease the transition and make the most of the data. These include generating new open tools and exploitation platforms, exploring new ways to exploit and disseminate data on cloud-based platforms, building new partnerships with citizen scientists, and training the new generation of data scientists. The paper gives a brief overview of some of ESA's activities aimed at facilitating the exploitation of large amounts of data from EO missions in a collaborative, cross-disciplinary, and open way, from science to applications and education.
ERIC Educational Resources Information Center
Wolfle, Dael
This book recounts the many challenges and successes achieved by the American Association for the Advancement of Science (AAAS) from World War II to 1970. Included are: (1) the development of the National Science Foundation; (2) Cold War concerns about the loyalty and freedom of scientists; (3) efforts to develop an effective science curriculum…
Science Students Creating Hybrid Spaces When Engaging in an Expo Investigation Project
ERIC Educational Resources Information Center
Ramnarain, Umesh; de Beer, Josef
2013-01-01
In this paper, we report on the experiences of three 9th-grade South African students (13-14 years) in doing open science investigation projects for a science expo. A particular focus of this study was the manner in which these students merge the world of school science with their social world to create a hybrid space by appropriating knowledge…
Girls in computer science: A female only introduction class in high school
NASA Astrophysics Data System (ADS)
Drobnis, Ann W.
This study examined the impact of an all-girls classroom environment in a high school introductory computer science class on students' attitudes towards computer science and their thoughts on future involvement with the field. The study sought to determine whether an all-girls introductory class could counter declining female enrollment and improve female students' self-efficacy in computer science. The research was conducted in a summer school program run through a regional magnet school for science and technology, which these students attend during the school year. Three groupings of students were examined: female students in an all-girls class, female students in mixed-gender classes, and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to capture the students' thoughts, preconceptions, attitudes, knowledge of computer science, and future intentions regarding computer science in both education and career. Students in all three groups completed the ACCS before and after the class. In addition, students in the all-girls class kept journals throughout the course, and some of those students were also interviewed upon completion of the course. The data were analyzed using quantitative and qualitative techniques. While no major differences were found in the quantitative data, girls in the all-girls class were genuinely excited by what they had learned and were more open to the idea of computer science being part of their future.
Computational imaging of light in flight
NASA Astrophysics Data System (ADS)
Hullin, Matthias B.
2014-10-01
Many computer vision tasks are hindered by image formation itself, a process governed by the so-called plenoptic integral. By averaging the light falling into the lens over space, angle, wavelength and time, a great deal of information is irreversibly lost. The emerging idea of transient imaging operates on a time resolution fast enough to resolve non-stationary light distributions in real-world scenes. It enables the discrimination of light contributions by the optical path length from light source to receiver, a dimension unavailable in mainstream imaging to date. Until recently, such measurements required high-end optical equipment and could only be acquired under tightly controlled lab conditions. To address this challenge, we introduced a family of computational imaging techniques operating on standard time-of-flight image sensors, for the first time allowing the user to "film" light in flight in an affordable, practical and portable way. Just as impulse responses have proven a valuable tool in almost every branch of science and engineering, we expect light-in-flight analysis to impact a wide variety of applications in computer vision and beyond.
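The central idea of separating light contributions by optical path length can be illustrated with a toy transient histogram. This is a sketch under simplified assumptions (ideal time resolution, invented path lengths and intensities), not the paper's reconstruction method:

```python
import numpy as np

C = 3e8  # speed of light, m/s

def transient_histogram(path_lengths_m, intensities, bin_width_ps=200, n_bins=200):
    """Bin per-path contributions by time of flight into a transient profile.

    Toy model of a scene's temporal impulse response: each light path
    contributes its intensity at the arrival time implied by its optical
    path length. Real time-of-flight sensors measure a blurred version of this.
    """
    arrival_times_ps = np.asarray(path_lengths_m) / C * 1e12
    bins = np.arange(n_bins + 1) * bin_width_ps
    hist, _ = np.histogram(arrival_times_ps, bins=bins,
                           weights=np.asarray(intensities))
    return bins[:-1], hist

# Example: a direct reflection (round-trip path of 6 m) plus a weaker,
# longer multi-bounce path. A conventional camera would sum both into one
# pixel value; the transient profile keeps them apart by arrival time.
t, h = transient_histogram([6.0, 9.5], [1.0, 0.3])
print(t[h > 0], h[h > 0])
```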
Modeling emergent border-crossing behaviors during pandemics
NASA Astrophysics Data System (ADS)
Santos, Eunice E.; Santos, Eugene; Korah, John; Thompson, Jeremy E.; Gu, Qi; Kim, Keum Joo; Li, Deqing; Russell, Jacob; Subramanian, Suresh; Zhang, Yuxi; Zhao, Yan
2013-06-01
Modeling real-world scenarios is a challenge for traditional social science researchers, as it is often hard to capture the intricacies and dynamism of real-world situations without making simplistic assumptions. This imposes severe limitations on the capabilities of such models and frameworks. Complex population dynamics during natural disasters such as pandemics is an area where computational social science can provide useful insights and explanations. In this paper, we employ a novel intent-driven modeling paradigm for such real-world scenarios by causally mapping beliefs, goals, and actions of individuals and groups to overall behavior using a probabilistic representation called Bayesian Knowledge Bases (BKBs). To validate our framework we examine emergent behavior occurring near a national border during pandemics, specifically the 2009 H1N1 pandemic in Mexico. The novelty of the work in this paper lies in representing the dynamism at multiple scales by including both coarse-grained (events at the national level) and fine-grained (events at two separate border locations) information. This is especially useful for analysts in disaster management and first-responder organizations who need to understand both macro-level behavior and changes in the immediate vicinity, to help with planning, prevention, and mitigation. We demonstrate the capabilities of our framework in uncovering previously hidden connections and explanations by comparing independent models of the border locations with their fused model, identifying emergent behaviors found neither in the independent location models nor in a simple linear combination of those models.
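As a rough intuition for why a fused model can reveal couplings that independent location models miss, consider a much-simplified stand-in (ordinary conditional probability tables, not actual Bayesian Knowledge Bases) in which two border locations share a coarse-grained national-level variable; all probabilities below are invented for illustration:

```python
# Toy stand-in for model fusion: two border-location models are coupled
# only through a shared national-level "outbreak" variable.
p_outbreak = {True: 0.3, False: 0.7}        # national-level event
p_cross_a = {True: 0.8, False: 0.2}         # P(crossing surge at A | outbreak)
p_cross_b = {True: 0.7, False: 0.1}         # P(crossing surge at B | outbreak)

# Independent (marginal) probabilities of a surge at each location
marg_a = sum(p_outbreak[o] * p_cross_a[o] for o in (True, False))
marg_b = sum(p_outbreak[o] * p_cross_b[o] for o in (True, False))

# Fused model: joint probability of surges at BOTH locations, which couples
# them through the shared national variable
joint_ab = sum(p_outbreak[o] * p_cross_a[o] * p_cross_b[o] for o in (True, False))

print(f"P(surge A) = {marg_a:.2f}, P(surge B) = {marg_b:.2f}")
print(f"Independent product = {marg_a * marg_b:.2f}, fused joint = {joint_ab:.2f}")
```

Here the fused joint probability of simultaneous surges exceeds the product of the independent marginals, a simple analogue of the emergent behavior the framework is designed to surface.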
NASA Astrophysics Data System (ADS)
Heck, A.; Madsen, C.
2003-07-01
Astronomers communicate all the time: with colleagues of course, but also with managers and administrators, with decision-makers, with social representatives, with the news media, and with society at large. Education is naturally part of the process. Astronomy communication must take into account several specificities: the astronomy community is rather compact and well organized worldwide, and astronomy has penetrated the general public remarkably well, with an extensive network of associations and organizations of aficionados all over the world. Also, as a result of the huge amounts of data accumulated, and out of necessity for their extensive international collaborations, astronomers have pioneered distributed resources, electronic communications and networks coupled to advanced methodologies and technologies, often well before these came into common worldwide use. This book fills a gap in the astronomy-related literature by providing a set of chapters of direct interest to astronomy communication and well beyond it. The contributing experts have done their best to write in a way that is understandable to readers who are not specialists in astronomy or in communication techniques, while providing specific detailed information as well as plenty of pointers and bibliographic elements. The book will be useful for researchers, teachers, editors, publishers, librarians, computer scientists, sociologists of science, research planners and strategists, project managers, public-relations officers, and those in charge of astronomy-related organizations, as well as for students aiming at a career in astronomy or related space sciences. Link: http://www.wkap.nl/prod/b/1-4020-1345-0
The World Needs a New Curriculum
ERIC Educational Resources Information Center
Prensky, Marc
2014-01-01
The author proposes that today's existing, world-wide curriculum--based on offering roughly the same math, language arts, science, and social studies to all--is not what is required for the future, and is hurting rather than helping the world's students. Math, language arts, science, and social studies, he argues, are really "proxies"…
Between Faith and Science: World Culture Theory and Comparative Education
ERIC Educational Resources Information Center
Carney, Stephen; Rappleye, Jeremy; Silova, Iveta
2012-01-01
World culture theory seeks to explain an apparent convergence of education through a neoinstitutionalist lens, seeing global rationalization in education as driven by the logic of science and the myth of progress. While critics have challenged these assumptions by focusing on local manifestations of world-level tendencies, such critique is…
Bringing computational science to the public.
McDonagh, James L; Barker, Daniel; Alderson, Rosanna G
2016-01-01
The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.
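The kind of short, editable program used in such hands-on sessions can be as simple as the following sketch (an illustrative example in the spirit of the activities described, not taken from the 4273π materials):

```python
# A short, editable program for hands-on outreach: visitors change the
# sequence and re-run to see the answer update.
sequence = "ATGCGTACGTTAGC"

# GC content: the fraction of bases that are G or C, a basic bioinformatics quantity
gc_count = sequence.count("G") + sequence.count("C")
gc_percent = 100 * gc_count / len(sequence)

print(f"Sequence length: {len(sequence)}")
print(f"GC content: {gc_percent:.1f}%")
```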
Computer Science and Telecommunications Board summary of activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blumenthal, M.S.
1992-03-27
The Computer Science and Telecommunications Board (CSTB) considers technical and policy issues pertaining to computer science, telecommunications, and associated technologies. CSTB actively disseminates the results of its completed projects to those in a position to help implement their recommendations or otherwise use their insights. It provides a forum for the exchange of information on computer science, computing technology, and telecommunications. This report discusses the major accomplishments of CSTB.
Research in applied mathematics, numerical analysis, and computer science
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.
An Integrated Approach to Engineering Education in a Minority Community
NASA Technical Reports Server (NTRS)
Taylor, Bill
1998-01-01
Northeastern New Mexico epitomizes regions which are economically depressed, rural, and predominantly Hispanic. New Mexico Highlands University (NMHU), with a small student population of approximately 2800, offers a familiar environment attracting students who might otherwise not attend college. An outreach computer network of minority schools was created in northeastern New Mexico with NASA funding. Rural and urban minority schools gained electronic access to each other, to computer resources, to technical help at New Mexico Highlands University, and to the world via the Internet. This outreach program was initiated in the fall of 1992 in an effort to attract and involve minority students in Engineering and the Mathematical Sciences. We installed 56 kbps Internet connections at eight elementary schools, two middle schools, two high schools, a public library (serving the home-schooling community), and an International Baccalaureate school. For another fourteen rural schools, we provided computers and free dial-up service to servers on the New Mexico Highlands University campus.
The HEP Software and Computing Knowledge Base
NASA Astrophysics Data System (ADS)
Wenaus, T.
2017-10-01
HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange for software projects and products, services, training, and computing facilities, relating them to the projects, experiments, organizations and science domains that offer or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much-requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (a node.js-based web service and JavaScript client app), which has emphasized ease of use for both users and contributors.
Computer-aided discovery of a metal-organic framework with superior oxygen uptake.
Moghadam, Peyman Z; Islamoglu, Timur; Goswami, Subhadip; Exley, Jason; Fantham, Marcus; Kaminski, Clemens F; Snurr, Randall Q; Farha, Omar K; Fairen-Jimenez, David
2018-04-11
Current advances in materials science have resulted in the rapid emergence of thousands of functional adsorbent materials in recent years. This clearly creates multiple opportunities for their potential application, but it also creates the following challenge: how does one identify the most promising structures, among the thousands of possibilities, for a particular application? Here, we present a case of computer-aided material discovery, in which we complete the full cycle from computational screening of metal-organic framework materials for oxygen storage, to identification, synthesis and measurement of oxygen adsorption in the top-ranked structure. We introduce an interactive visualization concept to analyze over 1000 unique structure-property plots in five dimensions and delimit the relationships between structural properties and oxygen adsorption performance at different pressures for 2932 already-synthesized structures. We also report a world-record holding material for oxygen storage, UMCM-152, which delivers 22.5% more oxygen than the best known material to date, to the best of our knowledge.
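The ranking step of a computational screen like this can be sketched as a simple table operation: compute a figure of merit for each candidate and sort. The structure names, uptake values, and pressure conditions below are invented for illustration and do not come from the study:

```python
import pandas as pd

# Toy screening table: simulated O2 uptake (mol/kg) at storage and delivery
# pressures for a handful of hypothetical structures.
df = pd.DataFrame({
    "structure":        ["MOF-A", "MOF-B", "MOF-C", "MOF-D"],
    "uptake_140bar":    [42.0, 38.5, 45.1, 40.2],
    "uptake_5bar":      [12.3,  6.1, 15.8,  7.4],
    "pore_volume_cm3g": [1.20, 1.05, 1.36, 1.10],
})

# Deliverable capacity: gas released when cycling between storage (140 bar)
# and delivery (5 bar) pressure -- a common ranking metric in such screens.
df["deliverable"] = df["uptake_140bar"] - df["uptake_5bar"]

top = df.sort_values("deliverable", ascending=False)
print(top[["structure", "deliverable", "pore_volume_cm3g"]].head(3))
```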
NASA Astrophysics Data System (ADS)
Smith, M. A.; Preston, L.; Graham, K.
2007-12-01
Partnering science graduate students with high school teachers in their classroom is a mutually beneficial relationship. Graduate students who may become future university level faculty are exposed to teaching, classroom management, outreach scholarship, and managing time between teaching and research. Teachers benefit by having ready access to knowledgeable scientists, a link to university resources, and an additional adult in the classroom. Partnerships in Research Opportunities to Benefit Education (PROBE), a recent NSF funded GK-12 initiative, formed partnerships between science and math graduate students from the University of New Hampshire (UNH) and local high school science teachers. A primary goal of this program was to promote inquiry-based science lessons. The teacher-graduate student teams worked together approximately twenty hours per week on researching, preparing, and implementing new lessons and supervising student-led projects. Several new inquiry-based activities in Geology and Astronomy were developed as a result of collaboration between an Earth Science graduate student and high school teacher. For example, a "fishbowl" activity was very successful in sparking a classroom discussion about how minerals are used in industrial materials. The class then went on to research how to make their own paint using minerals. This activity provided a capstone project at the end of the unit about minerals, and made real world connections to the subject. A more involved geology lesson was developed focusing on the currently popular interest in forensics. Students were assigned with researching how geology can play an important part in solving a crime. When they understood the role of geologic concepts within the scope of the forensic world, they used techniques to solve their own "crime". Astronomy students were responsible for hosting and teaching middle school students about constellations, using a star- finder, and operating an interactive planetarium computer program. In order to successfully convey this information to the younger students, the high school students had to learn their material well. This model of pairing graduate students with science teachers is continuing as a component of the Transforming Earth System Science Education (TESSE) program.
Enhancing implementation science by applying best principles of systems science.
Northridge, Mary E; Metcalf, Sara S
2016-10-04
Implementation science holds promise for better ensuring that research is translated into evidence-based policy and practice, but interventions often fail or even worsen the problems they are intended to solve due to a lack of understanding of real world structures and dynamic complexity. While systems science alone cannot possibly solve the major challenges in public health, systems-based approaches may contribute to changing the language and methods for conceptualising and acting within complex systems. The overarching goal of this paper is to improve the modelling used in dissemination and implementation research by applying best principles of systems science. Best principles, as distinct from the more customary term 'best practices', are used to underscore the need to extract the core issues from the context in which they are embedded in order to better ensure that they are transferable across settings. Toward meaningfully grappling with the complex and challenging problems faced in adopting and integrating evidence-based health interventions and changing practice patterns within specific settings, we propose and illustrate four best principles derived from our systems science experience: (1) model the problem, not the system; (2) pay attention to what is important, not just what is quantifiable; (3) leverage the utility of models as boundary objects; and (4) adopt a portfolio approach to model building. To improve our mental models of the real world, system scientists have created methodologies such as system dynamics, agent-based modelling, geographic information science and social network simulation. To understand dynamic complexity, we need the ability to simulate. Otherwise, our understanding will be limited. The practice of dynamic systems modelling, as discussed herein, is the art and science of linking system structure to behaviour for the purpose of changing structure to improve behaviour. A useful computer model creates a knowledge repository and a virtual library for internally consistent exploration of alternative assumptions. Among the benefits of systems modelling are iterative practice, participatory potential and possibility thinking. We trust that the best principles proposed here will resonate with implementation scientists; applying them to the modelling process may abet the translation of research into effective policy and practice.
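As a generic illustration of "linking system structure to behaviour", a minimal stock-and-flow simulation is sketched below; the scenario (clinics adopting an evidence-based practice through word of mouth) and all parameter values are illustrative assumptions, not a model from the paper:

```python
import numpy as np

def simulate(adoption_fraction=0.4, contact_rate=3.0, n_clinics=500,
             initial_adopters=5, dt=0.25, t_end=24.0):
    """Two stocks (usual-care clinics, adopter clinics) linked by one flow.

    Time is in months; the adoption flow is driven by contacts between
    adopters and non-adopters (a classic diffusion structure).
    """
    steps = int(t_end / dt)
    usual, adopted = n_clinics - initial_adopters, initial_adopters
    history = []
    for step in range(steps + 1):
        history.append((step * dt, adopted))
        # Flow: contacts between adopters and non-adopters convert a fraction
        adoption_flow = (contact_rate * adoption_fraction *
                         adopted * usual / n_clinics)
        usual -= adoption_flow * dt
        adopted += adoption_flow * dt
    return np.array(history)

result = simulate()
print(f"Adopters after 12 months: {result[result[:, 0] == 12.0][0, 1]:.0f}")
```

Changing the structure (e.g. adding a "discontinuation" flow back to usual care) changes the behaviour of the curve, which is exactly the structure-to-behaviour linkage the authors describe.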
Knowing when to give up: early-rejection stratagems in ligand docking
NASA Astrophysics Data System (ADS)
Skone, Gwyn; Voiculescu, Irina; Cameron, Stephen
2009-10-01
Virtual screening is an important resource in the drug discovery community, of which protein-ligand docking is a significant part. Much software has been developed for this purpose, largely by biochemists and those in related disciplines, who pursue ever more accurate representations of molecular interactions. The resulting tools, however, are very processor-intensive. This paper describes some initial results from a project to review computational chemistry techniques for docking from a non-chemistry standpoint. An abstract blueprint for protein-ligand docking using empirical scoring functions is suggested, and this is used to discuss potential improvements. By introducing computer science tactics such as lazy function evaluation, dramatic increases in throughput can be, and have been, realized using a real-world docking program. Naturally, these tactics can be extended to any system that approximately corresponds to the architecture outlined.
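The early-rejection idea can be sketched in a few lines: evaluate scoring terms from cheapest to most expensive and abandon a pose as soon as its running score can no longer beat the best one seen so far. This is an illustrative sketch assuming non-negative score terms where lower is better, not the paper's implementation:

```python
def score_with_early_rejection(pose, terms, best_so_far):
    """Accumulate score terms (ordered cheapest-first) and give up early.

    Valid only if every term is non-negative, so the running total is a
    lower bound on the final score.
    """
    total = 0.0
    for term in terms:
        total += term(pose)
        if total >= best_so_far:      # cannot improve on the best pose: reject
            return None
    return total

def screen(poses, terms):
    best_score, best_pose = float("inf"), None
    for pose in poses:
        score = score_with_early_rejection(pose, terms, best_score)
        if score is not None and score < best_score:
            best_score, best_pose = score, pose
    return best_pose, best_score

# Toy example: a cheap clash penalty is evaluated first; a more expensive
# "electrostatics" stand-in is only computed for still-promising poses.
clash = lambda p: p["clashes"] * 2.0
electro = lambda p: p["electro"]
poses = [{"clashes": 5, "electro": 1.0},
         {"clashes": 0, "electro": 0.8},
         {"clashes": 1, "electro": 0.2}]
print(screen(poses, [clash, electro]))
```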
NASA Astrophysics Data System (ADS)
Horstemeyer, M. F.
This review of multiscale modeling covers a brief history of the various multiscale methodologies related to solid materials and the associated experimental influences, the influence of multiscale modeling on different disciplines, and some examples of multiscale modeling in the design of structural components. Although computational multiscale modeling methodologies were developed in the late twentieth century, the fundamental notions of multiscale modeling have been around since da Vinci studied ropes of different sizes. The recent rapid growth in multiscale modeling is the result of the confluence of parallel computing power, experimental capabilities to characterize structure-property relations down to the atomic level, and theories that admit multiple length scales. Research focused on multiscale modeling has reached into different disciplines (solid mechanics, fluid mechanics, materials science, physics, mathematics, biology, and chemistry), different regions of the world (most continents), and different length scales (from atoms to autos).